Sample records for target large-scale human

  1. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.
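
    For intuition, the sketch below shows the prediction step of a (labeled) multi-Bernoulli filter in the spirit of the record above: each track is a Bernoulli component with an existence probability and a Gaussian state. The motion model, survival probability, and all names are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch of the prediction step of a (labeled) multi-Bernoulli
    # filter: each track is a Bernoulli component with an existence
    # probability r and a Gaussian state (illustrative, not the authors' code).
    import numpy as np

    P_SURVIVE = 0.99                       # assumed probability of target survival
    F = np.array([[1.0, 1.0],              # constant-velocity motion model
                  [0.0, 1.0]])
    Q = 0.01 * np.eye(2)                   # process noise covariance

    def predict(tracks, births):
        """tracks: list of dicts with keys 'label', 'r', 'mean', 'cov'."""
        predicted = []
        for t in tracks:
            predicted.append({
                "label": t["label"],
                "r": P_SURVIVE * t["r"],   # existence thinned by survival
                "mean": F @ t["mean"],     # Kalman state prediction
                "cov": F @ t["cov"] @ F.T + Q,
            })
        return predicted + births          # new targets enter as birth components

    tracks = [{"label": 1, "r": 0.9, "mean": np.array([0.0, 1.0]), "cov": np.eye(2)}]
    births = [{"label": 2, "r": 0.1, "mean": np.zeros(2), "cov": 4 * np.eye(2)}]
    print(predict(tracks, births)[0]["r"])  # 0.891
    ```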

  2. Spatiotemporal property and predictability of large-scale human mobility

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity at the individual scale have been extensively studied, owing to their application potential in human behavior prediction and recommendation and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, we propose a scale-free mobility model with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter; the model outperforms existing human mobility models in scenarios on large geographical scales.
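
    A minimal simulation of the preferential-return/exploration mechanism named above, assuming the common exploration-probability form p_new = rho * S^(-gamma) with a Gaussian perturbation standing in for the exploration tendency parameter; all parameter values are invented for illustration.

    ```python
    # Toy preferential-return / exploration mobility model (illustrative
    # parameters, not the paper's fitted values).
    import numpy as np

    rng = np.random.default_rng(0)
    rho, gamma = 0.6, 0.2
    sigma = 0.1                    # assumed spread of the Gaussian tendency term

    def simulate(steps=10_000):
        visits = {0: 1}            # location id -> visit count
        next_id = 1
        for _ in range(steps):
            S = len(visits)
            p_new = min(1.0, max(0.0, rho * S**-gamma + rng.normal(0, sigma)))
            if rng.random() < p_new:           # explore a brand-new location
                visits[next_id] = 1
                next_id += 1
            else:                              # return, preferentially by frequency
                locs = list(visits)
                freq = np.array([visits[l] for l in locs], dtype=float)
                visits[rng.choice(locs, p=freq / freq.sum())] += 1
        return visits

    counts = np.sort(list(simulate().values()))[::-1]
    print(counts[:10])             # heavy-tailed visitation frequencies
    ```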

  3. Large-Scale Analysis of Network Bistability for Human Cancers

    PubMed Central

    Shiraishi, Tetsuya; Matsuyama, Shinako; Kitano, Hiroaki

    2010-01-01

    Protein–protein interaction and gene regulatory networks are likely to be locked in a state corresponding to a disease by the behavior of one or more bistable circuits exhibiting switch-like behavior. Sets of genes could be over-expressed or repressed when anomalies due to disease appear, and the circuits responsible for this over- or under-expression might persist for as long as the disease state continues. This paper shows how a large-scale analysis of network bistability for various human cancers can identify genes that can potentially serve as drug targets or diagnostic biomarkers. PMID:20628618
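
    As a concrete illustration of the switch-like circuits this analysis searches for, here is the canonical two-gene toggle switch: two mutually repressing genes settle into one of two stable expression states depending on the initial condition. The model and parameters are textbook assumptions, not the paper's pipeline.

    ```python
    # Canonical bistable toggle switch: two mutually repressing genes.
    import numpy as np
    from scipy.integrate import odeint

    alpha, n = 4.0, 3.0            # assumed synthesis rate and Hill coefficient

    def toggle(y, t):
        u, v = y
        du = alpha / (1.0 + v**n) - u   # gene U repressed by V
        dv = alpha / (1.0 + u**n) - v   # gene V repressed by U
        return [du, dv]

    t = np.linspace(0, 50, 500)
    high_u = odeint(toggle, [2.0, 0.1], t)[-1]   # start with U ahead
    high_v = odeint(toggle, [0.1, 2.0], t)[-1]   # start with V ahead
    print(high_u, high_v)          # two distinct steady states from two starts
    ```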

  4. Towards large-scale, human-based, mesoscopic neurotechnologies.

    PubMed

    Chang, Edward F

    2015-04-08

    Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation.

  5. Unfolding large-scale online collaborative human dynamics

    PubMed Central

    Zha, Yilong; Zhou, Tao; Zhou, Changsong

    2016-01-01

    Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanism is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
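
    A toy rendering of the unfolding model described above, assuming Poissonian initiation plus a cascading response with power-law waiting times (population growth omitted); all parameter values are invented.

    ```python
    # Poisson initiation + power-law-delayed cascading responses, the first
    # two modules of the model sketched in the abstract (illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    lam, p_respond, alpha = 0.05, 0.8, 2.0  # initiation rate, response prob., tail exponent

    events = list(np.cumsum(rng.exponential(1 / lam, 500)))  # Poissonian initiations
    queue = list(events)
    while queue:
        t = queue.pop()
        if rng.random() < p_respond:          # a response to a previous action...
            wait = rng.pareto(alpha - 1) + 1  # ...after a power-law waiting time
            events.append(t + wait)
            queue.append(t + wait)

    intervals = np.diff(np.sort(events))
    hist, edges = np.histogram(intervals, bins=np.logspace(-2, 3, 30), density=True)
    print(hist)                               # heavy-tailed inter-event intervals
    ```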

  6. How institutions shaped the last major evolutionary transition to large-scale human societies

    PubMed Central

    Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent

    2016-01-01

    What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937

  7. Expediting SRM assay development for large-scale targeted proteomics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chaochao; Shi, Tujin; Brown, Joseph N.

    2014-08-22

    Due to their high sensitivity and specificity, targeted proteomics measurements, e.g. selected reaction monitoring (SRM), are becoming increasingly popular for biological and translational applications. Selection of optimal transitions and optimization of collision energy (CE) are important assay development steps for achieving sensitive detection and accurate quantification; however, these steps can be labor-intensive, especially for large-scale applications. Herein, we explored several options for accelerating SRM assay development, evaluated in the context of a relatively large set of 215 synthetic peptide targets. We first showed that HCD fragmentation is very similar to CID in triple quadrupole (QQQ) instrumentation, and that by selecting the top six y fragment ions from HCD spectra, >86% of the top transitions optimized by direct infusion on a QQQ instrument are covered. We also demonstrated that the CE calculated by existing prediction tools was less accurate for +3 precursors, and that a significant increase in transition intensity could be obtained using a new CE prediction equation constructed from the present experimental data. Overall, our study illustrates the feasibility of expediting the development of large numbers of high-sensitivity SRM assays through automation of transition selection and accurate prediction of optimal CE, improving both SRM throughput and measurement quality.
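
    The CE-optimization step can be pictured as fitting a charge-state-specific linear rule, CE = a*(m/z) + b, to empirically optimized pairs, as done when constructing a new prediction equation; the data points below are made up for illustration.

    ```python
    # Fit a linear collision-energy rule per charge state from optimized
    # (m/z, CE) pairs; the numbers here are hypothetical.
    import numpy as np

    data = {
        2: np.array([[450.2, 16.1], [620.8, 22.3], [801.4, 28.9]]),
        3: np.array([[400.9, 13.0], [553.3, 17.4], [710.6, 22.1]]),
    }

    for z, pairs in data.items():
        a, b = np.polyfit(pairs[:, 0], pairs[:, 1], 1)   # least-squares line
        print(f"charge {z}+: CE = {a:.4f} * (m/z) + {b:.2f}")
    ```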

  8. Target-decoy Based False Discovery Rate Estimation for Large-scale Metabolite Identification.

    PubMed

    Wang, Xusheng; Jones, Drew R; Shaw, Timothy I; Cho, Ji-Hoon; Wang, Yuanyuan; Tan, Haiyan; Xie, Boer; Zhou, Suiping; Li, Yuxin; Peng, Junmin

    2018-05-23

    Metabolite identification is a crucial step in mass spectrometry (MS)-based metabolomics. However, it is still challenging to assess the confidence of assigned metabolites. In this study, we report a novel method for estimating false discovery rate (FDR) of metabolite assignment with a target-decoy strategy, in which the decoys are generated through violating the octet rule of chemistry by adding small odd numbers of hydrogen atoms. The target-decoy strategy was integrated into JUMPm, an automated metabolite identification pipeline for large-scale MS analysis, and was also evaluated with two other metabolomics tools, mzMatch and mzMine 2. The reliability of FDR calculation was examined by false datasets, which were simulated by altering MS1 or MS2 spectra. Finally, we used the JUMPm pipeline coupled with the target-decoy strategy to process unlabeled and stable-isotope labeled metabolomic datasets. The results demonstrate that the target-decoy strategy is a simple and effective method for evaluating the confidence of high-throughput metabolite identification.
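
    A minimal sketch of the target-decoy FDR computation in its usual form, FDR(s) = #decoys scoring at least s / #targets scoring at least s; the score distributions are simulated stand-ins, not JUMPm output.

    ```python
    # Target-decoy FDR at a score threshold (conceptual sketch).
    import numpy as np

    def fdr_at_threshold(target_scores, decoy_scores, threshold):
        n_t = np.sum(np.asarray(target_scores) >= threshold)
        n_d = np.sum(np.asarray(decoy_scores) >= threshold)
        return n_d / max(n_t, 1)

    targets = np.random.default_rng(2).normal(10, 2, 1000)  # matches to real formulas
    decoys = np.random.default_rng(3).normal(7, 2, 1000)    # hydrogen-shifted decoys
    for s in (8, 10, 12):
        print(s, round(fdr_at_threshold(targets, decoys, s), 3))
    ```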

  9. Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.

    PubMed

    Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian

    2014-07-01

    We introduce a new dataset, Human3.6M, of 3.6 million accurate 3D human poses, acquired by recording the performance of 5 female and 6 male subjects from 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state of the art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time-of-flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed-reality evaluation scenarios where 3D human models are animated using motion capture and inserted, with correct 3D geometry, into complex real environments viewed with moving cameras and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset, illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher-capacity, more complex models with our large dataset is substantially greater and should stimulate future research. The dataset, together with code for the associated large-scale learning models, features, visualization tools, and the evaluation server, is available online at http://vision.imar.ro/human3.6m.
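
    For reference, the sketch below computes mean per-joint position error (MPJPE), the metric commonly reported on Human3.6M; the array shapes and data are illustrative.

    ```python
    # Mean per-joint position error over frames and joints (sketch; the
    # official evaluation runs on the dataset's server).
    import numpy as np

    def mpjpe(pred, gt):
        """pred, gt: (n_frames, n_joints, 3) arrays in millimetres."""
        return np.linalg.norm(pred - gt, axis=-1).mean()

    rng = np.random.default_rng(0)
    gt = rng.normal(size=(100, 17, 3)) * 100      # synthetic ground-truth poses
    pred = gt + rng.normal(scale=20, size=gt.shape)
    print(f"MPJPE = {mpjpe(pred, gt):.1f} mm")
    ```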

  10. Large-scale prediction of ADAR-mediated effective human A-to-I RNA editing.

    PubMed

    Yao, Li; Wang, Heming; Song, Yuanyuan; Dai, Zhen; Yu, Hao; Yin, Ming; Wang, Dongxu; Yang, Xin; Wang, Jinlin; Wang, Tiedong; Cao, Nan; Zhu, Jimin; Shen, Xizhong; Song, Guangqi; Zhao, Yicheng

    2017-08-10

    Adenosine-to-inosine (A-to-I) editing by adenosine deaminase acting on RNA (ADAR) proteins is one of the most frequent co- and post-transcriptional modifications. To facilitate the assignment of biological functions to specific editing sites, we designed an automatic online platform that annotates A-to-I RNA editing sites in pre-mRNA splicing signals, microRNAs (miRNAs), and miRNA target untranslated regions (3' UTRs) from human (Homo sapiens) high-throughput sequencing data and predicts their effects based on large-scale bioinformatic analysis. After analysing a large number of previously reported RNA editing events and high-throughput RNA sequencing data from normal human tissues, we found >60 000 potentially effective RNA editing events in functional genes. The RNA Editing Plus platform is available for free at https://www.rnaeditplus.org/, and we believe our platform, which combines multiple optimized methods, will improve further studies of A-to-I editing and its role in post-transcriptional regulation.

  11. Captured metagenomics: large-scale targeting of genes based on ‘sequence capture’ reveals functional diversity in soils

    PubMed Central

    Manoharan, Lokeshwaran; Kushwaha, Sandeep K.; Hedlund, Katarina; Ahrén, Dag

    2015-01-01

    Microbial enzyme diversity is key to understanding many ecosystem processes. Whole metagenome sequencing (WMG) obtains information on functional genes, but it is costly and inefficient due to the large amount of sequencing required. In this study, we applied a captured metagenomics technique to functional genes in soil microorganisms as an alternative to WMG. Large-scale targeting of functional genes coding for enzymes related to organic matter degradation was applied to two agricultural soil communities through captured metagenomics. Captured metagenomics uses custom-designed, hybridization-based oligonucleotide probes that enrich functional genes of interest in metagenomic libraries, where only probe-bound DNA fragments are sequenced. The captured metagenomes were highly enriched with the targeted genes while maintaining target diversity, and their taxonomic distribution correlated well with traditional ribosomal sequencing. The captured metagenomes were highly enriched with genes related to organic matter degradation, at least five times more than similar, publicly available soil WMG projects. This target enrichment technique also preserves the functional representation of the soils, thereby facilitating comparative metagenomics projects. Here, we present the first study that applies the captured metagenomics approach at large scale, and this novel method allows deep investigation of central ecosystem processes by studying functional gene abundances. PMID:26490729

  12. Large-Scale Chemical Similarity Networks for Target Profiling of Compounds Identified in Cell-Based Chemical Screens

    PubMed Central

    Lo, Yu-Chen; Senese, Silvia; Li, Chien-Ming; Hu, Qiyang; Huang, Yong; Damoiseaux, Robert; Torres, Jorge Z.

    2015-01-01

    Target identification is one of the most critical steps following cell-based phenotypic chemical screens aimed at identifying compounds with potential uses in cell biology and for developing novel disease therapies. Current in silico target identification methods, including chemical similarity database searches, are limited to single or sequential ligand analyses that have limited capability for accurate deconvolution of a large number of compounds with diverse chemical structures. Here, we present CSNAP (Chemical Similarity Network Analysis Pulldown), a new computational target identification method that utilizes chemical similarity networks for large-scale chemotype (consensus chemical pattern) recognition and drug target profiling. Our benchmark study showed that CSNAP can achieve an overall higher accuracy (>80%) of target prediction with respect to representative chemotypes in large (>200) compound sets, compared with the SEA approach (60–70%). Additionally, CSNAP is capable of integrating with biological knowledge-based databases (Uniprot, GO) and high-throughput biology platforms (proteomic, genetic, etc.) for system-wide drug target validation. To demonstrate the utility of the CSNAP approach, we combined CSNAP's target prediction with experimental ligand evaluation to identify the major mitotic targets of hit compounds from a cell-based chemical screen, and we highlight novel compounds targeting microtubules, an important cancer therapeutic target. The CSNAP method is freely available and can be accessed from the CSNAP web server (http://services.mbi.ucla.edu/CSNAP/). PMID:25826798
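
    A hedged sketch of the network construction CSNAP builds on: link compounds whose fingerprint Tanimoto similarity exceeds a cutoff, then read chemotype clusters off the connected components. The SMILES, cutoff, and fingerprint settings are assumptions, not the CSNAP source.

    ```python
    # Chemical similarity network from Morgan fingerprints (simplified).
    import networkx as nx
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    smiles = ["CCO", "CCN", "c1ccccc1", "c1ccccc1C", "CCOC"]
    mols = [Chem.MolFromSmiles(s) for s in smiles]
    fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

    G = nx.Graph()
    G.add_nodes_from(range(len(smiles)))
    CUTOFF = 0.3                                   # assumed similarity threshold
    for i in range(len(fps)):
        for j in range(i + 1, len(fps)):
            if DataStructs.TanimotoSimilarity(fps[i], fps[j]) >= CUTOFF:
                G.add_edge(i, j)

    for k, comp in enumerate(nx.connected_components(G)):
        print(f"chemotype cluster {k}: {[smiles[i] for i in comp]}")
    ```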

  13. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay on the Q-Exactive was compared with that of MS1-based quantification and with an MRM assay on a QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and greater flexibility in post-acquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using the freely available Skyline software. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.

  14. Large-scale production of functional human lysozyme from marker-free transgenic cloned cows.

    PubMed

    Lu, Dan; Liu, Shen; Ding, Fangrong; Wang, Haiping; Li, Jing; Li, Ling; Dai, Yunping; Li, Ning

    2016-03-10

    Human lysozyme is an important natural non-specific immune protein that is highly expressed in breast milk and participates in the immune response of infants against bacterial and viral infections. Considering the medicinal value and market demand for human lysozyme, an animal model for large-scale production of recombinant human lysozyme (rhLZ) is needed. In this study, we generated transgenic cloned cows with the marker-free vector pBAC-hLF-hLZ, which was shown to efficiently express rhLZ in cow milk. Seven transgenic cloned cows, identified by polymerase chain reaction, Southern blot, and western blot analyses, produced rhLZ in milk at concentrations of up to 3149.19 ± 24.80 mg/L. The purified rhLZ had a molecular weight and enzymatic activity similar to those of wild-type human lysozyme and possessed the same C-terminal and N-terminal amino acid sequences. Preliminary data on milk yield and milk composition from a naturally lactating transgenic cloned cow (0906) were also collected. These results provide a solid foundation for the large-scale production of rhLZ in the future.

  15. Large-scale 3D simulations of ICF and HEDP targets

    NASA Astrophysics Data System (ADS)

    Marinak, Michael M.

    2000-10-01

    The radiation hydrodynamics code HYDRA continues to be developed and applied to 3D simulations of a variety of targets for both inertial confinement fusion (ICF) and high energy density physics. Several packages have been added, enabling this code to perform ICF target simulations with accuracy similar to that of the two-dimensional codes in long historical use. These include a laser ray trace and deposition package, a heavy ion deposition package, implicit Monte Carlo photonics, and non-LTE opacities derived from XSN or the linearized response matrix approach (R. More, T. Kato, Phys. Rev. Lett. 81, 814 (1998); S. Libby, F. Graziani, R. More, T. Kato, Proceedings of the 13th International Conference on Laser Interactions and Related Plasma Phenomena, AIP, New York, 1997). LTE opacities can also be calculated online for arbitrary mixtures by combining tabular values generated by different opacity codes. Thermonuclear burn, charged particle transport, neutron energy deposition, electron-ion coupling and conduction, and multigroup radiation diffusion packages are also installed. HYDRA can employ ALE hydrodynamics; a number of grid motion algorithms are available. Multi-material flows are resolved using material interface reconstruction. Results from large-scale simulations run on up to 1680 processors, using a combination of massively parallel processing and symmetric multiprocessing, will be described. A large solid-angle simulation of Rayleigh-Taylor instability growth in a NIF ignition capsule has resolved simultaneously the full spectrum of the most dangerous modes that grow from surface roughness. Simulations of a NIF hohlraum illuminated with the initial 96-beam configuration have also been performed. The effect of the hohlraum's 3D intrinsic drive asymmetry on the capsule implosion will be considered. We will also discuss results from a Nova experiment in which a copper sphere is crushed by a planar shock. Several interacting hydrodynamic instabilities, including

  16. Effects on aquatic and human health due to large scale bioenergy crop expansion.

    PubMed

    Love, Bradley J; Einheuser, Matthew D; Nejadhashemi, A Pouyan

    2011-08-01

    In this study, the environmental impacts of large-scale bioenergy crops were evaluated using the Soil and Water Assessment Tool (SWAT). Daily pesticide concentration data for a study area consisting of four large watersheds located in Michigan (totaling 53,358 km²) were estimated over a six-year period (2000-2005). Model outputs for atrazine, bromoxynil, glyphosate, metolachlor, pendimethalin, sethoxydim, trifluralin, and 2,4-D were used to predict the possible long-term implications that large-scale bioenergy crop expansion may have on the bluegill (Lepomis macrochirus) and humans. Threshold toxicity levels for the bluegill and for human consumption were obtained for all pesticides evaluated through an extensive literature review. Model output was compared to each toxicity level for the suggested exposure time (96-hour for bluegill and 24-hour for humans). The results suggest that traditional intensive row crops such as canola, corn, and sorghum may negatively impact aquatic life and, in most cases, affect safe drinking water availability. The continuous corn rotation, the rotation most representative of current agricultural practices in a starch-based ethanol economy, delivers the highest concentrations of glyphosate to the stream. In addition, continuous canola contributed a concentration of 1.11 ppm of trifluralin, a highly toxic herbicide, which is 8.7 times the 96-hour ecotoxicity threshold for bluegills and 21 times the safe drinking water level. Also during the period of study, continuous corn resulted in the impairment of 541,152 km of stream. However, there is promise with second-generation lignocellulosic bioenergy crops such as switchgrass, which resulted in a 171,667 km reduction in the total stream length exceeding the human threshold criteria, compared to the base scenario. Results of this study may be useful in determining the suitability of bioenergy crop rotations and aid in decision making regarding the adaptation of large-scale
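
    The exposure screen can be pictured as a rolling-window comparison of modeled daily concentrations against a 96-hour threshold; the concentrations and threshold below are hypothetical numbers for illustration only.

    ```python
    # Compare modeled daily pesticide concentrations to a 96-hour (4-day)
    # exposure threshold using a rolling mean; all values are hypothetical.
    import pandas as pd

    conc = pd.Series(
        [0.2, 0.9, 1.3, 1.2, 0.4, 0.1],
        index=pd.date_range("2003-06-01", periods=6),
        name="trifluralin_ppm",
    )
    BLUEGILL_96H_PPM = 0.127        # hypothetical 96-h toxicity threshold

    window = conc.rolling("4D").mean()          # 96-hour exposure window
    exceed = window > BLUEGILL_96H_PPM
    print(window.round(3))
    print("days exceeding threshold:", int(exceed.sum()))
    ```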

  17. Geomorphic and human influence on large-scale coastal change

    USGS Publications Warehouse

    Hapke, Cheryl J.; Kratzmann, Meredith G.; Himmelstoss, Emily A.

    2013-01-01

    An increasing need exists for regional-scale measurements of shoreline change to aid in management and planning decisions over a broad portion of the coast and to inform assessments of coastal vulnerabilities and hazards. A recent dataset of regional shoreline change, covering a large portion of the U.S. East coast (New England and Mid-Atlantic), provides rates of shoreline change over historical (~150 years) and recent (25–30 years) time periods, making it ideal for a broad assessment of the regional variation of shoreline change and of the natural and human-induced influences on coastal behavior. The variable coastal landforms of the region provide an opportunity to investigate how specific geomorphic landforms relate to the spatial variability of shoreline change. In addition to natural influences on the rates of change, we examine the effects that development and human modifications to the coastline have on measurements of regional shoreline change. Regional variation in the rates of shoreline change is a function of the dominant type and distribution of coastal landforms as well as the relative amount of human development. Our results indicate that geomorphology has a measurable influence on shoreline change rates. Anthropogenic impacts are greater along the more densely developed and modified portion of the coast, where jetties at engineered inlets impound large volumes of sediment, resulting in extreme but discrete progradation updrift of the jetties. This produces a shift in averaged rate values that may mask the natural long-term record. Additionally, a strong correlation exists between rates of shoreline change and the relative level of human development. Using a geomorphic characterization of the types of coastal landform as a guide for expected relative rates of change, we found that the shoreline appears to be changing naturally only along sparsely developed coasts. Even modest amounts of development influence the rates of change

  18. Domain-Adapted Convolutional Networks for Satellite Image Classification: A Large-Scale Interactive Learning Workflow

    DOE PAGES

    Lunga, Dalton D.; Yang, Hsiuhan Lexie; Reith, Andrew E.; ...

    2018-02-06

    Satellite imagery often covers large spatial extents that encompass object classes with considerable variability, which limits large-scale model generalization with machine learning algorithms. Notably, acquisition conditions, including dates, sensor position, lighting, and sensor types, often translate into class distribution shifts that introduce complex nonlinear factors and hamper the potential impact of machine learning classifiers. This article investigates the challenge of exploiting satellite images using convolutional neural networks (CNNs) for settlement classification where the class distribution shifts are significant. We present a large-scale human settlement mapping workflow based on multiple modules that adapt a pretrained CNN to address the negative impact of distribution shift on classification performance. To extend a locally trained classifier onto large spatial extents, we introduce several submodules: first, a human-in-the-loop element for relabeling misclassified target-domain samples to generate representative examples for model adaptation; second, an efficient hashing module to minimize redundancy and noisy samples among the mass-selected examples; and third, a novel relevance ranking module to minimize the dominance of source examples on the target domain. The workflow presents a novel and practical approach to achieving large-scale domain adaptation with binary classifiers based on CNN features. Experimental evaluations are conducted on areas of interest that encompass various image characteristics, including multisensor, multitemporal, and multiangular conditions. Domain adaptation is assessed on source–target pairs through the transfer loss and transfer ratio metrics to illustrate the utility of the workflow.
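
    One way to picture the relevance-ranking submodule: score source-domain examples by the distance of their CNN features to the target-domain centroid and keep the most target-like ones. The feature dimensions, score, and selection size are assumptions, not the authors' module.

    ```python
    # Rank source examples by feature-space distance to the target centroid
    # so adaptation is not dominated by unrepresentative source samples.
    import numpy as np

    rng = np.random.default_rng(0)
    source_feats = rng.normal(0.0, 1.0, size=(1000, 128))   # pretrained CNN features
    target_feats = rng.normal(0.5, 1.0, size=(200, 128))

    centroid = target_feats.mean(axis=0)
    dist = np.linalg.norm(source_feats - centroid, axis=1)
    keep = np.argsort(dist)[:250]        # keep the most target-like source samples
    print("selected source examples:", keep[:10])
    ```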

  19. Domain-Adapted Convolutional Networks for Satellite Image Classification: A Large-Scale Interactive Learning Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lunga, Dalton D.; Yang, Hsiuhan Lexie; Reith, Andrew E.

    Satellite imagery often covers large spatial extents that encompass object classes with considerable variability, which limits large-scale model generalization with machine learning algorithms. Notably, acquisition conditions, including dates, sensor position, lighting, and sensor types, often translate into class distribution shifts that introduce complex nonlinear factors and hamper the potential impact of machine learning classifiers. This article investigates the challenge of exploiting satellite images using convolutional neural networks (CNNs) for settlement classification where the class distribution shifts are significant. We present a large-scale human settlement mapping workflow based on multiple modules that adapt a pretrained CNN to address the negative impact of distribution shift on classification performance. To extend a locally trained classifier onto large spatial extents, we introduce several submodules: first, a human-in-the-loop element for relabeling misclassified target-domain samples to generate representative examples for model adaptation; second, an efficient hashing module to minimize redundancy and noisy samples among the mass-selected examples; and third, a novel relevance ranking module to minimize the dominance of source examples on the target domain. The workflow presents a novel and practical approach to achieving large-scale domain adaptation with binary classifiers based on CNN features. Experimental evaluations are conducted on areas of interest that encompass various image characteristics, including multisensor, multitemporal, and multiangular conditions. Domain adaptation is assessed on source–target pairs through the transfer loss and transfer ratio metrics to illustrate the utility of the workflow.

  20. Translating chimpanzee personality to humans: Investigating the transportability of chimpanzee-derived personality scales to humans.

    PubMed

    Latzman, Robert D; Sauvigné, Katheryn C; Hopkins, William D

    2016-06-01

    There is growing interest in the study of personality in chimpanzees, with repeated findings of a structure of personality in apes similar to that found in humans. To date, however, the direct translational value to humans of instruments used to assess chimpanzee personality has yet to be explicitly tested. As such, in the current study we sought to determine the transportability of factor-analytically derived chimpanzee personality scales to humans in a large human sample (N = 301). Human informants reporting on target individuals they knew well completed chimpanzee-derived and human-derived measures of personality from the two most widely studied models of human personality: Big Five and Big Three. The correspondence between informant-reported chimpanzee- and human-derived personality scales was then investigated. Results indicated high convergence for corresponding scales across most chimpanzee- and human-derived personality scales. Findings from the current study provide evidence that chimpanzee-derived scales translate well to humans and operate quite similarly to established human-derived personality scales in a human sample. This evidence of transportability lends support to the translational nature of chimpanzee personality research, suggesting clear relevance of this growing literature to humans. Am. J. Primatol. 78:601-609, 2016.

  1. Large-scale production and properties of human plasma-derived activated Factor VII concentrate.

    PubMed

    Tomokiyo, K; Yano, H; Imamura, M; Nakano, Y; Nakagaki, T; Ogata, Y; Terano, T; Miyamoto, S; Funatsu, A

    2003-01-01

    An activated Factor VII (FVIIa) concentrate prepared from human plasma on a large scale has to date not been available for clinical use for haemophiliacs with antibodies against FVIII and FIX. In the present study, we attempted to establish a large-scale manufacturing process to obtain plasma-derived FVIIa concentrate with high recovery and safety, and to characterize its biochemical and biological properties. FVII was purified from human cryoprecipitate-poor plasma by a combination of anion-exchange and immunoaffinity chromatography, using a Ca2+-dependent anti-FVII monoclonal antibody. To activate FVII, a FVII preparation that had been nanofiltered using a Bemberg Microporous Membrane-15 nm was partially converted to FVIIa by autoactivation on an anion-exchange resin. The residual FVII in the FVII/FVIIa mixture was completely activated by further incubating the mixture in the presence of Ca2+ for 18 h at 10 °C, without any additional activators. For preparation of the FVIIa concentrate, after dialysis of FVIIa against 20 mM citrate, pH 6.9, containing 13 mM glycine and 240 mM NaCl, the FVIIa preparation was supplemented with 2.5% human albumin (first pasteurized at 60 °C for 10 h) and lyophilized in vials. To inactivate viruses contaminating the FVIIa concentrate, the lyophilized product was further heated at 65 °C for 96 h in a water bath. Total recovery of FVII from 15 000 L of plasma was approximately 40%, and the FVII preparation was fully converted to FVIIa with trace amounts of degraded products (FVIIabeta and FVIIagamma). The specific activity of the FVIIa was approximately 40 U/µg. Furthermore, virus-spiking tests demonstrated that immunoaffinity chromatography, nanofiltration, and dry-heating effectively removed and inactivated the spiked viruses in the FVIIa. These results indicated that the FVIIa concentrate had both high specific activity and safety. We established a large-scale manufacturing process of human plasma

  2. Centrifuge impact cratering experiments: Scaling laws for non-porous targets

    NASA Technical Reports Server (NTRS)

    Schmidt, Robert M.

    1987-01-01

    A geotechnical centrifuge was used to investigate large body impacts onto planetary surfaces. At elevated gravity, it is possible to match various dimensionless similarity parameters which were shown to govern large scale impacts. Observations of crater growth and target flow fields have provided detailed and critical tests of a complete and unified scaling theory for impact cratering. Scaling estimates were determined for nonporous targets. Scaling estimates for large scale cratering in rock proposed previously by others have assumed that the crater radius is proportional to powers of the impactor energy and gravity, with no additional dependence on impact velocity. The size scaling laws determined from ongoing centrifuge experiments differ from earlier ones in three respects. First, a distinct dependence of impact velocity is recognized, even for constant impactor energy. Second, the present energy exponent for low porosity targets, like competent rock, is lower than earlier estimates. Third, the gravity exponent is recognized here as being related to both the energy and the velocity exponents.

  3. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  4. The co-evolution of social institutions, demography, and large-scale human cooperation.

    PubMed

    Powers, Simon T; Lehmann, Laurent

    2013-11-01

    Human cooperation is typically coordinated by institutions, which determine the outcome structure of the social interactions individuals engage in. Explaining the Neolithic transition from small- to large-scale societies involves understanding how these institutions co-evolve with demography. We study this using a demographically explicit model of institution formation in a patch-structured population. Each patch supports both social and asocial niches. Social individuals create an institution, at a cost to themselves, by negotiating how much of the costly public good provided by cooperators is invested into sanctioning defectors. The remainder of their public good is invested in technology that increases carrying capacity, such as irrigation systems. We show that social individuals can invade a population of asocials, and form institutions that support high levels of cooperation. We then demonstrate conditions where the co-evolution of cooperation, institutions, and demographic carrying capacity creates a transition from small- to large-scale social groups. © 2013 John Wiley & Sons Ltd/CNRS.

  5. Skin Friction Reduction Through Large-Scale Forcing

    NASA Astrophysics Data System (ADS)

    Bhatt, Shibani; Artham, Sravan; Gnanamanickam, Ebenezer

    2017-11-01

    Flow structures in a turbulent boundary layer larger than an integral length scale (δ), referred to as large scales, interact with the finer scales in a non-linear manner. By targeting these large scales and exploiting this non-linear interaction, wall shear stress (WSS) reductions of over 10% have been achieved. The plane wall jet (PWJ), a boundary layer with highly energetic large scales that become turbulent independently of the near-wall finer scales, is the chosen model flow field. Its unique configuration allows the independent control of the large scales through acoustic forcing. Perturbation wavelengths from about 1 δ to 14 δ were considered, with a reduction in WSS for all wavelengths considered. This reduction, over a large subset of the wavelengths, scales with both inner and outer variables, indicating mixed scaling in the underlying physics, while also showing dependence on the PWJ global properties. A triple decomposition of the velocity fields shows an increase in coherence due to forcing, with a clear organization of the small-scale turbulence with respect to the introduced large scale. The maximum reduction in WSS occurs when the introduced large scale acts so as to reduce turbulent activity in the very-near-wall region. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-16-1-0194, monitored by Dr. Douglas Smith.

  6. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. The strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but often subjective and makeshift. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data from any instrument or any experimental condition.
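
    A minimal sketch of the probabilistic scoring idea, assuming a logistic regression over simple peak-quality features whose posterior probability yields the odds ratio used as a score; the features and training data are invented.

    ```python
    # Logistic-regression score for whether a peak group is the intended
    # metabolite (conceptual stand-in for MRMPROBS's optimized model).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    # features: [retention-time error, transition-ratio error, peak-shape score]
    X_true = rng.normal([0.1, 0.1, 0.9], 0.1, size=(200, 3))
    X_noise = rng.normal([0.6, 0.5, 0.4], 0.2, size=(200, 3))
    X = np.vstack([X_true, X_noise])
    y = np.array([1] * 200 + [0] * 200)

    model = LogisticRegression().fit(X, y)
    p = model.predict_proba([[0.12, 0.15, 0.85]])[0, 1]
    print(f"P(correct) = {p:.3f}, odds = {p / (1 - p):.1f}")
    ```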

  7. Large-Scale Off-Target Identification Using Fast and Accurate Dual Regularized One-Class Collaborative Filtering and Its Application to Drug Repurposing.

    PubMed

    Lim, Hansaim; Poleksic, Aleksandar; Yao, Yuan; Tong, Hanghang; He, Di; Zhuang, Luke; Meng, Patrick; Xie, Lei

    2016-10-01

    Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. Off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs and will provide new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one we propose here or computationally too intensive, limiting their capability for large-scale off-target identification. In addition, the performance of most machine-learning-based algorithms has mainly been evaluated on predicting off-target interactions within the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in detecting off-targets across gene families on a proteome scale. Here, we present a fast and accurate off-target prediction method, REMAP, based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested on a reliable, extensive, cross-gene-family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable: it can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence. Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and
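
    A compact sketch of the model family REMAP belongs to: weighted one-class matrix factorization with L2 regularization (REMAP additionally regularizes with chemical- and protein-similarity graphs). The gradient loop below is illustrative, not the REMAP solver.

    ```python
    # Weighted one-class matrix factorization over a chemical-protein
    # interaction matrix; unobserved pairs get a low confidence weight.
    import numpy as np

    rng = np.random.default_rng(5)
    n_chem, n_prot, k = 50, 40, 8
    R = (rng.random((n_chem, n_prot)) < 0.05).astype(float)  # known interactions
    W = np.where(R > 0, 1.0, 0.1)        # low weight on unobserved "zeros"
    U = rng.normal(0, 0.1, (n_chem, k))
    V = rng.normal(0, 0.1, (n_prot, k))
    lam, lr = 0.1, 0.05

    for _ in range(200):
        E = W * (U @ V.T - R)            # weighted reconstruction error
        U -= lr * (E @ V + lam * U)      # gradient steps with L2 regularization
        V -= lr * (E.T @ U + lam * V)    # (REMAP adds similarity-graph terms here)

    scores = U @ V.T                     # rank unobserved pairs as candidate off-targets
    best = np.unravel_index(np.argmax(np.where(R == 0, scores, -np.inf)), R.shape)
    print("top predicted new pair:", best)
    ```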

  8. Large-Scale Off-Target Identification Using Fast and Accurate Dual Regularized One-Class Collaborative Filtering and Its Application to Drug Repurposing

    PubMed Central

    Poleksic, Aleksandar; Yao, Yuan; Tong, Hanghang; Meng, Patrick; Xie, Lei

    2016-01-01

    Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. Off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs and will provide new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one we propose here or computationally too intensive, limiting their capability for large-scale off-target identification. In addition, the performance of most machine-learning-based algorithms has mainly been evaluated on predicting off-target interactions within the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in detecting off-targets across gene families on a proteome scale. Here, we present a fast and accurate off-target prediction method, REMAP, based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested on a reliable, extensive, cross-gene-family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable: it can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence. Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and

  9. The causality analysis of climate change and large-scale human crisis.

    PubMed

    Zhang, David D; Lee, Harry F; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-10-18

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500-1800 in Europe. Results show that cooling from A.D. 1560-1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined "golden" and "dark" ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere.
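
    Causal-linkage questions of this kind are often probed with Granger-style tests: does adding lagged temperature improve prediction of an economic series? The sketch below runs statsmodels' grangercausalitytests on synthetic data; the lag structure and series are assumptions, not the paper's dataset.

    ```python
    # Granger-style test on synthetic data: temperature leads an "economic"
    # series by 5 steps. Column order for statsmodels is [effect, cause].
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(6)
    n = 300
    temp = rng.normal(size=n).cumsum() * 0.01                 # synthetic temperature
    econ = np.roll(temp, 5) + rng.normal(scale=0.05, size=n)  # lagged response

    results = grangercausalitytests(np.column_stack([econ, temp]),
                                    maxlag=6, verbose=False)  # verbose=False silences printing
    print({lag: round(res[0]["ssr_ftest"][1], 4) for lag, res in results.items()})
    ```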

  10. Heritability maps of human face morphology through large-scale automated three-dimensional phenotyping

    NASA Astrophysics Data System (ADS)

    Tsagkrasoulis, Dimosthenis; Hysi, Pirro; Spector, Tim; Montana, Giovanni

    2017-04-01

    The human face is a complex trait under strong genetic control, as evidenced by the striking visual similarity between twins. Nevertheless, heritability estimates of facial traits have often been surprisingly low or difficult to replicate. Furthermore, the construction of facial phenotypes that correspond to naturally perceived facial features remains largely a mystery. We present here a large-scale heritability study of face geometry that aims to address these issues. High-resolution, three-dimensional facial models have been acquired on a cohort of 952 twins recruited from the TwinsUK registry, and processed through a novel landmarking workflow, GESSA (Geodesic Ensemble Surface Sampling Algorithm). The algorithm places thousands of landmarks throughout the facial surface and automatically establishes point-wise correspondence across faces. These landmarks enabled us to intuitively characterize facial geometry at a fine level of detail through curvature measurements, yielding accurate heritability maps of the human face (www.heritabilitymaps.info).
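
    For intuition, twin-based heritability is classically estimated with Falconer's formula, h^2 = 2 * (r_MZ - r_DZ), applied per trait (here, per landmark curvature measurement); the simulated data below are a conceptual stand-in for the study's pipeline.

    ```python
    # Falconer heritability estimate from MZ and DZ twin correlations.
    import numpy as np

    def falconer_h2(mz_pairs, dz_pairs):
        """Each argument: (n_pairs, 2) array of a trait measured in both twins."""
        r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
        r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
        return 2.0 * (r_mz - r_dz)

    rng = np.random.default_rng(7)
    n = 500
    a = rng.normal(0, np.sqrt(0.6), n)                 # additive genetic part, Var = 0.6
    mz = np.column_stack([a + rng.normal(0, np.sqrt(0.4), n) for _ in range(2)])
    a_c = rng.normal(0, np.sqrt(0.3), n)               # DZ twins share half the genetic variance
    dz = np.column_stack([a_c + rng.normal(0, np.sqrt(0.7), n) for _ in range(2)])
    print(f"h2 ~ {falconer_h2(mz, dz):.2f}")           # expected around 0.6
    ```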

  11. Large-scale metabolite analysis of standards and human serum by laser desorption ionization mass spectrometry from silicon nanopost arrays

    DOE PAGES

    Korte, Andrew R.; Stopka, Sylwia A.; Morris, Nicholas; ...

    2016-07-11

    The unique challenges presented by metabolomics have driven the development of new mass spectrometry (MS)-based techniques for small-molecule analysis. We have previously demonstrated silicon nanopost arrays (NAPA) to be an effective substrate for laser desorption ionization (LDI) of small molecules for MS. However, the utility of NAPA-LDI-MS for a wide range of metabolite classes has not been investigated. Here we apply NAPA-LDI-MS to the large-scale acquisition of high-resolution mass spectra and tandem mass spectra from a collection of metabolite standards covering a range of compound classes including amino acids, nucleotides, carbohydrates, xenobiotics, lipids, and others. In untargeted analysis of metabolite standard mixtures, detection was achieved for 374 compounds and useful MS/MS spectra were obtained for 287 compounds, without individual optimization of ionization or fragmentation conditions. Metabolite detection was evaluated in the context of 31 metabolic pathways, and NAPA-LDI-MS was found to provide detection for 63% of the investigated pathway metabolites. Individual, targeted analysis of the 20 common amino acids provided detection of 100% of the investigated compounds, demonstrating that improved coverage is possible through optimization and targeting of individual analytes or analyte classes. In direct analysis of aqueous and organic extracts from human serum samples, spectral features were assigned to a total of 108 small metabolites and lipids. Glucose and amino acids were quantitated within their physiological concentration ranges. Finally, the broad coverage demonstrated by this large-scale screening experiment opens the door for the use of NAPA-LDI-MS in numerous metabolite analysis applications.
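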

  12. Large-scale metabolite analysis of standards and human serum by laser desorption ionization mass spectrometry from silicon nanopost arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korte, Andrew R.; Stopka, Sylwia A.; Morris, Nicholas

    The unique challenges presented by metabolomics have driven the development of new mass spectrometry (MS)-based techniques for small-molecule analysis. We have previously demonstrated silicon nanopost arrays (NAPA) to be an effective substrate for laser desorption ionization (LDI) of small molecules for MS. However, the utility of NAPA-LDI-MS for a wide range of metabolite classes has not been investigated. Here we apply NAPA-LDI-MS to the large-scale acquisition of high-resolution mass spectra and tandem mass spectra from a collection of metabolite standards covering a range of compound classes including amino acids, nucleotides, carbohydrates, xenobiotics, lipids, and others. In untargeted analysis of metabolite standard mixtures, detection was achieved for 374 compounds and useful MS/MS spectra were obtained for 287 compounds, without individual optimization of ionization or fragmentation conditions. Metabolite detection was evaluated in the context of 31 metabolic pathways, and NAPA-LDI-MS was found to provide detection for 63% of the investigated pathway metabolites. Individual, targeted analysis of the 20 common amino acids provided detection of 100% of the investigated compounds, demonstrating that improved coverage is possible through optimization and targeting of individual analytes or analyte classes. In direct analysis of aqueous and organic extracts from human serum samples, spectral features were assigned to a total of 108 small metabolites and lipids. Glucose and amino acids were quantitated within their physiological concentration ranges. Finally, the broad coverage demonstrated by this large-scale screening experiment opens the door for the use of NAPA-LDI-MS in numerous metabolite analysis applications.

  13. Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.

    PubMed

    Ernst, Jason; Kellis, Manolis

    2015-04-01

    With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.
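
    The imputation idea can be sketched as training a regression-tree ensemble to predict a held-out signal track from correlated tracks (other marks, other samples); the toy features below stand in for ChromImpute's actual feature set.

    ```python
    # Regression-tree imputation of an epigenomic signal track from
    # correlated observed tracks (toy stand-in for ChromImpute).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(8)
    n_bins = 5000
    base = rng.gamma(2.0, 1.0, n_bins)                 # shared chromatin activity
    mark_a = base + rng.normal(0, 0.3, n_bins)         # observed mark, sample 1
    mark_b = 0.8 * base + rng.normal(0, 0.3, n_bins)   # observed mark, sample 2
    target = 1.2 * base + rng.normal(0, 0.3, n_bins)   # mark to impute

    X = np.column_stack([mark_a, mark_b])
    train = rng.random(n_bins) < 0.5                   # bins where target is "observed"
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X[train], target[train])
    imputed = model.predict(X[~train])
    print("corr(imputed, truth) =",
          round(np.corrcoef(imputed, target[~train])[0, 1], 3))
    ```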

  14. Modeling large-scale human alteration of land surface hydrology and climate

    NASA Astrophysics Data System (ADS)

    Pokhrel, Yadu N.; Felfelani, Farshid; Shin, Sanghoon; Yamada, Tomohito J.; Satoh, Yusuke

    2017-12-01

    Rapidly expanding human activities have profoundly affected various biophysical and biogeochemical processes of the Earth system over a broad range of scales, and freshwater systems are now amongst the most extensively altered ecosystems. In this study, we examine the human-induced changes in land surface water and energy balances and the associated climate impacts using a coupled hydrological-climate model framework that also simulates the impacts of human activities on the water cycle. We present three sets of analyses using the results from two model versions, one with and the other without human activities; both versions are run in offline and coupled mode, resulting in four experiments in total. First, we examine climate- and human-induced changes in regional water balance, focusing on the widely debated desiccation of the Aral Sea in central Asia. Then, we discuss the changes in surface temperature resulting from irrigation-induced changes in the land surface energy balance over global and regional scales. Finally, we examine the global and regional climate impacts of the increased atmospheric water vapor content due to irrigation. Results indicate that the direct anthropogenic alteration of river flow in the Aral Sea basin resulted in the loss of 510 km³ of water during the latter half of the twentieth century, which explains about half of the total loss of water from the sea. Results for irrigation-induced changes in the surface energy balance suggest significant surface cooling of up to 3.3 K over 1° grids in highly irrigated areas but a negligible change in land surface temperature when averaged over sufficiently large global regions. Results from the coupled model indicate a substantial change in 2 m air temperature and outgoing longwave radiation due to irrigation, highlighting the non-local (regional and global) implications of irrigation. These results provide important insights on the direct human alteration of land surface

  15. Centrifuge impact cratering experiments: Scaling laws for non-porous targets

    NASA Technical Reports Server (NTRS)

    Schmidt, Robert M.

    1987-01-01

    This research is a continuation of an ongoing program whose objective is to perform experiments and to develop scaling relationships for large body impacts onto planetary surfaces. The development of the centrifuge technique has been pioneered by the present investigator and is used to provide experimental data for actual target materials of interest. With both powder and gas guns mounted on a rotor arm, it is possible to match various dimensionless similarity parameters, which have been shown to govern the behavior of large scale impacts. Current work is directed toward the determination of scaling estimates for nonporous targets. The results are presented in summary form.

  16. The causality analysis of climate change and large-scale human crisis

    PubMed Central

    Zhang, David D.; Lee, Harry F.; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-01-01

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500–1800 in Europe. Results show that cooling from A.D. 1560–1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined “golden” and “dark” ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere. PMID:21969578

  17. Assessing Human Modifications to Floodplains using Large-Scale Hydrogeomorphic Floodplain Modeling

    NASA Astrophysics Data System (ADS)

    Morrison, R. R.; Scheel, K.; Nardi, F.; Annis, A.

    2017-12-01

    Human modifications to floodplains for water resource and flood management purposes have significantly transformed river-floodplain connectivity dynamics in many watersheds. Bridges, levees, reservoirs, shifts in land use, and other hydraulic engineering works have altered flow patterns and caused changes in the timing and extent of floodplain inundation processes. These hydrogeomorphic changes have likely resulted in negative impacts to aquatic habitat and ecological processes. The availability of large-scale, high-resolution topographic datasets provides an opportunity for detecting anthropogenic impacts by means of geomorphic mapping. We have developed and are implementing a methodology for comparing a hydrogeomorphic floodplain mapping technique to hydraulically-modeled floodplain boundaries to estimate floodplain loss due to human activities. Our hydrogeomorphic mapping methodology assumes that river valley morphology intrinsically includes information on flood-driven erosion and depositional phenomena. We use a digital elevation model-based algorithm to identify the floodplain as the area of the fluvial corridor lying below water reference levels, which are estimated using a simplified hydrologic model. Results from our hydrogeomorphic method are compared to hydraulically-derived flood zone maps and spatial datasets of levee-protected areas to explore where water management features, such as levees, have changed floodplain dynamics and landscape features. Parameters associated with commonly used F-index functions are quantified and analyzed to better understand how floodplain areas have been reduced within a basin. Preliminary results indicate that the hydrogeomorphic floodplain model is useful for quickly delineating floodplains at large watershed scales, but further analyses are needed to understand the caveats for using the model in determining floodplain loss due to levees. We plan to continue this work by exploring the spatial dependencies of the F-index.
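    The delineation step described above, flagging fluvial-corridor terrain that lies below a water reference level, reduces to a raster comparison once the reference levels are estimated. A minimal sketch under that assumption (array names are illustrative, not the authors' code):

```python
import numpy as np

def hydrogeomorphic_floodplain(dem, water_reference, corridor_mask):
    """Flag cells belonging to the hydrogeomorphic floodplain.

    dem:             2-D array of ground elevations (m)
    water_reference: 2-D array of reference water-surface elevations (m),
                     e.g. estimated from a simplified hydrologic model
    corridor_mask:   boolean 2-D array restricting the test to the fluvial corridor
    """
    return corridor_mask & (dem <= water_reference)

# Toy example: a 4x4 valley tested against a uniform 2 m reference level
dem = np.array([[5, 3, 1, 3],
                [5, 2, 1, 2],
                [5, 3, 1, 3],
                [5, 4, 2, 4]], dtype=float)
water = np.full_like(dem, 2.0)
corridor = np.ones_like(dem, dtype=bool)
print(hydrogeomorphic_floodplain(dem, water, corridor).astype(int))
```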

  18. Knowledge-Guided Robust MRI Brain Extraction for Diverse Large-Scale Neuroimaging Studies on Humans and Non-Human Primates

    PubMed Central

    Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang

    2014-01-01

    Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for large-scale multi-site neuroimaging studies involving a significant number of subjects across diverse age and diagnostic groups, automatic, accurate, and consistent brain extraction is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of the ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnostic groups), the OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnostic groups), and the NIH pediatric dataset with 150 subjects (5∼18 years of age, multi-site, wide age range, a complementary age group to the adult datasets). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method was further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popularly used state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably, with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639

  19. Value-focused framework for defining landscape-scale conservation targets

    USGS Publications Warehouse

    Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.

    2016-01-01

    Conservation of natural resources can be challenging in a rapidly changing world and requires collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance or minimize loss of natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can support conservation planning from the site-specific to the landscape scale. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context, and also to anticipate the benefits of multi-agency and -organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process, defining the conservation targets themselves, and developing management and monitoring strategies for evaluating their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.

  20. Gene targeting by TALEN-induced homologous recombination in goats directs production of β-lactoglobulin-free, high-human lactoferrin milk

    PubMed Central

    Cui, Chenchen; Song, Yujie; Liu, Jun; Ge, Hengtao; Li, Qian; Huang, Hui; Hu, Linyong; Zhu, Hongmei; Jin, Yaping; Zhang, Yong

    2015-01-01

    β-Lactoglobulin (BLG) is a major goat’s milk allergen that is absent in human milk. Engineered endonucleases, including transcription activator-like effector nucleases (TALENs) and zinc-finger nucleases, enable targeted genetic modification in livestock. In this study, TALEN-mediated gene knockout followed by gene knock-in were used to generate BLG knockout goats as mammary gland bioreactors for large-scale production of human lactoferrin (hLF). We introduced precise genetic modifications in the goat genome at frequencies of approximately 13.6% and 6.09% for the first and second sequential targeting, respectively, by using targeting vectors that underwent TALEN-induced homologous recombination (HR). Analysis of milk from the cloned goats revealed large-scale hLF expression and/or decreased BLG levels in milk from heterozygous goats, as well as the absence of BLG in milk from homozygous goats. Furthermore, the TALEN-mediated targeting events in somatic cells can be transmitted through the germline after somatic cell nuclear transfer (SCNT). Our results suggest that gene targeting via TALEN-induced HR may expedite the production of genetically engineered livestock for agriculture and biomedicine. PMID:25994151

  1. Gene targeting by TALEN-induced homologous recombination in goats directs production of β-lactoglobulin-free, high-human lactoferrin milk.

    PubMed

    Cui, Chenchen; Song, Yujie; Liu, Jun; Ge, Hengtao; Li, Qian; Huang, Hui; Hu, Linyong; Zhu, Hongmei; Jin, Yaping; Zhang, Yong

    2015-05-21

    β-Lactoglobulin (BLG) is a major goat's milk allergen that is absent in human milk. Engineered endonucleases, including transcription activator-like effector nucleases (TALENs) and zinc-finger nucleases, enable targeted genetic modification in livestock. In this study, TALEN-mediated gene knockout followed by gene knock-in were used to generate BLG knockout goats as mammary gland bioreactors for large-scale production of human lactoferrin (hLF). We introduced precise genetic modifications in the goat genome at frequencies of approximately 13.6% and 6.09% for the first and second sequential targeting, respectively, by using targeting vectors that underwent TALEN-induced homologous recombination (HR). Analysis of milk from the cloned goats revealed large-scale hLF expression and/or decreased BLG levels in milk from heterozygous goats, as well as the absence of BLG in milk from homozygous goats. Furthermore, the TALEN-mediated targeting events in somatic cells can be transmitted through the germline after somatic cell nuclear transfer (SCNT). Our results suggest that gene targeting via TALEN-induced HR may expedite the production of genetically engineered livestock for agriculture and biomedicine.

  2. New strategy for drug discovery by large-scale association analysis of molecular networks of different species.

    PubMed

    Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua

    2016-02-25

    The development of modern omics technology has not significantly improved the efficiency of drug development; precise and targeted drug discovery remains an unsolved problem. Here, a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner by which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.

  3. Food security through large scale investments in agriculture

    NASA Astrophysics Data System (ADS)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  4. Assessing large-scale wildlife responses to human infrastructure development

    PubMed Central

    Torres, Aurora; Jaeger, Jochen A. G.; Alonso, Juan Carlos

    2016-01-01

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future. PMID:27402749

  5. Assessing large-scale wildlife responses to human infrastructure development.

    PubMed

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.

  6. Balancing selfishness and norm conformity can explain human behavior in large-scale prisoner's dilemma games and can poise human groups near criticality

    NASA Astrophysics Data System (ADS)

    Realpe-Gómez, John; Andrighetto, Giulia; Nardin, Luis Gustavo; Montoya, Javier Antonio

    2018-04-01

    Cooperation is central to the success of human societies as it is crucial for overcoming some of the most pressing social challenges of our time; still, how human cooperation is achieved and may persist is a main puzzle in the social and biological sciences. Recently, scholars have recognized the importance of social norms as solutions to major local and large-scale collective action problems, from the management of water resources to the reduction of smoking in public places to the change in fertility practices. Yet a well-founded model of the effect of social norms on human cooperation is still lacking. Using statistical-physics techniques and integrating findings from cognitive and behavioral sciences, we present an analytically tractable model in which individuals base their decisions to cooperate both on the economic rewards they obtain and on the degree to which their action complies with social norms. Results from this parsimonious model are in agreement with observations in recent large-scale experiments with humans. We also find the phase diagram of the model and show that the experimental human group is poised near a critical point, a regime where recent work suggests living systems respond to changing external conditions in an efficient and coordinated manner.
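    The decision rule summarized above, in which agents weigh material payoff against norm compliance, can be illustrated with a simple logit-choice sketch. The mixing weight beta, the payoffs, and the use of the observed cooperation frequency as the norm signal are illustrative assumptions, not the authors' calibrated model:

```python
import math

def cooperate_probability(payoff_coop, payoff_defect, norm_drive, beta, temperature=1.0):
    """Logit probability of cooperating when utility mixes material payoff
    with a normative term:
        utility(a) = (1 - beta) * payoff(a) + beta * norm(a)
    norm(cooperate) = norm_drive in [0, 1] (e.g., the observed cooperation
    frequency standing in for the social norm); norm(defect) = 1 - norm_drive.
    """
    u_coop = (1 - beta) * payoff_coop + beta * norm_drive
    u_defect = (1 - beta) * payoff_defect + beta * (1 - norm_drive)
    return 1.0 / (1.0 + math.exp(-(u_coop - u_defect) / temperature))

# A purely selfish agent (beta=0) vs. a strongly norm-driven one (beta=0.8)
for beta in (0.0, 0.8):
    p = cooperate_probability(payoff_coop=0.3, payoff_defect=1.0,
                              norm_drive=0.7, beta=beta)
    print(f"beta={beta}: P(cooperate) = {p:.2f}")
```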

  7. Demonstrating the feasibility of large-scale development of standardized assays to quantify human proteins

    PubMed Central

    Kennedy, Jacob J.; Abbatiello, Susan E.; Kim, Kyunggon; Yan, Ping; Whiteaker, Jeffrey R.; Lin, Chenwei; Kim, Jun Seok; Zhang, Yuzheng; Wang, Xianlong; Ivey, Richard G.; Zhao, Lei; Min, Hophil; Lee, Youngju; Yu, Myeong-Hee; Yang, Eun Gyeong; Lee, Cheolju; Wang, Pei; Rodriguez, Henry; Kim, Youngsoo; Carr, Steven A.; Paulovich, Amanda G.

    2014-01-01

    The successful application of MRM in biological specimens raises the exciting possibility that assays can be configured to measure all human proteins, resulting in an assay resource that would promote advances in biomedical research. We report the results of a pilot study designed to test the feasibility of a large-scale, international effort in MRM assay generation. We have configured, validated across three laboratories, and made publicly available as a resource to the community 645 novel MRM assays representing 319 proteins expressed in human breast cancer. Assays were multiplexed in groups of >150 peptides and deployed to quantify endogenous analyte in a panel of breast cancer-related cell lines. Median assay precision was 5.4%, with high inter-laboratory correlation (R2 >0.96). Peptide measurements in breast cancer cell lines were able to discriminate amongst molecular subtypes and identify genome-driven changes in the cancer proteome. These results establish the feasibility of a scaled, international effort. PMID:24317253

  8. Validating Bayesian truth serum in large-scale online human experiments.

    PubMed

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature about BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
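    For reference, Prelec's published BTS scoring rule combines an information score, rewarding answers that are more common than collectively predicted, with a prediction score. A minimal sketch of that standard formula (not code from this study):

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
    """Bayesian truth serum scores (Prelec's scoring rule).

    answers:     (n,) int array, each respondent's chosen option in [0, k)
    predictions: (n, k) array, each respondent's predicted population
                 frequencies for the k options (rows sum to 1)
    Score_r = log(xbar[a_r] / ybar[a_r])
              + alpha * sum_k xbar[k] * log(y_rk / xbar[k])
    where xbar = empirical answer frequencies and ybar = geometric mean
    of the respondents' predictions.
    """
    n, k = predictions.shape
    xbar = np.bincount(answers, minlength=k) / n            # actual frequencies
    ybar = np.exp(np.log(predictions + eps).mean(axis=0))   # geometric-mean predictions
    info = np.log((xbar[answers] + eps) / (ybar[answers] + eps))
    pred = alpha * (xbar * np.log((predictions + eps) / (xbar + eps))).sum(axis=1)
    return info + pred

# Four respondents, two options; hypothetical data for illustration
answers = np.array([0, 0, 1, 0])
predictions = np.array([[0.5, 0.5], [0.6, 0.4], [0.8, 0.2], [0.7, 0.3]])
print(bts_scores(answers, predictions).round(3))
```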

  9. Large scale tracking algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  10. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    DOE PAGES

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  11. Large-scale identification of target proteins of a glycosyltransferase isozyme by Lectin-IGOT-LC/MS, an LC/MS-based glycoproteomic approach

    PubMed Central

    Sugahara, Daisuke; Kaji, Hiroyuki; Sugihara, Kazushi; Asano, Masahide; Narimatsu, Hisashi

    2012-01-01

    Model organisms containing a deletion or mutation in a glycosyltransferase gene exhibit various physiological abnormalities, suggesting that specific glycan motifs on certain proteins play important roles in vivo. Identification of the target proteins of glycosyltransferase isozymes is the key to understanding the roles of glycans. Here, we demonstrated the proteome-scale identification of the target proteins specific for a glycosyltransferase isozyme, β1,4-galactosyltransferase-I (β4GalT-I). Although β4GalT-I is the best-characterized glycosyltransferase, its distinctive contribution to β1,4-galactosylation has hardly been described so far. We identified a large number of candidates for the target proteins specific to β4GalT-I by comparative analysis of β4GalT-I-deleted and wild-type mice using the LC/MS-based technique with isotope-coded glycosylation site-specific tagging (IGOT) of lectin-captured N-glycopeptides. Our approach to identifying the target proteins on a proteome scale highlights common features and trends in the target proteins, which facilitates understanding of the mechanism that controls assembly of a particular glycan motif on specific proteins. PMID:23002422

  12. Validating Bayesian truth serum in large-scale online human experiments

    PubMed Central

    Frank, Morgan R.; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method’s mathematical reliance on large sample sizes, the existing literature about BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon’s Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the “honest” distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where “honest” answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers. PMID:28494000

  13. Large-scale absence of sharks on reefs in the greater-Caribbean: a footprint of human pressures.

    PubMed

    Ward-Paige, Christine A; Mora, Camilo; Lotze, Heike K; Pattengill-Semmens, Christy; McClenachan, Loren; Arias-Castro, Ery; Myers, Ransom A

    2010-08-05

    In recent decades, large pelagic and coastal shark populations have declined dramatically with increased fishing; however, the status of sharks in other systems such as coral reefs remains largely unassessed despite a long history of exploitation. Here we explore the contemporary distribution and sighting frequency of sharks on reefs in the greater-Caribbean and assess the possible role of human pressures on observed patterns. We analyzed 76,340 underwater surveys carried out by trained volunteer divers between 1993 and 2008. Surveys were grouped within one-km2 cells, which allowed us to determine the contemporary geographical distribution and sighting frequency of sharks. Sighting frequency was calculated as the ratio of surveys with sharks to the total number of surveys in each cell. We compared sighting frequency to the number of people in the cell vicinity and used population viability analyses to assess the effects of exploitation on population trends. Sharks, with the exception of nurse sharks, occurred mainly in areas with very low human population or strong fishing regulations and marine conservation. Population viability analysis suggests that exploitation alone could explain the large-scale absence; however, this pattern is likely to be exacerbated by additional anthropogenic stressors, such as pollution and habitat degradation, that also correlate with human population. Human pressures in coastal zones have led to the broad-scale absence of sharks on reefs in the greater-Caribbean. Preventing further loss of sharks requires urgent management measures to curb fishing mortality and to mitigate other anthropogenic stressors to protect sites where sharks still exist. The fact that sharks still occur in some densely populated areas where strong fishing regulations are in place indicates the possibility of success and encourages the implementation of conservation measures.
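    The per-cell sighting-frequency metric defined above is a simple ratio; a minimal sketch of the aggregation, with hypothetical survey records, might look like this:

```python
from collections import defaultdict

def sighting_frequency(surveys):
    """surveys: iterable of (cell_id, sharks_seen: bool) records.
    Returns {cell_id: fraction of surveys in that cell reporting sharks}."""
    totals, with_sharks = defaultdict(int), defaultdict(int)
    for cell, seen in surveys:
        totals[cell] += 1
        with_sharks[cell] += bool(seen)
    return {cell: with_sharks[cell] / totals[cell] for cell in totals}

print(sighting_frequency([("A1", True), ("A1", False), ("A1", False), ("B2", True)]))
```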

  14. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  15. Age distribution of human gene families shows significant roles of both large- and small-scale duplications in vertebrate evolution.

    PubMed

    Gu, Xun; Wang, Yufeng; Gu, Jianying

    2002-06-01

    The classical (two-round) hypothesis of vertebrate genome duplication proposes two successive whole-genome duplication(s) (polyploidizations) predating the origin of fishes, a view now being seriously challenged. As the debate largely concerns the relative merits of the 'big-bang mode' theory (large-scale duplication) and the 'continuous mode' theory (constant creation by small-scale duplications), we tested whether a significant proportion of paralogous genes in the contemporary human genome was indeed generated in the early stage of vertebrate evolution. After an extensive search of major databases, we dated 1,739 gene duplication events from the phylogenetic analysis of 749 vertebrate gene families. We found a pattern characterized by two waves (I, II) and an ancient component. Wave I represents a recent gene family expansion by tandem or segmental duplications, whereas wave II, a rapid paralogous gene increase in the early stage of vertebrate evolution, supports the idea of genome duplication(s) (the big-bang mode). Further analysis indicated that large- and small-scale gene duplications both make a significant contribution during the early stage of vertebrate evolution to build the current hierarchy of the human proteome.

  16. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    PubMed

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of the Zipf's law and the Heaps' law motivates different understandings on the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: the Zipf's law and the Heaps' law are naturally shaped to coexist at the initial time, while the crossover comes with the emergence of their inconsistency at the larger time before reaching a stable state, where the Heaps' law still exists with the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating the coexistence of the Zipf's law and the Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
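    Both scalings are straightforward to measure from an event stream. The sketch below computes the rank-frequency (Zipf) curve and the distinct-count growth (Heaps) curve from a toy sequence of visited locations:

```python
from collections import Counter

def zipf_curve(events):
    """Rank-frequency pairs: Zipf's law predicts freq ~ rank**(-a)."""
    freqs = sorted(Counter(events).values(), reverse=True)
    return list(enumerate(freqs, start=1))

def heaps_curve(events):
    """Distinct-item count after each event: Heaps' law predicts N(t) ~ t**b."""
    seen, curve = set(), []
    for e in events:
        seen.add(e)
        curve.append(len(seen))
    return curve

events = ["NYC", "NYC", "LA", "NYC", "SF", "LA", "NYC", "SEA"]
print(zipf_curve(events))   # [(1, 4), (2, 2), (3, 1), (4, 1)]
print(heaps_curve(events))  # [1, 1, 2, 2, 3, 3, 3, 4]
```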

  17. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate a more quantitatively validated study for the current and future histopathology image analysis field.
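    With synthesized ground truth in hand, evaluation typically reduces to overlap metrics between predicted and true nucleus masks. A minimal Dice-coefficient sketch (a common choice for this purpose, not necessarily the authors' exact metric):

```python
import numpy as np

def dice_coefficient(pred_mask, true_mask, eps=1e-9):
    """Dice = 2|A ∩ B| / (|A| + |B|), computed on boolean masks."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    inter = np.logical_and(pred, true).sum()
    return 2.0 * inter / (pred.sum() + true.sum() + eps)

# Synthetic ground-truth nucleus vs. a slightly shifted prediction
true = np.zeros((32, 32), dtype=bool); true[10:20, 10:20] = True
pred = np.zeros((32, 32), dtype=bool); pred[12:22, 10:20] = True
print(f"Dice = {dice_coefficient(pred, true):.3f}")  # Dice = 0.800
```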

  18. Large scale analysis of signal reachability.

    PubMed

    Todor, Andrei; Gabr, Haitham; Dobra, Alin; Kahveci, Tamer

    2014-06-15

    Major disorders, such as leukemia, have been shown to alter the transcription of genes. Understanding how gene regulation is affected by such aberrations is of utmost importance. One promising strategy toward this objective is to compute whether signals can reach the transcription factors through the transcription regulatory network (TRN). Due to the uncertainty of the regulatory interactions, this is a #P-complete problem and thus solving it for very large TRNs remains a challenge. We develop a novel and scalable method to compute the probability that a signal originating at any given set of source genes can arrive at any given set of target genes (i.e., transcription factors) when the topology of the underlying signaling network is uncertain. Our method tackles this problem for large networks while providing a provably accurate result. Our method follows a divide-and-conquer strategy. We break down the given network into a sequence of non-overlapping subnetworks such that reachability can be computed autonomously and sequentially on each subnetwork. We represent each interaction using a small polynomial. The product of these polynomials expresses the different scenarios in which a signal can or cannot reach the target genes from the source genes. We introduce polynomial collapsing operators for each subnetwork. These operators reduce the size of the resulting polynomial, and thus the computational complexity, dramatically. We show that our method scales to entire human regulatory networks in only seconds, while existing methods fail beyond a few tens of genes and interactions. We demonstrate that our method can successfully characterize key reachability characteristics of the entire transcription regulatory networks of patients affected by eight different subtypes of leukemia, as well as those from healthy control samples. All the datasets and code used in this article are available at bioinformatics.cise.ufl.edu/PReach/scalable.htm.
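    The #P-completeness mentioned above can be made concrete with a brute-force baseline: enumerate every edge configuration of a small uncertain network and sum the probabilities of configurations in which a target is reachable from a source. The sketch below (gene names and probabilities are hypothetical) runs in O(2^|E|) time, which is exactly the blow-up the paper's divide-and-conquer and polynomial-collapsing machinery avoids:

```python
from itertools import product

def reach_probability(edges, sources, targets):
    """edges: list of (u, v, p) - interaction u->v present with probability p.
    Returns P(some target reachable from some source), by brute force."""
    total = 0.0
    for config in product([False, True], repeat=len(edges)):
        prob, adj = 1.0, {}
        for present, (u, v, p) in zip(config, edges):
            prob *= p if present else (1.0 - p)
            if present:
                adj.setdefault(u, []).append(v)
        # Depth-first search from all sources in this edge configuration
        stack, seen = list(sources), set(sources)
        while stack:
            node = stack.pop()
            for nxt in adj.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        if seen & set(targets):
            total += prob
    return total

edges = [("sig", "g1", 0.9), ("g1", "tf1", 0.5), ("sig", "g2", 0.4), ("g2", "tf1", 0.7)]
print(f"{reach_probability(edges, ['sig'], ['tf1']):.4f}")  # 0.6040
```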

  19. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  20. Scale dependent behavioral responses to human development by a large predator, the puma.

    PubMed

    Wilmers, Christopher C; Wang, Yiwei; Nickel, Barry; Houghtaling, Paul; Shakeri, Yasaman; Allen, Maximilian L; Kermish-Wells, Joe; Yovovich, Veronica; Williams, Terrie

    2013-01-01

    The spatial scale at which organisms respond to human activity can affect both ecological function and conservation planning. Yet little is known regarding the spatial scale at which distinct behaviors related to reproduction and survival are impacted by human interference. Here we provide a novel approach to estimating the spatial scale at which a top predator, the puma (Puma concolor), responds to human development when it is moving, feeding, communicating, and denning. We find that reproductive behaviors (communication and denning) require at least a 4× larger buffer from human development than non-reproductive behaviors (movement and feeding). In addition, pumas give a wider berth to types of human development that provide a more consistent source of human interference (neighborhoods) than they do to those in which human presence is more intermittent (arterial roads with speeds >35 mph). Neighborhoods were a deterrent to pumas regardless of behavior, while arterial roads only deterred pumas when they were communicating and denning. Female pumas were less deterred by human development than males, but they showed larger variation in their responses overall. Our behaviorally explicit approach to modeling animal response to human activity can be used as a novel tool to assess habitat quality, identify wildlife corridors, and mitigate human-wildlife conflict.

  1. Neurogenomics and the role of a large mutational target on rapid behavioral change.

    PubMed

    Stanley, Craig E; Kulathinal, Rob J

    2016-11-08

    Behavior, while complex and dynamic, is among the most diverse, derived, and rapidly evolving traits in animals. The highly labile nature of heritable behavioral change is observed in such evolutionary phenomena as the emergence of converged behaviors in domesticated animals, the rapid evolution of preferences, and the routine development of ethological isolation between diverging populations and species. In fact, it is believed that nervous system development and its potential to evolve a seemingly infinite array of behavioral innovations played a major role in the successful diversification of metazoans, including our own human lineage. However, unlike other rapidly evolving functional systems such as sperm-egg interactions and immune defense, the genetic basis of rapid behavioral change remains elusive. Here we propose that the rapid divergence and widespread novelty of innate and adaptive behavior is primarily a function of its genomic architecture. Specifically, we hypothesize that the broad diversity of behavioral phenotypes present at micro- and macroevolutionary scales is promoted by a disproportionately large mutational target of neurogenic genes. We present evidence that these large neuro-behavioral targets are significant and ubiquitous in animal genomes and suggest that behavior's novelty and rapid emergence are driven by a number of factors including more selection on a larger pool of variants, a greater role of phenotypic plasticity, and/or unique molecular features present in large genes. We briefly discuss the origins of these large neurogenic genes, as they relate to the remarkable diversity of metazoan behaviors, and highlight key consequences on both behavioral traits and neurogenic disease across, respectively, evolutionary and ontogenetic time scales. Current approaches to studying the genetic mechanisms underlying rapid phenotypic change primarily focus on identifying signatures of Darwinian selection in protein-coding regions. In contrast

  2. Large-scale weakly supervised object localization via latent category learning.

    PubMed

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
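    The latent-category step described above rests on latent semantic analysis, which can be sketched with a truncated SVD over a region-by-visual-word count matrix. The tiny matrix below is a stand-in, not the paper's data or features:

```python
import numpy as np

def latent_categories(X, n_latent=2):
    """Latent semantic analysis via truncated SVD.

    X: (regions x visual-word) count matrix. Returns each region's loading
    on the n_latent strongest latent categories, which may correspond to
    objects, object parts, or backgrounds such as 'sky'."""
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_latent] * S[:n_latent]

# 4 image regions x 5 visual words; rows 0-1 resemble "sky", rows 2-3 "aeroplane"
X = np.array([[9, 8, 0, 1, 0],
              [8, 9, 1, 0, 0],
              [0, 1, 7, 8, 9],
              [1, 0, 8, 9, 8]], dtype=float)
print(latent_categories(X).round(2))
```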

  3. Extending SME to Handle Large-Scale Cognitive Modeling.

    PubMed

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n² log(n)); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.

  4. Large-scale protein/antibody patterning with limiting unspecific adsorption

    NASA Astrophysics Data System (ADS)

    Fedorenko, Viktoriia; Bechelany, Mikhael; Janot, Jean-Marc; Smyntyna, Valentyn; Balme, Sebastien

    2017-10-01

    A simple synthetic route based on nanosphere lithography has been developed in order to design a large-scale nanoarray for specific control of protein anchoring. This technique, based on two-dimensional (2D) colloidal crystals composed of polystyrene (PS) spheres, allows the easy and inexpensive fabrication of large arrays (up to several centimeters). A silicon wafer coated with a thin adhesion layer of chromium (15 nm) and a layer of gold (50 nm) is used as the substrate. PS spheres are deposited on the gold surface using the floating-transfer technique. The PS spheres were then functionalized with PEG-biotin, and the defects were coated with a self-assembled monolayer (SAM) of PEG to prevent unspecific adsorption. Using epifluorescence microscopy, we show that after immersion of the sample in a target protein (avidin and anti-avidin) solution, the proteins are specifically located on the polystyrene spheres. These results are thus meaningful for the exploration of devices based on a large-scale nanoarray of PS spheres and can be used for the detection of target proteins or simply to pattern a surface with specific proteins.

  5. Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets.

    PubMed

    Lam, Max; Trampush, Joey W; Yu, Jin; Knowles, Emma; Davies, Gail; Liewald, David C; Starr, John M; Djurovic, Srdjan; Melle, Ingrid; Sundet, Kjetil; Christoforou, Andrea; Reinvang, Ivar; DeRosse, Pamela; Lundervold, Astri J; Steen, Vidar M; Espeseth, Thomas; Räikkönen, Katri; Widen, Elisabeth; Palotie, Aarno; Eriksson, Johan G; Giegling, Ina; Konte, Bettina; Roussos, Panos; Giakoumaki, Stella; Burdick, Katherine E; Payton, Antony; Ollier, William; Chiba-Falek, Ornit; Attix, Deborah K; Need, Anna C; Cirulli, Elizabeth T; Voineskos, Aristotle N; Stefanis, Nikos C; Avramopoulos, Dimitrios; Hatzimanolis, Alex; Arking, Dan E; Smyrnis, Nikolaos; Bilder, Robert M; Freimer, Nelson A; Cannon, Tyrone D; London, Edythe; Poldrack, Russell A; Sabb, Fred W; Congdon, Eliza; Conley, Emily Drabant; Scult, Matthew A; Dickinson, Dwight; Straub, Richard E; Donohoe, Gary; Morris, Derek; Corvin, Aiden; Gill, Michael; Hariri, Ahmad R; Weinberger, Daniel R; Pendleton, Neil; Bitsios, Panos; Rujescu, Dan; Lahti, Jari; Le Hellard, Stephanie; Keller, Matthew C; Andreassen, Ole A; Deary, Ian J; Glahn, David C; Malhotra, Anil K; Lencz, Todd

    2017-11-28

    Here, we present a large (n = 107,207) genome-wide association study (GWAS) of general cognitive ability ("g"), further enhanced by combining results with a large-scale GWAS of educational attainment. We identified 70 independent genomic loci associated with general cognitive ability. Results showed significant enrichment for genes causing Mendelian disorders with an intellectual disability phenotype. Competitive pathway analysis implicated the biological processes of neurogenesis and synaptic regulation, as well as the gene targets of two pharmacologic agents: cinnarizine, a T-type calcium channel blocker, and LY97241, a potassium channel inhibitor. Transcriptome-wide and epigenome-wide analysis revealed that the implicated loci were enriched for genes expressed across all brain regions (most strongly in the cerebellum). Enrichment was exclusive to genes expressed in neurons but not oligodendrocytes or astrocytes. Finally, we report genetic correlations between cognitive ability and disparate phenotypes including psychiatric disorders, several autoimmune disorders, longevity, and maternal age at first birth. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Evolution of Scaling Emergence in Large-Scale Spatial Epidemic Spreading

    PubMed Central

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Background Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of the Zipf's law and the Heaps' law motivates different understandings on the dependence between these two scalings, which has still hardly been clarified. Methodology/Principal Findings In this article, we observe an evolution process of the scalings: the Zipf's law and the Heaps' law are naturally shaped to coexist at the initial time, while the crossover comes with the emergence of their inconsistency at the larger time before reaching a stable state, where the Heaps' law still exists with the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. Conclusions/Significance The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating the coexistence of the Zipf's law and the Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease. PMID:21747932

  7. Zebrafish Whole-Adult-Organism Chemogenomics for Large-Scale Predictive and Discovery Chemical Biology

    PubMed Central

    Lam, Siew Hong; Mathavan, Sinnakarupan; Tong, Yan; Li, Haixia; Karuturi, R. Krishna Murthy; Wu, Yilian; Vega, Vinsensius B.; Liu, Edison T.; Gong, Zhiyuan

    2008-01-01

    The ability to perform large-scale, expression-based chemogenomics on whole adult organisms, as in invertebrate models (worm and fly), is highly desirable for a vertebrate model but its feasibility and potential has not been demonstrated. We performed expression-based chemogenomics on the whole adult organism of a vertebrate model, the zebrafish, and demonstrated its potential for large-scale predictive and discovery chemical biology. Focusing on two classes of compounds with wide implications to human health, polycyclic (halogenated) aromatic hydrocarbons [P(H)AHs] and estrogenic compounds (ECs), we generated robust prediction models that can discriminate compounds of the same class from those of different classes in two large independent experiments. The robust expression signatures led to the identification of biomarkers for potent aryl hydrocarbon receptor (AHR) and estrogen receptor (ER) agonists, respectively, and were validated in multiple targeted tissues. Knowledge-based data mining of human homologs of zebrafish genes revealed highly conserved chemical-induced biological responses/effects, health risks, and novel biological insights associated with AHR and ER that could be inferred to humans. Thus, our study presents an effective, high-throughput strategy of capturing molecular snapshots of chemical-induced biological states of a whole adult vertebrate that provides information on biomarkers of effects, deregulated signaling pathways, and possible affected biological functions, perturbed physiological systems, and increased health risks. These findings place zebrafish in a strategic position to bridge the wide gap between cell-based and rodent models in chemogenomics research and applications, especially in preclinical drug discovery and toxicology. PMID:18618001

  8. Human target acquisition performance

    NASA Astrophysics Data System (ADS)

    Teaney, Brian P.; Du Bosq, Todd W.; Reynolds, Joseph P.; Thompson, Roger; Aghera, Sameer; Moyer, Steven K.; Flug, Eric; Espinola, Richard; Hixson, Jonathan

    2012-06-01

    The battlefield has shifted from armored vehicles to armed insurgents. Target acquisition (identification, recognition, and detection) range performance involving humans as targets is vital for modern warfare. The acquisition and neutralization of armed insurgents while at the same time minimizing fratricide and civilian casualties is a mounting concern. U.S. Army RDECOM CERDEC NVESD has conducted many experiments involving human targets for infrared and reflective band sensors. The target sets include human activities, hand-held objects, uniforms & armament, and other tactically relevant targets. This paper will define a set of standard task difficulty values for identification and recognition associated with human target acquisition performance.

  9. Large Scale Generation and Characterization of Anti-Human CD34 Monoclonal Antibody in Ascitic Fluid of Balb/c Mice

    PubMed Central

    Aghebati Maleki, Leili; Majidi, Jafar; Baradaran, Behzad; Abdolalizadeh, Jalal; Kazemi, Tohid; Aghebati Maleki, Ali; Sineh sepehr, Koushan

    2013-01-01

    Purpose: Monoclonal antibodies are now an essential tool of biomedical research and are of great commercial and medical value. The purpose of this study was to produce large quantities of a monoclonal antibody against CD34 for diagnostic application in leukemia and for purification of human hematopoietic stem/progenitor cells. Methods: For large-scale production of the monoclonal antibody, hybridoma cells producing monoclonal antibody against human CD34 were injected into the peritoneum of Balb/c mice previously primed with 0.5 ml Pristane. 5 ml of ascitic fluid was harvested from each mouse in two collections. The mAb titer was evaluated by ELISA. The ascitic fluid was examined for antibody class and subclass using an ELISA mouse mAb isotyping kit. The mAb was purified from ascitic fluid by affinity chromatography on Protein A-Sepharose. Purity of the monoclonal antibody was monitored by SDS-PAGE, and the purified monoclonal antibody was conjugated with FITC. Results: Monoclonal antibodies with high specificity and sensitivity against human CD34 were prepared by hybridoma technology. The antibody subclass was IgG1 with a kappa light chain. Conclusion: The conjugated monoclonal antibody could be a useful tool for isolation, purification and characterization of human hematopoietic stem cells. PMID:24312838

  10. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    NASA Astrophysics Data System (ADS)

    Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.

    2013-12-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10⁶ cores and sustained performance over ~2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios.

  11. River Food Web Response to Large-Scale Riparian Zone Manipulations

    PubMed Central

    Wootton, J. Timothy

    2012-01-01

    Conservation programs often focus on select species, leading to management plans based on the autecology of the focal species, but multiple ecosystem components can be affected both by the environmental factors impacting, and the management targeting, focal species. These broader effects can have indirect impacts on target species through the web of interactions within ecosystems. For example, human activity can strongly alter riparian vegetation, potentially impacting both economically-important salmonids and their associated river food web. In an Olympic Peninsula river, Washington state, USA, replicated large-scale riparian vegetation manipulations implemented with the long-term (>40 yr) goal of improving salmon habitat did not affect water temperature, nutrient limitation or habitat characteristics, but reduced canopy cover, causing reduced energy input via leaf litter, increased incident solar radiation (UV and PAR) and increased algal production compared to controls. In response, benthic algae, most insect taxa, and juvenile salmonids increased in manipulated areas. Stable isotope analysis revealed a predominant contribution of algal-derived energy to salmonid diets in manipulated reaches. The experiment demonstrates that riparian management targeting salmonids strongly affects river food webs via changes in the energy base, illustrates how species-based management strategies can have unanticipated indirect effects on the target species via the associated food web, and supports ecosystem-based management approaches for restoring depleted salmonid stocks. PMID:23284786

  12. Large Scale Spectral Line Mapping of Galactic Regions with CCAT-Prime

    NASA Astrophysics Data System (ADS)

    Simon, Robert

    2018-01-01

    CCAT-prime is a 6-m submillimeter telescope that is being built on the top of Cerro Chajnantor (5600 m altitude) overlooking the ALMA plateau in the Atacama Desert. Its novel Crossed-Dragone design enables a large field of view without blockage and is thus particularly well suited for large scale surveys in the continuum and spectral lines targeting important questions ranging from star formation in the Milky Way to cosmology. On this poster, we focus on the large scale mapping opportunities in important spectral cooling lines of the interstellar medium opened up by CCAT-prime and the Cologne heterodyne instrument CHAI.

  13. Enhancements for a Dynamic Data Warehousing and Mining System for Large-Scale Human Social Cultural Behavioral (HSBC) Data

    DTIC Science & Technology

    2016-09-26

    Intelligent Automation Incorporated. Enhancements for a Dynamic Data Warehousing and Mining System for Large-Scale Human Social Cultural Behavioral (HSBC) Data (contract N00014-16-P-3014). Representative Media Gallery View (figure caption). We apply Scraawl's NER algorithm to the text associated with each YouTube post, which classifies the named entities into…

  14. Large-scale Proteomics Analysis of the Human Kinome

    PubMed Central

    Oppermann, Felix S.; Gnad, Florian; Olsen, Jesper V.; Hornberger, Renate; Greff, Zoltán; Kéri, György; Mann, Matthias; Daub, Henrik

    2009-01-01

    Members of the human protein kinase superfamily are the major regulatory enzymes involved in the activity control of eukaryotic signal transduction pathways. As protein kinases reside at the nodes of phosphorylation-based signal transmission, comprehensive analysis of their cellular expression and site-specific phosphorylation can provide important insights into the architecture and functionality of signaling networks. However, in global proteome studies, low cellular abundance of protein kinases often results in rather minor peptide species that are occluded by a vast excess of peptides from other cellular proteins. These analytical limitations create a rationale for kinome-wide enrichment of protein kinases prior to mass spectrometry analysis. Here, we employed stable isotope labeling by amino acids in cell culture (SILAC) to compare the binding characteristics of three kinase-selective affinity resins by quantitative mass spectrometry. The evaluated pre-fractionation tools possessed pyrido[2,3-d]pyrimidine-based kinase inhibitors as immobilized capture ligands and retained considerable subsets of the human kinome. Based on these results, an affinity resin displaying the broadly selective kinase ligand VI16832 was employed to quantify the relative expression of more than 170 protein kinases across three different, SILAC-encoded cancer cell lines. These experiments demonstrated the feasibility of comparative kinome profiling in a compact experimental format. Interestingly, we found high levels of cytoplasmic and low levels of receptor tyrosine kinases in MV4-11 leukemia cells compared with the adherent cancer lines HCT116 and MDA-MB-435S. The VI16832 resin was further exploited to pre-fractionate kinases for targeted phosphoproteomics analysis, which revealed about 1200 distinct phosphorylation sites on more than 200 protein kinases. This hitherto largest survey of site-specific phosphorylation across the kinome significantly expands the basis for functional…

  15. Towards human-computer synergetic analysis of large-scale biological data.

    PubMed

    Singh, Rahul; Yang, Hui; Dalziel, Ben; Asarnow, Daniel; Murad, William; Foote, David; Gormley, Matthew; Stillman, Jonathan; Fisher, Susan

    2013-01-01

    Advances in technology have led to the generation of massive amounts of complex and multifarious biological data in areas ranging from genomics to structural biology. The volume and complexity of such data lead to significant challenges in terms of its analysis, especially when one seeks to generate hypotheses or explore the underlying biological processes. At the state of the art, the application of automated algorithms followed by perusal and analysis of the results by an expert continues to be the predominant paradigm for analyzing biological data. This paradigm works well in many problem domains. However, it is also limiting, since domain experts are forced to apply their instincts and expertise such as contextual reasoning, hypothesis formulation, and exploratory analysis after the algorithm has produced its results. In many areas where the organization and interaction of the biological processes is poorly understood and exploratory analysis is crucial, what is needed is to integrate domain expertise during the data analysis process and use it to drive the analysis itself. In the context of this background, the results presented in this paper describe advancements along two methodological directions. First, given the context of biological data, we utilize and extend a design approach called experiential computing from multimedia information system design. This paradigm combines information visualization and human-computer interaction with algorithms for exploratory analysis of large-scale and complex data. In the proposed approach, emphasis is laid on: (1) allowing users to directly visualize, interact, experience, and explore the data through interoperable visualization-based and algorithmic components, (2) supporting unified query and presentation spaces to facilitate experimentation and exploration, (3) providing external contextual information by assimilating relevant supplementary data, and (4) encouraging user-directed information…

  16. Towards human-computer synergetic analysis of large-scale biological data

    PubMed Central

    2013-01-01

    Background: Advances in technology have led to the generation of massive amounts of complex and multifarious biological data in areas ranging from genomics to structural biology. The volume and complexity of such data lead to significant challenges in terms of its analysis, especially when one seeks to generate hypotheses or explore the underlying biological processes. At the state of the art, the application of automated algorithms followed by perusal and analysis of the results by an expert continues to be the predominant paradigm for analyzing biological data. This paradigm works well in many problem domains. However, it is also limiting, since domain experts are forced to apply their instincts and expertise such as contextual reasoning, hypothesis formulation, and exploratory analysis after the algorithm has produced its results. In many areas where the organization and interaction of the biological processes is poorly understood and exploratory analysis is crucial, what is needed is to integrate domain expertise during the data analysis process and use it to drive the analysis itself. Results: In the context of this background, the results presented in this paper describe advancements along two methodological directions. First, given the context of biological data, we utilize and extend a design approach called experiential computing from multimedia information system design. This paradigm combines information visualization and human-computer interaction with algorithms for exploratory analysis of large-scale and complex data. In the proposed approach, emphasis is laid on: (1) allowing users to directly visualize, interact, experience, and explore the data through interoperable visualization-based and algorithmic components, (2) supporting unified query and presentation spaces to facilitate experimentation and exploration, (3) providing external contextual information by assimilating relevant supplementary data, and (4) encouraging user…

  17. Study of multi-functional precision optical measuring system for large scale equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly offer a single function and require repeated station moves, among other shortcomings. The laser tracker needs to work with a cooperative target, and it can hardly meet the requirements of measurement in extreme environments. The total station is mainly used for outdoor surveying and mapping; it can hardly achieve the accuracy demanded in industrial measurement. The photogrammetry system can achieve multi-point measurement over a wide area, but its measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can work both by scanning the measurement path and by tracking a cooperative target. The system is based on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system, and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures high-accuracy measurement, and the two-dimensional angle measurement module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure quality and performance throughout the manufacturing process and improve the manufacturing ability of large-scale, high-end equipment.
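
    For context on the measurement principle: a station that measures an absolute distance plus horizontal and vertical angles fixes a target point through a polar-to-Cartesian conversion. A minimal sketch of that geometry follows; the symbols and frame are generic, not taken from the paper:

```python
import math

def polar_to_cartesian(d, azimuth, elevation):
    """Convert one station's measurements to 3D coordinates.

    d         : absolute distance to the target (m)
    azimuth   : horizontal angle (rad)
    elevation : vertical angle above the horizontal plane (rad)
    """
    x = d * math.cos(elevation) * math.cos(azimuth)
    y = d * math.cos(elevation) * math.sin(azimuth)
    z = d * math.sin(elevation)
    return x, y, z

# e.g. a target 10 m away, at 30 deg azimuth and 5 deg elevation
print(polar_to_cartesian(10.0, math.radians(30), math.radians(5)))
```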

  18. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies, which are then targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to successfully identify the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  19. Punishment sustains large-scale cooperation in prestate warfare

    PubMed Central

    Mathew, Sarah; Boyd, Robert

    2011-01-01

    Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors and participants are not kin or day-to-day interactants. Warriors incur substantial risk of death and produce collective benefits. Cowardice and desertions occur, and are punished by community-imposed sanctions, including collective corporal punishment and fines. Furthermore, Turkana norms governing warfare benefit the ethnolinguistic group, a population of a half-million people, at the expense of smaller social groupings. These results challenge current views that punishment is unimportant in small-scale societies and that human cooperation evolved in small groups of kin and familiar individuals. Instead, these results suggest that cooperation at the larger scale of ethnolinguistic units enforced by third-party sanctions could have a deep evolutionary history in the human species. PMID:21670285

  20. Energy transfers in large-scale and small-scale dynamos

    NASA Astrophysics Data System (ADS)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  1. Developing procedures for the large-scale purification of human serum butyrylcholinesterase.

    PubMed

    Saxena, Ashima; Luo, Chunyuan; Doctor, Bhupendra P

    2008-10-01

    Human serum butyrylcholinesterase (Hu BChE) is the most viable candidate for the prophylactic treatment of organophosphate poisoning. A dose of 200 mg/70 kg is predicted to protect humans against 2× LD50 of soman. Therefore, the aim of this study was to develop procedures for the purification of gram quantities of this enzyme from outdated human plasma or Cohn Fraction IV-4. The purification of Hu BChE was accomplished by batch adsorption on procainamide-Sepharose-CL-4B affinity gel followed by ion-exchange chromatography on a DEAE-Sepharose column. For the purification of enzyme from Cohn Fraction IV-4, it was resuspended in 25 mM sodium phosphate buffer, pH 8.0, and fat was removed by decantation prior to batch adsorption on procainamide-Sepharose gel. In both cases, the procainamide gel was thoroughly washed with 25 mM sodium phosphate buffer, pH 8.0, containing 0.05 M NaCl, and the enzyme was eluted with the same buffer containing 0.1 M procainamide. The enzyme was dialyzed and the pH was adjusted to 4.0 before loading on the DEAE column equilibrated in sodium acetate buffer, pH 4.0. The column was thoroughly washed with 25 mM sodium phosphate buffer, pH 8.0, containing 0.05 M NaCl before elution with a gradient of 0.05-0.2 M NaCl in the same buffer. The purity of the enzyme following these steps ranged from 20% to 40%. The purity increased to >90% by chromatography on an analytical procainamide affinity column. Results show that Cohn Fraction IV-4 is a much better source than plasma for the large-scale isolation of purified Hu BChE.

  2. Large-scale image-based profiling of single-cell phenotypes in arrayed CRISPR-Cas9 gene perturbation screens.

    PubMed

    de Groot, Reinoud; Lüthi, Joel; Lindsay, Helen; Holtackers, René; Pelkmans, Lucas

    2018-01-23

    High-content imaging using automated microscopy and computer vision allows multivariate profiling of single-cell phenotypes. Here, we present methods for the application of the CRISPR-Cas9 system in large-scale, image-based, gene perturbation experiments. We show that CRISPR-Cas9-mediated gene perturbation can be achieved in human tissue culture cells in a timeframe that is compatible with image-based phenotyping. We developed a pipeline to construct a large-scale arrayed library of 2,281 sequence-verified CRISPR-Cas9 targeting plasmids and profiled this library for genes affecting cellular morphology and the subcellular localization of components of the nuclear pore complex (NPC). We conceived a machine-learning method that harnesses genetic heterogeneity to score gene perturbations and identify phenotypically perturbed cells for in-depth characterization of gene perturbation effects. This approach enables genome-scale image-based multivariate gene perturbation profiling using CRISPR-Cas9. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  3. Comparison of human and algorithmic target detection in passive infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.; Hutchinson, Meredith

    2003-09-01

    We have designed an experiment that compares the performance of human observers and a scale-insensitive target detection algorithm that uses pixel-level information for the detection of ground targets in passive infrared imagery. The test database contains targets near clutter whose detectability ranged from easy to very difficult. Results indicate that human observers detect more "easy-to-detect" targets, and with far fewer false alarms, than the algorithm. For "difficult-to-detect" targets, human and algorithm detection rates are considerably degraded, and algorithm false alarms become excessive. Analysis of detections as a function of observer confidence shows that algorithm confidence attribution does not correspond to human attribution and does not adequately correlate with correct detections. The best target detection score for any human observer was 84%, as compared to 55% for the algorithm at the same false alarm rate. At 81%, the maximum detection score for the algorithm, the same human observer had 6 false alarms per frame as compared to 29 for the algorithm. Detector ROC curves and observer-confidence analysis benchmark the algorithm and provide insights into algorithm deficiencies and possible paths to improvement.
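
    The ROC benchmarking described here amounts to sweeping a threshold over detection confidences and recording detection rate against false-alarm rate. A minimal sketch with invented scores, not the study's data:

```python
import numpy as np

def roc_points(scores, labels):
    """(false-alarm rate, detection rate) pairs from confidence scores.

    scores : (N,) detection confidences from an algorithm or observer
    labels : (N,) 1 for true targets, 0 for false candidates
    """
    order = np.argsort(-np.asarray(scores))   # descending confidence
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)                    # true detections so far
    fp = np.cumsum(1 - labels)                # false alarms so far
    return fp / max(fp[-1], 1), tp / max(tp[-1], 1)

# toy data: high confidence should mostly correspond to true targets
far, pd = roc_points([0.9, 0.8, 0.7, 0.6, 0.4, 0.3],
                     [1,   1,   0,   1,   0,   0])
print(list(zip(far, pd)))
```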

  4. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Although cholera has ravaged the continents through seven global pandemics in past centuries, the seasonal and interannual variability of its outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes, droughts and floods, and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different cholera transmission mechanisms, pre- and post-monsoon, related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.
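
    The multivariate regression forecast proposed at the end can be pictured as, for example, a logistic model mapping seasonal hydroclimatic anomalies to outbreak probability. The sketch below is purely illustrative, with synthetic predictors and outcomes, and is not the authors' model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
# synthetic seasonal predictors: e.g. streamflow anomaly, drought index
X = np.column_stack([rng.normal(0, 1, n), rng.normal(0, 1, n)])
# synthetic outcomes: both hydroclimatic extremes raise outbreak odds
p = 1.0 / (1.0 + np.exp(-(1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.5)))
y = rng.binomial(1, p)

model = LogisticRegression().fit(X, y)
# outbreak probability for next season, given current anomalies
print(model.predict_proba([[1.5, 0.3]])[0, 1])
```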

  5. XLID-causing mutations and associated genes challenged in light of data from large-scale human exome sequencing.

    PubMed

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-08-08

    Because of the unbalanced sex ratio (1.3-1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that a similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
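
    The core of the reassessment can be pictured as a frequency filter over control chromosomes: genes whose supposedly disease-causing (e.g., truncating) variants are too common in the general population become suspect. A toy sketch with hypothetical gene names, invented counts, and an illustrative cutoff (the study's actual criteria were more nuanced):

```python
# toy counts of truncating variants observed in a control cohort;
# gene names and counts are hypothetical
N_CHROMOSOMES = 10_563            # control X chromosomes, as in NHLBI ESP
truncating_counts = {"GENE_A": 0, "GENE_B": 12, "GENE_C": 1}

CUTOFF = 1e-3                     # illustrative frequency threshold

for gene, count in truncating_counts.items():
    freq = count / N_CHROMOSOMES
    if freq > CUTOFF:
        print(f"{gene}: truncating-variant frequency {freq:.2e} is "
              f"suspiciously high for a monogenic XLID gene")
```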

  6. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    NASA Astrophysics Data System (ADS)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is presented. The case study proved that large-scale models are an appropriate data source for visibility analyses on the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
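
    Boolean visibility of this kind reduces, cell by cell, to a line-of-sight test along a terrain profile: the target is visible unless some intermediate sample rises above the sight line. A minimal sketch; the function and sampling are illustrative, not from the article:

```python
def line_of_sight(profile, observer_h=1.7, target_h=0.0):
    """Boolean visibility along a terrain profile.

    profile : list of (distance, elevation) samples from observer
              to target, ordered by distance
    Returns True if the target is visible from the observer.
    """
    d0, z0 = profile[0]
    z0 += observer_h                        # observer eye height
    dt, zt = profile[-1]
    zt += target_h
    target_slope = (zt - z0) / (dt - d0)    # slope of the sight line
    for d, z in profile[1:-1]:
        # any intermediate sample above the sight line blocks the view
        if (z - z0) / (d - d0) > target_slope:
            return False
    return True

# flat ground with a 5 m bump halfway blocks a ground-level target
profile = [(0, 100), (50, 105), (100, 100)]
print(line_of_sight(profile))   # False
```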

  7. Space and time scales in human-landscape systems.

    PubMed

    Kondolf, G Mathias; Podolak, Kristen

    2014-01-01

    Exploring spatial and temporal scales provides a way to understand human alteration of landscape processes and human responses to these processes. We address three topics relevant to human-landscape systems: (1) scales of human impacts on geomorphic processes, (2) spatial and temporal scales in river restoration, and (3) time scales of natural disasters and behavioral and institutional responses. Studies showing dramatic recent change in sediment yields from uplands to the ocean via rivers illustrate the increasingly vast spatial extent and quick rate of human landscape change in the last two millennia, but especially in the second half of the twentieth century. Recent river restoration efforts are typically small in spatial and temporal scale compared to the historical human changes to ecosystem processes, but the cumulative effectiveness of multiple small restoration projects in achieving large ecosystem goals has yet to be demonstrated. The mismatch between infrequent natural disasters and individual risk perception, media coverage, and institutional response results in unpreparedness and unsustainable land use and building practices.

  8. Designing large-scale conservation corridors for pattern and process.

    PubMed

    Rouget, Mathieu; Cowling, Richard M; Lombard, Amanda T; Knight, Andrew T; Kerley, Graham I H

    2006-04-01

    A major challenge for conservation assessments is to identify priority areas that incorporate biological patterns and processes. Because large-scale processes are mostly oriented along environmental gradients, we propose to accommodate them by designing regional-scale corridors to capture these gradients. Based on systematic conservation planning principles such as representation and persistence, we identified large tracts of untransformed land (i.e., conservation corridors) for conservation that would achieve biodiversity targets for pattern and process in the Subtropical Thicket Biome of South Africa. We combined least-cost path analysis with a target-driven algorithm to identify the best option for capturing key environmental gradients while considering biodiversity targets and conservation opportunities and constraints. We identified seven conservation corridors on the basis of subtropical thicket representation, habitat transformation and degradation, wildlife suitability, irreplaceability of vegetation types, protected area networks, and future land-use pressures. These conservation corridors covered 21.1% of the planning region (ranging from 600 to 5200 km²) and successfully achieved targets for biological processes and to a lesser extent for vegetation types. The corridors we identified are intended to promote the persistence of ecological processes (gradients and fixed processes) and fulfill half of the biodiversity pattern target. We compared the conservation corridors with a simplified corridor design consisting of a fixed-width buffer along major rivers. Conservation corridors outperformed river buffers in seven out of eight criteria. Our corridor design can provide a tool for quantifying trade-offs between various criteria (biodiversity pattern and process, implementation constraints and opportunities). A land-use management model was developed to facilitate implementation of conservation actions within these corridors.
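
    Least-cost path analysis of this kind is, at bottom, Dijkstra's algorithm over a resistance raster whose cell costs encode habitat transformation, land-use pressure, and similar constraints. A minimal sketch with an invented cost grid:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a 2D cost raster; entering a cell pays its cost
    (the start cell's own cost is also counted).

    cost  : 2D list of per-cell traversal costs (resistance surface)
    start, goal : (row, col) tuples
    Returns the minimal accumulated cost from start to goal.
    """
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        c, (r, k) = heapq.heappop(heap)
        if (r, k) == goal:
            return c
        if c > best.get((r, k), float("inf")):
            continue                       # stale heap entry
        for dr, dk in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nk = r + dr, k + dk
            if 0 <= nr < rows and 0 <= nk < cols:
                nc = c + cost[nr][nk]
                if nc < best.get((nr, nk), float("inf")):
                    best[(nr, nk)] = nc
                    heapq.heappush(heap, (nc, (nr, nk)))
    return float("inf")

# transformed land (high cost) vs. an untransformed corridor (low cost)
raster = [[1, 9, 1],
          [1, 9, 1],
          [1, 1, 1]]
print(least_cost_path(raster, (0, 0), (0, 2)))   # 7: routes around the 9s
```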

  9. Large-scale purification and characterization of recombinant human stem cell factor in Escherichia coli.

    PubMed

    Chen, Liang-Hua; Cai, Feng; Zhang, Dan-Ju; Zhang, Li; Zhu, Peng; Gao, Shun

    2017-07-01

    The pharmacological importance of recombinant human stem cell factor (rhSCF) has increased the demand to establish effective, large-scale production and purification processes. A good source of bioactive recombinant protein with the capability of being scaled up without losing activity has always been a challenge. The objectives of the study were the rapid and efficient pilot-scale expression and purification of rhSCF. The gene encoding stem cell factor (SCF) was cloned into pBV220 and transformed into Escherichia coli. The recombinant SCF was expressed and isolated using a procedure consisting of isolation of inclusion bodies (IBs), denaturation, and refolding, followed by chromatographic steps toward purification. The yield of rhSCF reached 835.6 g/20 L, and the expression levels of rhSCF were about 33.9% of the total E. coli protein content. rhSCF was purified by isolation of IBs, denaturation, and refolding, followed by SP-Sepharose chromatography, Source 30 reversed-phase chromatography, and Q-Sepharose chromatography. This procedure was developed to isolate 5.5 g of rhSCF (99.5% purity) with a specific activity of 0.96 × 10⁶ IU/mg, endotoxin levels of 1.0 EU/mg, and bacterial DNA at 10 ng/mg. Pilot-scale fermentations and purifications were set up for the production of rhSCF that can be upscaled for industry. © 2016 International Union of Biochemistry and Molecular Biology, Inc.

  10. Reflections on the Increasing Relevance of Large-Scale Professional Development

    ERIC Educational Resources Information Center

    Krainer, Konrad

    2015-01-01

    This paper focuses on commonalities and differences of three approaches to large-scale professional development (PD) in mathematics education, based on two studies from Germany and one from the United States of America. All three initiatives break new ground in improving PD targeted at educating "multipliers", and in all three cases…

  11. Mitochondrial Targets for Pharmacological Intervention in Human Disease

    PubMed Central

    2015-01-01

    Over the past several years, mitochondrial dysfunction has been linked to an increasing number of human illnesses, making mitochondrial proteins (MPs) an ever more appealing target for therapeutic intervention. With 20% of the mitochondrial proteome (312 of an estimated 1500 MPs) having known interactions with small molecules, MPs appear to be highly targetable. Yet, despite these targeted proteins functioning in a range of biological processes (including induction of apoptosis, calcium homeostasis, and metabolism), very few of the compounds targeting MPs find clinical use. Recent work has greatly expanded the number of proteins known to localize to the mitochondria and has generated a considerable increase in MP 3D structures available in public databases, allowing experimental screening and in silico prediction of mitochondrial drug targets on an unprecedented scale. Here, we summarize the current literature on clinically active drugs that target MPs, with a focus on how existing drug targets are distributed across biochemical pathways and organelle substructures. Also, we examine current strategies for mitochondrial drug discovery, focusing on genetic, proteomic, and chemogenomic assays, and relevant model systems. As cell models and screening techniques improve, MPs appear poised to emerge as relevant targets for a wide range of complex human diseases, an eventuality that can be expedited through systematic analysis of MP function. PMID:25367773

  12. Cross-indexing of binary SIFT codes for large-scale image search.

    PubMed

    Liu, Zhen; Li, Houqiang; Zhang, Liyan; Zhou, Wengang; Tian, Qi

    2014-05-01

    In recent years, there has been growing interest in mapping visual features into compact binary codes for applications on large-scale image collections. Encoding high-dimensional data as compact binary codes reduces the memory cost of storage. Besides, it benefits computational efficiency, since similarity can be efficiently measured by Hamming distance. In this paper, we propose a novel flexible scale invariant feature transform (SIFT) binarization (FSB) algorithm for large-scale image search. The FSB algorithm explores the magnitude patterns of the SIFT descriptor. It is unsupervised, and the generated binary codes are demonstrated to be distance-preserving. Besides, we propose a new search strategy to find target features based on cross-indexing between the binary SIFT space and the original SIFT space. We evaluate our approach on two publicly released data sets. Experiments on a large-scale partial-duplicate image retrieval system demonstrate the effectiveness and efficiency of the proposed algorithm.
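
    The efficiency claim rests on Hamming distance being cheap to compute on binary codes: one XOR plus a population count. A minimal sketch with invented codes:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary codes."""
    # int.bit_count() needs Python >= 3.10; use bin(a ^ b).count("1") otherwise
    return (a ^ b).bit_count()

# two invented 16-bit binarized descriptor codes
code_db    = 0b1011001110001111
code_query = 0b1011011110000111
print(hamming(code_db, code_query))   # 2
```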

  13. Large Scale Generation and Characterization of Anti-Human IgA Monoclonal Antibody in Ascitic Fluid of Balb/c Mice

    PubMed Central

    Ezzatifar, Fatemeh; Majidi, Jafar; Baradaran, Behzad; Aghebati Maleki, Leili; Abdolalizadeh, Jalal; Yousefi, Mehdi

    2015-01-01

    Purpose: Monoclonal antibodies are potentially powerful tools used in biomedical research, diagnosis, and treatment of infectious diseases and cancers. A monoclonal antibody against human IgA can be used as a diagnostic tool to detect infectious diseases. The aim of this study was to establish an appropriate protocol for large-scale production of mAbs against IgA. Methods: For large-scale production of the monoclonal antibody, hybridoma cells that produce monoclonal antibodies against human IgA were injected intraperitoneally into Balb/c mice that had previously been primed with 0.5 ml Pristane. After ten days, ascitic fluid was harvested from the peritoneum of each mouse. The ELISA method was carried out to evaluate the titer of the produced mAbs. The ascitic fluid was investigated in terms of class and subclass by a mouse mAb isotyping kit. The mAb was purified from the ascitic fluid by ion-exchange chromatography. The purity of the monoclonal antibody was confirmed by SDS-PAGE, and the purified monoclonal antibody was conjugated with HRP. Results: Monoclonal antibodies with high specificity and sensitivity against human IgA were prepared by hybridoma technology. The subclass of the antibody was IgG1 and its light chain was of the kappa type. Conclusion: This conjugated monoclonal antibody could be applied in designing ELISA kits for diagnosing infectious diseases such as toxoplasmosis and H. pylori infection. PMID:25789225

  14. Modelling large scale human activity in San Francisco

    NASA Astrophysics Data System (ADS)

    Gonzalez, Marta

    2010-03-01

    Diverse groups of people with a wide variety of schedules, activities and travel needs compose our cities nowadays. This represents a big challenge for modeling travel behaviors in urban environments; those models are of crucial interest for a wide variety of applications such as traffic forecasting, the spreading of viruses, or measuring human exposure to air pollutants. The traditional means of obtaining knowledge about travel behavior is limited to surveys on travel journeys. The information obtained is based on questionnaires that are usually costly to implement, have intrinsic limitations in covering large numbers of individuals, and suffer from reliability problems. Using mobile phone data, we explore the basic characteristics of a model of human travel: the distribution of agents is proportional to the population density of a given region, and each agent has a characteristic trajectory size that contains information on the frequency of visits to different locations. Additionally, we use a complementary data set from smart subway fare cards, which gives the exact time at which each passenger enters or exits a subway station and the coordinates of that station. This allows us to uncover the temporal aspects of mobility. Since we have the actual time and place of each individual's origin and destination, we can understand the temporal patterns of each visited location in further detail. Integrating the two data sets, we provide a dynamical model of human travel that incorporates the different aspects observed empirically.

  15. Fabrication of the HIAD Large-Scale Demonstration Assembly

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests, culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3 to 6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12 m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then…

  16. Design of an omnidirectional single-point photodetector for large-scale spatial coordinate measurement

    NASA Astrophysics Data System (ADS)

    Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei

    2017-10-01

    In high-precision, large-scale coordinate measurement, one commonly used approach to determining the coordinates of a target point is to exploit the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light-receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance and miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report a design of an OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg vertically and 360 deg horizontally. As the heart of our OSPD, the aspheric lens is designed with a geometric model and optimized using LightTools software, which enables the reflection of a wide-angle incident light beam onto the single-point photodiode. The performance of the home-made OSPD is characterized at working distances from 1 to 13 m and further analyzed using the developed geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensors, indoor global positioning systems, and optical wireless communication systems.

  17. Response of human populations to large-scale emergencies

    NASA Astrophysics Data System (ADS)

    Bagrow, James; Wang, Dashun; Barabási, Albert-László

    2010-03-01

    Until recently, little quantitative data regarding collective human behavior during dangerous events such as bombings and riots have been available, despite its importance for emergency management, safety and urban planning. Understanding how populations react to danger is critical for prediction, detection and intervention strategies. Using a large telecommunications dataset, we study for the first time the spatiotemporal, social and demographic response properties of people during several disasters, including a bombing, a city-wide power outage, and an earthquake. Call activity rapidly increases after an event and we find that, when faced with a truly life-threatening emergency, information rapidly propagates through a population's social network. Other events, such as sports games, do not exhibit this propagation.

  18. Large-scale generation of human iPSC-derived neural stem cells/early neural progenitor cells and their neuronal differentiation.

    PubMed

    D'Aiuto, Leonardo; Zhi, Yun; Kumar Das, Dhanjit; Wilcox, Madeleine R; Johnson, Jon W; McClain, Lora; MacDonald, Matthew L; Di Maio, Roberto; Schurdak, Mark E; Piazza, Paolo; Viggiano, Luigi; Sweet, Robert; Kinchington, Paul R; Bhattacharjee, Ayantika G; Yolken, Robert; Nimgaonkar, Vishwajit L

    2014-01-01

    Induced pluripotent stem cell (iPSC)-based technologies offer an unprecedented opportunity to perform high-throughput screening of novel drugs for neurological and neurodegenerative diseases. Such screenings require a robust and scalable method for generating large numbers of mature, differentiated neuronal cells. Currently available methods based on differentiation of embryoid bodies (EBs) or directed differentiation of adherent culture systems are either expensive or not scalable. We developed a protocol for large-scale generation of neural stem cells (NSCs)/early neural progenitor cells (eNPCs) and their differentiation into neurons. Our scalable protocol allows robust and cost-effective generation of NSCs/eNPCs from iPSCs. Following culture in neurobasal medium supplemented with B27 and BDNF, NSCs/eNPCs differentiate predominantly into vesicular glutamate transporter 1 (VGLUT1)-positive neurons. Targeted mass spectrometry analysis demonstrates that iPSC-derived neurons express ligand-gated channels and other synaptic proteins, and whole-cell patch-clamp experiments indicate that these channels are functional. The robust and cost-effective differentiation protocol described here for large-scale generation of NSCs/eNPCs and their differentiation into neurons paves the way for automated high-throughput screening of drugs for neurological and neurodegenerative diseases.

  19. European large-scale farmland investments and the land-water-energy-food nexus

    NASA Astrophysics Data System (ADS)

    Siciliano, Giuseppina; Rulli, Maria Cristina; D'Odorico, Paolo

    2017-12-01

    The escalating human demand for food, water, energy, fibres and minerals has resulted in increasing commercial pressures on land and water resources, which are partly reflected by the recent increase in transnational land investments. Studies have shown that many of the land-water issues associated with land acquisitions are directly related to the areas of energy and food production. This paper explores the land-water-energy-food nexus in relation to large-scale farmland investments pursued by investors from European countries. The analysis is based on a "resource assessment approach" which evaluates the linkages between land acquisitions for agricultural (including both energy and food production) and forestry purposes, and the availability of land and water in the target countries. To that end, the water appropriated by agricultural and forestry production is quantitatively assessed and its impact on water resource availability is analysed. The analysis is meant to provide useful information to investors from EU countries and policy makers on aspects of resource acquisition, scarcity, and access to promote responsible land investments in the target countries.

  20. A large-scale evaluation of computational protein function prediction

    PubMed Central

    Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state-of-the-art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today’s best protein function prediction algorithms significantly outperformed widely-used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650

  1. On the scaling of small-scale jet noise to large scale

    NASA Technical Reports Server (NTRS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-01-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm in diameter, could be used to correctly simulate the overall or perceived noise level (PNL) of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 × 10⁶ based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  2. Multiplex amplification of large sets of human exons.

    PubMed

    Porreca, Gregory J; Zhang, Kun; Li, Jin Billy; Xie, Bin; Austin, Derek; Vassallo, Sara L; LeProust, Emily M; Peck, Bill J; Emig, Christopher J; Dahl, Fredrik; Gao, Yuan; Church, George M; Shendure, Jay

    2007-11-01

    A new generation of technologies is poised to reduce DNA sequencing costs by several orders of magnitude. But our ability to fully leverage the power of these technologies is crippled by the absence of suitable 'front-end' methods for isolating complex subsets of a mammalian genome at a scale that matches the throughput at which these platforms will routinely operate. We show that targeting oligonucleotides released from programmable microarrays can be used to capture and amplify approximately 10,000 human exons in a single multiplex reaction. Additionally, we show integration of this protocol with ultra-high-throughput sequencing for targeted variation discovery. Although the multiplex capture reaction is highly specific, we found that nonuniform capture is a key issue that will need to be resolved by additional optimization. We anticipate that highly multiplexed methods for targeted amplification will enable the comprehensive resequencing of human exons at a fraction of the cost of whole-genome resequencing.

  3. Effects of Pre-Existing Target Structure on the Formation of Large Craters

    NASA Technical Reports Server (NTRS)

    Barnouin-Jha, O. S.; Cintala, M. J.; Crawford, D. A.

    2003-01-01

    The shapes of large-scale craters and the mechanics responsible for melt generation are influenced by broad and small-scale structures present in a target prior to impact. For example, well-developed systems of fractures often create craters that appear square in outline, good examples being Meteor Crater, AZ, and the square craters of 433 Eros. Pre-broken target material also affects melt generation. Kieffer has shown how the shock wave generated in Coconino sandstone at Meteor Crater created reverberations which, in combination with the natural target heterogeneity present, created peaks and troughs in pressure and compressed density as individual grains collided, producing a range of shock mineralogies and melts within neighboring samples. In this study, we further explore how pre-existing target structure influences various aspects of the cratering process. We combine experimental and numerical techniques to explore the connection between the scales of the impact-generated shock wave and the pre-existing target structure. We focus on the propagation of shock waves in coarse, granular media, emphasizing its consequences for excavation, crater growth, ejecta production, cratering efficiency, melt generation, and crater shape. As a baseline, we present a first series of results for idealized targets in which the particles are all identical in size and possess the same shock impedance. We will also present a few results whereby we increase the complexity of the target properties by varying the grain size, strength, impedance and frictional properties. In addition, we investigate the origin and implications of reverberations that are created by the presence of physical and chemical heterogeneity in a target.

  4. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  5. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    EPA Science Inventory

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  6. A Novel Method for Proximity Detection of Moving Targets Using a Large-Scale Planar Capacitive Sensor System

    PubMed Central

    Ye, Yong; Deng, Jiahao; Shen, Sanmin; Hou, Zhuo; Liu, Yuting

    2016-01-01

    A novel method for proximity detection of moving targets (with high dielectric constants) using a large-scale (each sensor is 31 cm × 19 cm) planar capacitive sensor system (PCSS) is proposed. The capacitive variation with distance is derived, and a pair of electrodes in a planar capacitive sensor unit (PCSU) with a spiral shape is found to have better performance in sensitivity distribution homogeneity and dynamic range than three other shapes (comb, rectangular, and circular). A driving excitation circuit with a Clapp oscillator is proposed, and a capacitance measuring circuit with a sensitivity of 0.21 Vp−p/pF is designed. The results of static and dynamic experiments demonstrate that the voltage curves of the static experiments are similar to those of the dynamic experiments; therefore, the static data can be used to simulate the dynamic curves. The dynamic range of proximity detection for three projectiles is up to 60 cm, and the results of the subsequent static experiments show that the PCSU with four neighboring units has the highest sensitivity (the sensitivities of other units are at least 4% lower); when the attack angle decreases, the intensity of the sensor signal increases. The proposed method leads to the design of a feasible moving-target detector with simple structure and low cost, which can be applied in interception systems. PMID:27196905

  7. Development of a large-scale isolation chamber system for the safe and humane care of medium-sized laboratory animals harboring infectious diseases

    PubMed Central

    Pan, Xin; Qi, Jian-cheng; Long, Ming; Liang, Hao; Chen, Xiao; Li, Han; Li, Guang-bo; Zheng, Hao

    2010-01-01

    The close phylogenetic relationship between humans and non-human primates makes non-human primates an irreplaceable model for the study of human infectious diseases. In this study, we describe the development of a large-scale automatic multi-functional isolation chamber for use with medium-sized laboratory animals carrying infectious diseases. The isolation chamber, including the transfer chain, disinfection chain, negative air pressure isolation system, animal welfare system, and the automated system, is designed to meet all biological safety standards. To create an internal chamber environment that is completely isolated from the exterior, variable frequency drive blowers are used in the air-intake and air-exhaust system, precisely controlling the filtered air flow and providing an air-barrier protection. A double door transfer port is used to transfer material between the interior of the isolation chamber and the outside. A peracetic acid sterilizer and its associated pipeline allow for complete disinfection of the isolation chamber. All of the isolation chamber parameters can be automatically controlled by a programmable computerized menu, allowing for work with different animals in different-sized cages depending on the research project. The large-scale multi-functional isolation chamber provides a useful and safe system for working with infectious medium-sized laboratory animals in high-level bio-safety laboratories. PMID:20872984

  8. On the scaling of small-scale jet noise to large scale

    NASA Technical Reports Server (NTRS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-01-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm in diameter, could be used to correctly simulate the overall or perceived noise level (PNL) of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 × 10⁶ based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  9. Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.

    PubMed

    Teki, Sundeep; Kumar, Sukhbinder; Griffiths, Timothy D

    2016-01-01

    The human auditory system is adept at detecting sound sources of interest from a complex mixture of several other simultaneous sounds. The ability to selectively attend to the speech of one speaker whilst ignoring other speakers and background noise is of vital biological significance; the capacity to make sense of complex 'auditory scenes' is significantly impaired in aging populations as well as those with hearing loss. We investigated this problem by designing a synthetic signal, termed the 'stochastic figure-ground' stimulus, that captures essential aspects of complex sounds in the natural environment. Previously, we showed that under controlled laboratory conditions, young listeners sampled from the university subject pool (n = 10) performed very well in detecting targets embedded in the stochastic figure-ground signal. Here, we presented a modified version of this cocktail party paradigm as a 'game' featured in a smartphone app (The Great Brain Experiment) and obtained data from a large population with diverse demographic profiles (n = 5148). Despite differences in paradigms and experimental settings, the observed target-detection performance by users of the app was robust and consistent with our previous results from the psychophysical study. Our results highlight the potential of smartphone apps for capturing robust large-scale auditory behavioral data from normal healthy volunteers, which can also be extended to study auditory deficits in clinical populations with hearing impairments and central auditory disorders.

  10. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than trade-off studies, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, so that it embodies the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  11. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    PubMed

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
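
    As a rough illustration of the modeling setup described above, the sketch below trains a stochastic gradient boosting regressor on synthetic variant features. The features, data, and hyperparameters are placeholders, not Envision's actual pipeline.

    ```python
    # Minimal sketch of a stochastic gradient boosting regressor for
    # quantitative variant-effect prediction. All inputs are synthetic
    # stand-ins for real variant features and effect measurements.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 8))  # stand-in variant features
    y = X @ rng.normal(size=8) + rng.normal(scale=0.3, size=2000)  # stand-in effects

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    # subsample < 1 makes the boosting "stochastic", as in the abstract
    model = GradientBoostingRegressor(subsample=0.8, random_state=0)
    model.fit(X_tr, y_tr)
    print("held-out R^2:", model.score(X_te, y_te))
    ```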

  12. Causal Inferences with Large Scale Assessment Data: Using a Validity Framework

    ERIC Educational Resources Information Center

    Rutkowski, David; Delandshere, Ginette

    2016-01-01

    To answer the calls for stronger evidence by the policy community, educational researchers and their associated organizations increasingly demand more studies that can yield causal inferences. International large scale assessments (ILSAs) have been targeted as rich data sources for causal research. It is in this context that we take up a…

  13. Scaling behavior of online human activity

    NASA Astrophysics Data System (ADS)

    Zhao, Zhi-Dan; Cai, Shi-Min; Huang, Junming; Fu, Yan; Zhou, Tao

    2012-11-01

    The rapid development of Internet technology enables humans to explore the web and record the traces of online activities. From the analysis of these large-scale data sets (i.e., traces), we can gain insight into the dynamic behavior of human activity. In this letter, the scaling behavior and complexity of human activity in e-commerce, such as the rating of music, books, and movies, are comprehensively investigated using the detrended fluctuation analysis technique and the multiscale entropy method. Firstly, the interevent time series of the rating behaviors for these three types of media show similar scaling properties, with exponents ranging from 0.53 to 0.58, which implies that the collective rating behaviors follow a process embodying self-similarity and long-range correlation. Meanwhile, by dividing the users into three groups based on their activity (i.e., ratings per unit time), we find that the scaling exponents of the interevent time series differ between the three groups; hence, these results suggest that stronger long-range correlations exist in these collective behaviors. Furthermore, their information complexities vary across the three groups. To explain the differences between the collective behaviors of the three groups, we study the dynamic behavior of human activity at the individual level, and find that the dynamic behaviors of a few users have extremely small scaling exponents associated with long-range anticorrelations. By comparing the interevent time distributions of four representative users, we find that bimodal distributions may bring forth these extraordinary scaling behaviors. These analyses of online human activity in e-commerce may not only provide insight into its dynamic behaviors but may also be applied to tasks of potential economic interest.
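
    The scaling exponents quoted above come from detrended fluctuation analysis. A minimal DFA implementation, assuming a plain NumPy environment and an arbitrary choice of window sizes, is sketched below; white noise should yield an exponent near 0.5, against which the reported 0.53-0.58 can be compared.

    ```python
    # Minimal detrended fluctuation analysis (DFA-1): integrate the
    # series, linearly detrend fixed-size windows, and read the scaling
    # exponent off the log-log slope of fluctuation vs. window size.
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        y = np.cumsum(x - np.mean(x))  # integrated profile
        flucts = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            f2 = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)  # linear detrending per window
                f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        # slope of log F(s) vs log s is the DFA exponent
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    # white noise should give an exponent near 0.5
    print(dfa_exponent(np.random.default_rng(1).normal(size=10000)))
    ```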

  14. Large Scale Metal Additive Techniques Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W

    2016-01-01

    In recent years additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with large-scale polymer. This paper is a review of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology, with a focus on expanding the geometric limits.

  15. Millennial-scale faunal record reveals differential resilience of European large mammals to human impacts across the Holocene.

    PubMed

    Crees, Jennifer J; Carbone, Chris; Sommer, Robert S; Benecke, Norbert; Turvey, Samuel T

    2016-03-30

    The use of short-term indicators for understanding patterns and processes of biodiversity loss can mask longer-term faunal responses to human pressures. We use an extensive database of approximately 18,700 mammalian zooarchaeological records for the last 11,700 years across Europe to reconstruct spatio-temporal dynamics of Holocene range change for 15 large-bodied mammal species. European mammals experienced protracted, non-congruent range losses, with significant declines starting in some species approximately 3000 years ago and continuing to the present, and with the timing, duration and magnitude of declines varying individually between species. Some European mammals became globally extinct during the Holocene, whereas others experienced limited or no significant range change. These findings demonstrate the relatively early onset of prehistoric human impacts on postglacial biodiversity, and mirror species-specific patterns of mammalian extinction during the Late Pleistocene. Herbivores experienced significantly greater declines than carnivores, revealing an important historical extinction filter that informs our understanding of relative resilience and vulnerability to human pressures for different taxa. We highlight the importance of large-scale, long-term datasets for understanding complex protracted extinction processes, although the dynamic pattern of progressive faunal depletion of European mammal assemblages across the Holocene challenges easy identification of 'static' past baselines to inform current-day environmental management and restoration. © 2016 The Author(s).

  16. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    PubMed Central

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be ≈50% α-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices. PMID:19581598

  17. Large-scale production and study of a synthetic G protein-coupled receptor: human olfactory receptor 17-4.

    PubMed

    Cook, Brian L; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-07-21

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be approximately 50% alpha-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices.

  18. Enabling Large-Scale Design, Synthesis and Validation of Small Molecule Protein-Protein Antagonists

    PubMed Central

    Koes, David; Khoury, Kareem; Huang, Yijun; Wang, Wei; Bista, Michal; Popowicz, Grzegorz M.; Wolf, Siglinde; Holak, Tad A.; Dömling, Alexander; Camacho, Carlos J.

    2012-01-01

    Although there is no shortage of potential drug targets, there are only a handful of known low-molecular-weight inhibitors of protein-protein interactions (PPIs). One problem is that current efforts are dominated by low-yield high-throughput screening, whose rigid framework is not suitable for the diverse chemotypes present in PPIs. Here, we developed a novel pharmacophore-based interactive screening technology that builds on the role that anchor residues, or deeply buried hot spots, play in PPIs, and redesigns these entry points with anchor-biased virtual multicomponent reactions, delivering tens of millions of readily synthesizable novel compounds. Application of this approach to the MDM2/p53 cancer target led to high hit rates, resulting in a large and diverse set of confirmed inhibitors, and co-crystal structures validate the designed compounds. Our unique open-access technology promises to expand chemical space and the exploration of the human interactome by leveraging in-house small-scale assays and user-friendly chemistry to rationally design ligands for PPIs with known structure. PMID:22427896

  19. A calibration method based on virtual large planar target for cameras with large FOV

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Han, Yangyang; Nie, Hong; Ou, Qiaofeng; Xiong, Bangshu

    2018-02-01

    In order to obtain high precision in camera calibration, a target should be large enough to cover the whole field of view (FOV). For cameras with a large FOV, using a small target seriously reduces calibration precision, yet a large target is difficult to manufacture, carry and deploy. To solve this problem, a calibration method based on a virtual large planar target (VLPT), virtually constructed from multiple small targets (STs), is proposed for cameras with a large FOV. In the VLPT-based calibration method, first, the positions and directions of the STs are changed several times to obtain a number of calibration images. Second, the VLPT of each calibration image is created by finding the virtual points corresponding to the feature points of the STs. Finally, the intrinsic and extrinsic parameters of the camera are calculated using the VLPTs. Experimental results show that the proposed method not only achieves calibration precision similar to that obtained with a large target, but also has good stability over the whole measurement area. Thus, the difficulty of accurately calibrating cameras with a large FOV can be effectively addressed by the proposed method, with good operability.
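
    For context, the sketch below shows the standard planar-target calibration pipeline in OpenCV on which a VLPT-style method builds. The VLPT construction itself (merging several small targets into one virtual large target) is not reproduced here, and the checkerboard geometry and file names are assumptions.

    ```python
    # Standard planar-target (checkerboard) camera calibration with
    # OpenCV. Pattern size and image names are illustrative assumptions.
    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)  # inner corners of a hypothetical checkerboard
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in glob.glob("calib_*.png"):  # assumed image naming
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # returns RMS reprojection error, intrinsics K, distortion, poses
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error:", rms)
    ```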

  20. Genome-Wide Identification of CBX2 Targets: Insights in the Human Sex Development Network

    PubMed Central

    Eid, Wassim; Opitz, Lennart

    2015-01-01

    Chromobox homolog 2 (CBX2) is a chromatin modifier that plays an important role in sexual development and its disorders (disorders of sex development [DSD]), yet the exact rank and function of human CBX2 in this pathway remains unclear. Here, we performed large-scale mapping and analysis of in vivo target loci of the protein CBX2 in Sertoli-like NT-2D1 cells, using the DNA adenine methyltransferase identification technique. We identified close to 1600 direct targets for CBX2. Intriguingly, validation of selected candidate genes using qRT-PCR in cells overexpressing CBX2, or in which CBX2 has been knocked down, indicated that several CBX2-responsive genes encode proteins that are involved in DSD. We further validated these effects on the candidate genes using a mutated CBX2 that causes DSD in a human patient. Overall, our findings suggest that CBX2's role in the sex development cascade is to stimulate the male pathway and concurrently inhibit the female pathway. These data provide fundamental insights into the potential etiology of DSD. PMID:25569159

  1. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Thanks to current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces, and line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures to urban scene abstraction.
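
    The geometric primitive behind the extracted segments is the line where two fitted planes meet. A minimal sketch of that computation, with planes in n·x = d form and illustrative parameters:

    ```python
    # Intersection line of two planes given as n . x = d. The line
    # direction is the cross product of the normals; a point on the
    # line is found by adding the constraint direction . p = 0.
    import numpy as np

    def plane_intersection(n1, d1, n2, d2):
        """Return (point, unit direction) of the line where two planes meet."""
        n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
        direction = np.cross(n1, n2)
        if np.linalg.norm(direction) < 1e-9:
            raise ValueError("planes are parallel")
        # solve n1.p = d1, n2.p = d2, direction.p = 0
        A = np.vstack([n1, n2, direction])
        b = np.array([d1, d2, 0.0])
        return np.linalg.solve(A, b), direction / np.linalg.norm(direction)

    # planes z = 0 and x = 2 meet along the line x = 2, z = 0
    p, d = plane_intersection([0, 0, 1], 0.0, [1, 0, 0], 2.0)
    print(p, d)  # a point on the line and its unit direction
    ```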

  2. Protein crystal growth in microgravity review of large scale temperature induction method: Bovine insulin, human insulin and human α-interferon

    NASA Astrophysics Data System (ADS)

    Long, Marianna M.; Bishop, John Bradford; Delucas, Lawrence J.; Nagabhushan, Tattanhalli L.; Reichert, Paul; Smith, G. David

    1997-01-01

    The Protein Crystal Growth Facility (PCF) is space-flight hardware that accommodates large-scale protein crystal growth experiments using temperature change as the inductive step. Recent modifications include specialized instrumentation for monitoring crystal nucleation with laser light scattering. This paper reviews results from its first seven flights on the Space Shuttle, the last with laser light scattering instrumentation in place. The PCF's objective is twofold: (1) the production of high-quality protein crystals for x-ray analysis and subsequent structure-based drug design, and (2) the preparation of a large quantity of relatively contaminant-free crystals for use as time-release protein pharmaceuticals. The first three Shuttle flights with bovine insulin constituted the PCF's proof of concept, demonstrating that the space-grown crystals were larger and diffracted to higher resolution than their earth-grown counterparts. The later four PCF missions were used to grow recombinant human insulin crystals for x-ray analysis and to continue production trials aimed at the development of a processing facility for crystalline recombinant α-interferon.

  3. Scaling identity connects human mobility and social interactions.

    PubMed

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D; Barabási, Albert-László; Wang, Dashun

    2016-06-28

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality.

  4. Scaling identity connects human mobility and social interactions

    PubMed Central

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D.; Barabási, Albert-László; Wang, Dashun

    2016-01-01

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality. PMID:27274050

  5. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  6. Automated selected reaction monitoring data analysis workflow for large-scale targeted proteomic studies.

    PubMed

    Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi

    2013-08-01

    Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
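
    As an illustration of the significance-analysis step, the sketch below fits a linear mixed-effect model to toy log-intensities in the spirit of SRMstats, using Python's statsmodels rather than the original R tooling; the column names and data are assumptions.

    ```python
    # Toy protein-level significance test: fixed effect for condition,
    # random intercept per subject to absorb within-sample variability.
    # Data and column names are invented for illustration.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "log_intensity": [20.1, 20.3, 19.8, 18.9, 19.0, 18.7,
                          20.0, 20.2, 19.9, 18.8, 19.1, 18.6],
        "group":   ["cancer"] * 3 + ["benign"] * 3
                 + ["cancer"] * 3 + ["benign"] * 3,
        "subject": ["s1"] * 3 + ["s2"] * 3 + ["s3"] * 3 + ["s4"] * 3,
    })

    model = smf.mixedlm("log_intensity ~ group", df, groups=df["subject"]).fit()
    print(model.summary())  # the group coefficient tests the condition effect
    ```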

  7. Evaluation of human response to blasting vibration from excavation of a large scale rock slope: A case study

    NASA Astrophysics Data System (ADS)

    Yan, Peng; Lu, Wenbo; Zhang, Jing; Zou, Yujun; Chen, Ming

    2017-04-01

    Ground vibration, as the most critical public hazard of blasting, has received much attention from the community. Many countries have established national standards to limit vibration impact on structures, but a world-accepted blasting vibration criterion for human safety is still missing. In order to evaluate human response to the vibration from blasting excavation of a large-scale rock slope in China, this study suggests a revised criterion. The vibration frequency was introduced to improve the existing single-factor (peak particle velocity) standard recommended by the United States Bureau of Mines (USBM). The feasibility of the new criterion was checked based on field vibration monitoring and investigation of human reactions. Moreover, air overpressure and blast effects on human beings are also discussed. The results indicate that the entire zone of influence can be divided into three subzones according to the revised safety standard: a severe-annoyance, a light-annoyance and a perception zone. Both the construction company and local residents have provided positive comments on this influence-degree assessment, which indicates that the presented criterion is suitable for evaluating human response to nearby blasts. Nevertheless, this specific criterion needs more field tests and verification before it can be applied more widely.

  8. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    PubMed

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing for the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general purpose one billion DNA barcode library, the largest such library yet reported in literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
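
    A naive version of the constraint checking that such a framework accelerates might look like the sketch below. The GC window, homopolymer cap, distance threshold, and blacklist are illustrative assumptions, and the pairwise Hamming scan is precisely the bottleneck the reported framework is designed to avoid.

    ```python
    # Naive DNA barcode admission filter: GC content window, maximum
    # homopolymer run, blacklisted subsequences, and minimum pairwise
    # Hamming distance to every barcode already accepted. Thresholds
    # are illustrative assumptions, not the paper's parameters.
    import re

    def gc_fraction(seq):
        return (seq.count("G") + seq.count("C")) / len(seq)

    def max_homopolymer(seq):
        return max(len(m.group()) for m in re.finditer(r"(.)\1*", seq))

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def admissible(seq, accepted, blacklist=("GGGG",), min_dist=3):
        if not 0.4 <= gc_fraction(seq) <= 0.6:
            return False
        if max_homopolymer(seq) > 3:
            return False
        if any(b in seq for b in blacklist):
            return False
        # quadratic scan; the published framework replaces this step
        return all(hamming(seq, other) >= min_dist for other in accepted)

    accepted = []
    for candidate in ["ACGTACGTAC", "ACGTACGTAA", "TTGCAGTCAG"]:
        if admissible(candidate, accepted):
            accepted.append(candidate)
    print(accepted)  # second candidate fails the Hamming-distance check
    ```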

  9. Highly Efficient Large-Scale Lentiviral Vector Concentration by Tandem Tangential Flow Filtration

    PubMed Central

    Cooper, Aaron R.; Patel, Sanjeet; Senadheera, Shantha; Plath, Kathrin; Kohn, Donald B.; Hollis, Roger P.

    2014-01-01

    Large-scale lentiviral vector (LV) concentration can be inefficient and time consuming, often involving multiple rounds of filtration and centrifugation. This report describes a simpler method using two tangential flow filtration (TFF) steps to concentrate liter-scale volumes of LV supernatant, achieving in excess of 2000-fold concentration in less than 3 hours with very high recovery (>97%). Large volumes of LV supernatant can be produced easily through the use of multi-layer flasks, each having 1720 cm2 surface area and producing ~560 mL of supernatant per flask. Combining the use of such flasks and TFF greatly simplifies large-scale production of LV. As a demonstration, the method is used to produce a very high titer LV (>1010 TU/mL) and transduce primary human CD34+ hematopoietic stem/progenitor cells at high final vector concentrations with no overt toxicity. A complex LV (STEMCCA) for induced pluripotent stem cell generation is also concentrated from low initial titer and used to transduce and reprogram primary human fibroblasts with no overt toxicity. Additionally, a generalized and simple multiplexed real-time PCR assay is described for lentiviral vector titer and copy number determination. PMID:21784103

  10. Large Scale System Safety Integration for Human Rated Space Vehicles

    NASA Astrophysics Data System (ADS)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess the risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare truly integrated analyses, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, when one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common ground among all parties to reach an agreed position on acceptable risk.

  11. Shift of large-scale atmospheric systems over Europe during late MIS 3 and implications for Modern Human dispersal.

    PubMed

    Obreht, Igor; Hambach, Ulrich; Veres, Daniel; Zeeden, Christian; Bösken, Janina; Stevens, Thomas; Marković, Slobodan B; Klasen, Nicole; Brill, Dominik; Burow, Christoph; Lehmkuhl, Frank

    2017-07-19

    Understanding the past dynamics of large-scale atmospheric systems is crucial for our knowledge of the palaeoclimate conditions in Europe. Southeastern Europe currently lies at the border between Atlantic, Mediterranean, and continental climate zones. Past changes in the relative influence of associated atmospheric systems must have been recorded in the region's palaeoarchives. By comparing high-resolution grain-size, environmental magnetic and geochemical data from two loess-palaeosol sequences in the Lower Danube Basin with other Eurasian palaeorecords, we reconstructed past climatic patterns over Southeastern Europe and the related interaction of the prevailing large-scale circulation modes over Europe, especially during late Marine Isotope Stage 3 (40,000-27,000 years ago). We demonstrate that during this time interval, the intensification of the Siberian High had a crucial influence on European climate causing the more continental conditions over major parts of Europe, and a southwards shift of the Westerlies. Such a climatic and environmental change, combined with the Campanian Ignimbrite/Y-5 volcanic eruption, may have driven the Anatomically Modern Human dispersal towards Central and Western Europe, pointing to a corridor over the Eastern European Plain as an important pathway in their dispersal.

  12. Large-scale Scanning Transmission Electron Microscopy (Nanotomy) of Healthy and Injured Zebrafish Brain.

    PubMed

    Kuipers, Jeroen; Kalicharan, Ruby D; Wolters, Anouk H G; van Ham, Tjakko J; Giepmans, Ben N G

    2016-05-25

    Large-scale 2D electron microscopy (EM), or nanotomy, is the tissue-wide application of nanoscale-resolution electron microscopy. Others and we previously applied large-scale EM to human skin, pancreatic islets, tissue culture and whole zebrafish larvae(1-7). Here we describe a universally applicable method for tissue-scale scanning EM for unbiased detection of sub-cellular and molecular features. Nanotomy was applied to investigate the healthy and a neurodegenerative zebrafish brain. Our method is based on standardized EM sample preparation protocols: fixation with glutaraldehyde and osmium, followed by epoxy-resin embedding, ultrathin sectioning and mounting of ultrathin sections on one-hole grids, followed by post-staining with uranyl and lead. Large-scale 2D EM mosaic images are acquired using a scanning EM connected to an external large-area scan generator using scanning transmission EM (STEM). Large-scale EM images are typically ~5-50 gigapixels in size, and best viewed using zoomable HTML files, which can be opened in any web browser, similar to online geographical HTML maps. This method can be applied to (human) tissue, cross sections of whole animals as well as tissue culture(1-5). Here, zebrafish brains were analyzed in a non-invasive neuronal ablation model. We visualize within a single dataset tissue, cellular and subcellular changes which can be quantified in various cell types including neurons and microglia, the brain's macrophages. In addition, nanotomy facilitates the correlation of EM with light microscopy (CLEM)(8) on the same tissue, as large surface areas previously imaged using fluorescence microscopy can subsequently be subjected to large-area EM, resulting in the nano-anatomy (nanotomy) of tissues. In all, nanotomy allows unbiased detection of features at the EM level in a tissue-wide quantifiable manner.

  13. Large-scale Scanning Transmission Electron Microscopy (Nanotomy) of Healthy and Injured Zebrafish Brain

    PubMed Central

    Kuipers, Jeroen; Kalicharan, Ruby D.; Wolters, Anouk H. G.

    2016-01-01

    Large-scale 2D electron microscopy (EM), or nanotomy, is the tissue-wide application of nanoscale-resolution electron microscopy. Others and we previously applied large-scale EM to human skin, pancreatic islets, tissue culture and whole zebrafish larvae(1-7). Here we describe a universally applicable method for tissue-scale scanning EM for unbiased detection of sub-cellular and molecular features. Nanotomy was applied to investigate the healthy and a neurodegenerative zebrafish brain. Our method is based on standardized EM sample preparation protocols: fixation with glutaraldehyde and osmium, followed by epoxy-resin embedding, ultrathin sectioning and mounting of ultrathin sections on one-hole grids, followed by post-staining with uranyl and lead. Large-scale 2D EM mosaic images are acquired using a scanning EM connected to an external large-area scan generator using scanning transmission EM (STEM). Large-scale EM images are typically ~5-50 gigapixels in size, and best viewed using zoomable HTML files, which can be opened in any web browser, similar to online geographical HTML maps. This method can be applied to (human) tissue, cross sections of whole animals as well as tissue culture(1-5). Here, zebrafish brains were analyzed in a non-invasive neuronal ablation model. We visualize within a single dataset tissue, cellular and subcellular changes which can be quantified in various cell types including neurons and microglia, the brain's macrophages. In addition, nanotomy facilitates the correlation of EM with light microscopy (CLEM)(8) on the same tissue, as large surface areas previously imaged using fluorescence microscopy can subsequently be subjected to large-area EM, resulting in the nano-anatomy (nanotomy) of tissues. In all, nanotomy allows unbiased detection of features at the EM level in a tissue-wide quantifiable manner. PMID:27285162

  14. Large-scale restoration mitigate land degradation and support the establishment of green infrastructure

    NASA Astrophysics Data System (ADS)

    Tóthmérész, Béla; Mitchley, Jonathan; Jongepierová, Ivana; Baasch, Annett; Fajmon, Karel; Kirmer, Anita; Prach, Karel; Řehounková, Klára; Tischew, Sabine; Twiston-Davies, Grace; Dutoit, Thierry; Buisson, Elise; Jeunatre, Renaud; Valkó, Orsolya; Deák, Balázs; Török, Péter

    2017-04-01

    To sustain human well-being and quality of life, it is essential to develop and support green infrastructure (a strategically planned network of natural and semi-natural areas with other environmental features, designed and managed to deliver a wide range of ecosystem services). For developing and sustaining green infrastructure, the conservation and restoration of biodiversity in natural and traditionally managed habitats is essential. Species-rich landscapes in Europe have been maintained over centuries by various kinds of low-intensity use. Recently, they have suffered losses in extent and diversity due to land degradation by intensification or abandonment. Conservation of landscape-scale biodiversity requires the maintenance of species-rich habitats and the restoration of lost grasslands. We focus on landscape-level restoration studies including multiple sites across a wide geographical range (including the Czech Republic, France, Germany, Hungary, and the UK). In a European-wide perspective we address four specific questions: (i) What were the aims and objectives of landscape-scale restoration? (ii) What results have been achieved? (iii) What are the costs of large-scale restoration? (iv) What policy tools are available for the restoration of landscape-scale biodiversity? We conclude that landscape-level restoration offers exciting new opportunities to reconnect long-disrupted ecological processes and to restore landscape connectivity. Generally, these measures enhance biodiversity at the landscape scale. The development of policy tools to achieve restoration at the landscape scale is essential for the achievement of the ambitious targets of the Convention on Biological Diversity and the European Biodiversity Strategy for ecosystem restoration.

  15. Large-scale production of embryonic red blood cells from human embryonic stem cells.

    PubMed

    Olivier, Emmanuel N; Qiu, Caihong; Velho, Michelle; Hirsch, Rhoda Elison; Bouhassira, Eric E

    2006-12-01

    To develop a method to produce in culture large numbers of erythroid cells from human embryonic stem cells. Human H1 embryonic stem cells were differentiated into hematopoietic cells by coculture with a human fetal liver cell line, and the resulting CD34-positive cells were expanded in vitro in liquid culture using a three-step method. The erythroid cells produced were then analyzed by light microscopy and flow cytometry. Globin expression was characterized by quantitative reverse-transcriptase polymerase chain reaction and by high-performance liquid chromatography. CD34-positive cells produced from human embryonic stem cells could be efficiently differentiated into erythroid cells in liquid culture, leading to a more than 5000-fold increase in cell number. The erythroid cells produced are similar to the primitive erythroid cells present in the yolk sac of early human embryos and did not enucleate. They are fully hemoglobinized and express a mixture of embryonic and fetal globins but no beta-globin. We have developed an experimental protocol to produce large numbers of primitive erythroid cells starting from undifferentiated human embryonic stem cells. As the earliest human erythroid cells, the nucleated primitive erythroblasts, are not well characterized because experimental material at this stage of development is very difficult to obtain, this system should prove useful for answering a number of experimental questions regarding the biology of these cells. In addition, the production of mature red blood cells from human embryonic stem cells is of great potential practical importance because it could eventually become an alternative source of cells for transfusion.

  16. Emergence of scaling in human-interest dynamics.

    PubMed

    Zhao, Zhi-Dan; Yang, Zimo; Zhang, Zike; Zhou, Tao; Huang, Zi-Gang; Lai, Ying-Cheng

    2013-12-11

    Human behaviors are often driven by human interests. Despite intense recent efforts in exploring the dynamics of human behaviors, little is known about human-interest dynamics, partly due to the extreme difficulty in accessing the human mind from observations. However, the availability of large-scale data, such as those from e-commerce and smart-phone communications, makes it possible to probe into and quantify the dynamics of human interest. Using three prototypical "Big Data" sets, we investigate the scaling behaviors associated with human-interest dynamics. In particular, from the data sets we uncover fat-tailed (possibly power-law) distributions associated with the three basic quantities: (1) the length of continuous interest, (2) the return time of visiting certain interest, and (3) interest ranking and transition. We argue that there are three basic ingredients underlying human-interest dynamics: preferential return to previously visited interests, inertial effect, and exploration of new interests. We develop a biased random-walk model, incorporating the three ingredients, to account for the observed fat-tailed distributions. Our study represents the first attempt to understand the dynamical processes underlying human interest, which has significant applications in science and engineering, commerce, as well as defense, in terms of specific tasks such as recommendation and human-behavior prediction.
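
    A minimal simulation of the three ingredients named above, under illustrative parameter choices; this is a toy reading of the model, not the authors' published code.

    ```python
    # Toy biased random walk over "interests": with probability
    # `inertia` the walker repeats the current interest (inertial
    # effect); otherwise, with probability `rho` it explores a new
    # interest, and with the remaining probability it preferentially
    # returns to a past interest in proportion to visit counts.
    import random

    def simulate(steps=10000, rho=0.1, inertia=0.3, seed=7):
        rng = random.Random(seed)
        visits = {0: 1}          # interest id -> visit count
        current = 0
        trajectory = [0]
        for _ in range(steps):
            if rng.random() < inertia:
                nxt = current                 # inertial repeat
            elif rng.random() < rho:
                nxt = max(visits) + 1         # explore a new interest
            else:                             # preferential return
                r, acc = rng.random() * sum(visits.values()), 0
                for k, c in visits.items():
                    acc += c
                    if r <= acc:
                        nxt = k
                        break
            visits[nxt] = visits.get(nxt, 0) + 1
            trajectory.append(nxt)
            current = nxt
        return trajectory, visits

    traj, visits = simulate()
    print("distinct interests visited:", len(visits))
    ```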

  17. Emergence of scaling in human-interest dynamics

    PubMed Central

    Zhao, Zhi-Dan; Yang, Zimo; Zhang, Zike; Zhou, Tao; Huang, Zi-Gang; Lai, Ying-Cheng

    2013-01-01

    Human behaviors are often driven by human interests. Despite intense recent efforts in exploring the dynamics of human behaviors, little is known about human-interest dynamics, partly due to the extreme difficulty in accessing the human mind from observations. However, the availability of large-scale data, such as those from e-commerce and smart-phone communications, makes it possible to probe into and quantify the dynamics of human interest. Using three prototypical “Big Data” sets, we investigate the scaling behaviors associated with human-interest dynamics. In particular, from the data sets we uncover fat-tailed (possibly power-law) distributions associated with the three basic quantities: (1) the length of continuous interest, (2) the return time of visiting certain interest, and (3) interest ranking and transition. We argue that there are three basic ingredients underlying human-interest dynamics: preferential return to previously visited interests, inertial effect, and exploration of new interests. We develop a biased random-walk model, incorporating the three ingredients, to account for the observed fat-tailed distributions. Our study represents the first attempt to understand the dynamical processes underlying human interest, which has significant applications in science and engineering, commerce, as well as defense, in terms of specific tasks such as recommendation and human-behavior prediction. PMID:24326949

  18. Emergence of scaling in human-interest dynamics

    NASA Astrophysics Data System (ADS)

    Zhao, Zhi-Dan; Yang, Zimo; Zhang, Zike; Zhou, Tao; Huang, Zi-Gang; Lai, Ying-Cheng

    2013-12-01

    Human behaviors are often driven by human interests. Despite intense recent efforts in exploring the dynamics of human behaviors, little is known about human-interest dynamics, partly due to the extreme difficulty in accessing the human mind from observations. However, the availability of large-scale data, such as those from e-commerce and smart-phone communications, makes it possible to probe into and quantify the dynamics of human interest. Using three prototypical "Big Data" sets, we investigate the scaling behaviors associated with human-interest dynamics. In particular, from the data sets we uncover fat-tailed (possibly power-law) distributions associated with the three basic quantities: (1) the length of continuous interest, (2) the return time of visiting certain interest, and (3) interest ranking and transition. We argue that there are three basic ingredients underlying human-interest dynamics: preferential return to previously visited interests, inertial effect, and exploration of new interests. We develop a biased random-walk model, incorporating the three ingredients, to account for the observed fat-tailed distributions. Our study represents the first attempt to understand the dynamical processes underlying human interest, which has significant applications in science and engineering, commerce, as well as defense, in terms of specific tasks such as recommendation and human-behavior prediction.

  19. Large-Scale Outflows in Seyfert Galaxies

    NASA Astrophysics Data System (ADS)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  20. Synchronization of coupled large-scale Boolean networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Fangfei, E-mail: li-fangfei@163.com

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the effectiveness of the proposed results.
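
    A toy instance of drive-response coupling between two Boolean networks, far smaller than the large-scale systems treated in the paper and with invented update rules, can make the notion of complete synchronization concrete:

    ```python
    # Two-node Boolean networks with unidirectional (drive-response)
    # coupling: the drive evolves freely and the response's update rule
    # uses the drive's state, so the states coincide after one step.
    # Rules are invented for illustration.
    def step_drive(x):
        x1, x2 = x
        return (x1 and not x2, x1 or x2)

    def step_response(y, x):
        # response update driven entirely by the drive state
        return step_drive(x)

    x, y = (True, False), (False, False)  # start unsynchronized
    for t in range(5):
        x, y = step_drive(x), step_response(y, x)
        print(t, x, y, "synchronized" if x == y else "-")
    ```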

  1. Targeted exploration and analysis of large cross-platform human transcriptomic compendia

    PubMed Central

    Zhu, Qian; Wong, Aaron K; Krishnan, Arjun; Aure, Miriam R; Tadych, Alicja; Zhang, Ran; Corney, David C; Greene, Casey S; Bongo, Lars A; Kristensen, Vessela N; Charikar, Moses; Li, Kai; Troyanskaya, Olga G.

    2016-01-01

    We present SEEK (http://seek.princeton.edu), a query-based search engine across very large transcriptomic data collections, including thousands of human data sets from almost 50 microarray and next-generation sequencing platforms. SEEK uses a novel query-level cross-validation-based algorithm to automatically prioritize data sets relevant to the query and a robust search approach to identify query-coregulated genes, pathways, and processes. SEEK provides cross-platform handling, multi-gene query search, iterative metadata-based search refinement, and extensive visualization-based analysis options. PMID:25581801

  2. Reconstruction of genome-scale human metabolic models using omics data.

    PubMed

    Ryu, Jae Yong; Kim, Hyun Uk; Lee, Sang Yup

    2015-08-01

    The impact of genome-scale human metabolic models on human systems biology and medical sciences is growing, thanks to the increasing number of model-building platforms and the volume of publicly available omics data. Genome-scale human metabolic models started with Recon 1 in 2007, and have since been used to describe metabolic phenotypes of healthy and diseased human tissues and cells, and to predict therapeutic targets. Here we review recent trends in genome-scale human metabolic modeling, including the various generic and tissue/cell type-specific human metabolic models developed to date, and the methods, databases and platforms used to construct them. For generic human metabolic models, we pay particular attention to Recon 2 and HMR 2.0, with emphasis on the data sources used to construct them. Draft and high-quality tissue/cell type-specific human metabolic models have been generated using these generic human metabolic models. Integration of tissue/cell type-specific omics data with the generic human metabolic models is the key step, and we discuss the omics data and integration methods needed to achieve this task. The initial version of a tissue/cell type-specific human metabolic model can be further refined computationally through gap filling, reaction directionality assignment and subcellular localization of metabolic reactions. We review relevant tools for this model refinement procedure as well. Finally, we suggest directions for further studies on reconstructing an improved human metabolic model.
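
    Once reconstructed, such models are typically interrogated by flux balance analysis. Below is a minimal sketch using the open-source COBRApy toolkit, a common choice rather than one prescribed by this review; the SBML file name is a placeholder for a locally downloaded model.

    ```python
    # Load a genome-scale metabolic model from SBML, run flux balance
    # analysis, and probe a candidate target by gene knockout. The
    # file name "Recon2.xml" is a placeholder.
    import cobra

    model = cobra.io.read_sbml_model("Recon2.xml")  # hypothetical local copy
    solution = model.optimize()                     # flux balance analysis
    print("objective flux:", solution.objective_value)

    # knock out one gene to see its effect on the objective
    gene = model.genes[0]
    gene.knock_out()
    print("flux after knockout:", model.optimize().objective_value)
    ```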

  3. Development of large-scale manufacturing of adipose-derived stromal cells for clinical applications using bioreactors and human platelet lysate.

    PubMed

    Haack-Sørensen, Mandana; Juhl, Morten; Follin, Bjarke; Harary Søndergaard, Rebekka; Kirchhoff, Maria; Kastrup, Jens; Ekblond, Annette

    2018-04-17

    In vitro expanded adipose-derived stromal cells (ASCs) are a useful resource for tissue regeneration. Translation of small-scale autologous cell production into a large-scale, allogeneic production process for clinical applications necessitates well-chosen raw materials and a cell culture platform. We compare the use of clinical-grade human platelet lysate (hPL) and fetal bovine serum (FBS) as growth supplements for ASC expansion in the automated, closed hollow-fibre Quantum cell expansion system (bioreactor). Stromal vascular fractions (SVF) were isolated from human subcutaneous abdominal fat. On average, 95 × 10^6 cells were suspended in 10% FBS or 5% hPL medium and loaded into a bioreactor coated with cryoprecipitate. ASCs (P0) were harvested, and 30 × 10^6 ASCs were reloaded for continued expansion (P1). Feeding rate and time of harvest were guided by metabolic monitoring. Viability, sterility, purity, differentiation capacity, and genomic stability of ASCs P1 were determined. Cultivation of SVF in hPL medium for nine days on average yielded 546 × 10^6 ASCs, compared to 111 × 10^6 ASCs after 17 days in FBS medium. ASC P1 yields were on average 605 × 10^6 ASCs (PD [population doublings]: 4.65) after six days in hPL medium, compared to 119 × 10^6 ASCs (PD: 2.45) after 21 days in FBS medium. ASCs fulfilled ISCT criteria and demonstrated genomic stability and sterility. The use of hPL as a growth supplement for ASC expansion in the Quantum cell expansion system provides a more efficient expansion process than FBS, while maintaining cell quality appropriate for clinical use. The described process is an obvious choice for the manufacturing of large-scale allogeneic ASC products.

  4. Dissecting the large-scale galactic conformity

    NASA Astrophysics Data System (ADS)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). One-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities even though they do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  5. Galaxies and large scale structure at high redshifts

    PubMed Central

    Steidel, Charles C.

    1998-01-01

    It is now straightforward to assemble large samples of very high redshift (z ∼ 3) field galaxies selected by their pronounced spectral discontinuity at the rest frame Lyman limit of hydrogen (at 912 Å). This makes possible both statistical analyses of the properties of the galaxies and the first direct glimpse of the progression of the growth of their large-scale distribution at such an early epoch. Here I present a summary of the progress made in these areas to date and some preliminary results of and future plans for a targeted redshift survey at z = 2.7–3.4. Also discussed is how the same discovery method may be used to obtain a “census” of star formation in the high redshift Universe, and the current implications for the history of galaxy formation as a function of cosmic epoch. PMID:9419319

  6. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities are challenged with little experience on how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings must continually be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There are a variety of measures suited to support universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces through which project management, university administration, researchers and international partners can work together, exchange information and improve processes, in order to recruit, support and retain the brightest minds for a project.

  7. XLID-Causing Mutations and Associated Genes Challenged in Light of Data From Large-Scale Human Exome Sequencing

    PubMed Central

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-01-01

    Because of the unbalanced sex ratio (1.3–1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. PMID:23871722
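
    The reassessment strategy described above boils down to asking whether reportedly pathogenic variants are too common in control X chromosomes to cause a rare monogenic disorder. A minimal sketch of that logic follows; the gene names, counts, and the 0.1% cutoff are illustrative assumptions, not values from the paper or the NHLBI ESP data.

```python
# A minimal sketch of the kind of reassessment described above: flag genes whose
# reportedly pathogenic variants appear "too often" in a large control cohort.
# The counts, gene names, and the 0.1% frequency cutoff are illustrative
# assumptions, not values from the NHLBI ESP or the paper.
N_CONTROL_X_CHROMOSOMES = 10_563  # X chromosomes in the control cohort

# gene -> number of control X chromosomes carrying a truncating variant or a
# previously published "mutation" (hypothetical numbers).
observed_counts = {"GENE_A": 14, "GENE_B": 0, "GENE_C": 3}

MAX_PLAUSIBLE_FREQ = 0.001  # far above what a fully penetrant XLID allele allows

for gene, count in observed_counts.items():
    freq = count / N_CONTROL_X_CHROMOSOMES
    verdict = "questionable" if freq > MAX_PLAUSIBLE_FREQ else "consistent"
    print(f"{gene}: control frequency {freq:.2%} -> {verdict} as a monogenic XLID gene")
```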

  8. Large-scale model quality assessment for improving protein tertiary structure prediction.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-06-15

    Sampling structural models and ranking them are the two major challenges of protein structure prediction. Traditional protein structure prediction methods generally use one or a few quality assessment (QA) methods to select the best-predicted models, which cannot consistently select relatively better models or rank a large number of models well. Here, we develop a novel large-scale model QA method, used in conjunction with model clustering, to rank and select protein structural models. It applies an unprecedented 14 model QA methods to generate consensus model rankings, followed by model refinement based on model combination (i.e., averaging). Our experiment demonstrates that the large-scale model QA approach is more consistent and robust in selecting models of better quality than any individual QA method. Our method was blindly tested during the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM group. It was officially ranked third out of all 143 human and server predictors according to the total scores of the first models predicted for 78 CASP11 protein domains, and second according to the total scores of the best of the five models predicted for these domains. MULTICOM's outstanding performance in the extremely competitive 2014 CASP11 experiment proves that our large-scale QA approach together with model clustering is a promising solution to one of the two major problems in protein structure modeling. The web server is available at: http://sysbio.rnet.missouri.edu/multicom_cluster/human/. © The Author 2015. Published by Oxford University Press.
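
    The core idea above, consensus ranking across many QA methods, can be sketched in a few lines: normalize each method's scores and average them per model. The method names and scores below are placeholders; the actual MULTICOM pipeline additionally uses model clustering and refinement by model averaging.

```python
# A sketch of consensus model ranking in the spirit described above: average the
# normalized scores that several QA methods assign to each structural model and
# rank by the mean. The methods and scores are made-up placeholders.
import numpy as np

models = ["model_1", "model_2", "model_3"]
# rows: QA methods, columns: models (hypothetical quality scores in [0, 1])
qa_scores = np.array([
    [0.61, 0.78, 0.55],   # QA method A
    [0.58, 0.71, 0.60],   # QA method B
    [0.66, 0.80, 0.52],   # QA method C
])

# Normalize each QA method to zero mean / unit variance so no single method
# dominates, then average across methods.
z = (qa_scores - qa_scores.mean(axis=1, keepdims=True)) / qa_scores.std(axis=1, keepdims=True)
consensus = z.mean(axis=0)

for name, score in sorted(zip(models, consensus), key=lambda t: -t[1]):
    print(f"{name}: consensus z-score {score:+.2f}")
```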

  9. Impact of target mRNA structure on siRNA silencing efficiency: A large-scale study.

    PubMed

    Gredell, Joseph A; Berger, Angela K; Walton, S Patrick

    2008-07-01

    The selection of active siRNAs is generally based on identifying siRNAs with certain sequence and structural properties. However, the efficiency of RNA interference has also been shown to depend on the structure of the target mRNA, primarily through studies using exogenous transcripts with well-defined secondary structures in the vicinity of the target sequence. While these studies provide a means for examining the impact of target sequence and structure independently, the predicted secondary structures for these transcripts are often not reflective of structures that form in full-length, native mRNAs, where interactions can occur between relatively remote segments of the mRNAs. Here, using a combination of experimental results and analysis of a large dataset, we demonstrate that the accessibility of certain local target structures on the mRNA is an important determinant of the gene silencing ability of siRNAs. siRNAs targeting the enhanced green fluorescent protein were chosen using a minimal siRNA selection algorithm, followed by classification based on the predicted minimum free energy structures of the target transcripts. Transfection into HeLa and HepG2 cells revealed that siRNAs targeting regions of the mRNA predicted to have unpaired 5'- and 3'-ends resulted in greater gene silencing than regions predicted to have other types of secondary structure. These results were confirmed by analysis of gene silencing data from previously published siRNAs, which showed that mRNA target regions unpaired at either the 5'-end or 3'-end were silenced, on average, approximately 10% more strongly than target regions unpaired in the center or primarily paired throughout. We found this effect to be independent of the structure of the siRNA guide strand. Taken together, these results suggest minimal requirements for nucleation of hybridization between the siRNA guide strand and mRNA, and that both mRNA and guide strand structure should be considered when choosing candidate siRNAs.
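
    The classification step described above can be illustrated with a short sketch: given a predicted minimum-free-energy structure in dot-bracket notation and a target site, report which ends of the site are predicted unpaired. The toy structure, coordinates, and the four-base probe window are assumptions, not the paper's exact criteria.

```python
# A small sketch of the structural classification used above: given a predicted
# minimum-free-energy structure in dot-bracket notation (e.g., from RNAfold) and
# a target site, report whether the site's 5' end, 3' end, both, or neither is
# predicted unpaired. The structure and coordinates are a made-up toy example.
def site_accessibility(dot_bracket: str, start: int, end: int, probe: int = 4) -> str:
    """Classify target site [start, end) by unpaired ('.') ends of the site."""
    site = dot_bracket[start:end]
    five_open = all(c == "." for c in site[:probe])    # unpaired 5' end
    three_open = all(c == "." for c in site[-probe:])  # unpaired 3' end
    if five_open and three_open:
        return "both ends unpaired"
    if five_open:
        return "5'-end unpaired"
    if three_open:
        return "3'-end unpaired"
    return "center unpaired or mostly paired"

mfe_structure = "((((....))))....((((((.......))))))"  # toy dot-bracket string
print(site_accessibility(mfe_structure, start=12, end=22))  # -> "5'-end unpaired"
```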

  10. Impact of target mRNA structure on siRNA silencing efficiency: a large-scale study

    PubMed Central

    Gredell, Joseph A.; Berger, Angela K.; Walton, S. Patrick

    2009-01-01

    The selection of active siRNAs is generally based on identifying siRNAs with certain sequence and structural properties. However, the efficiency of RNA interference has also been shown to depend on the structure of the target mRNA, primarily through studies using exogenous transcripts with well-defined secondary structures in the vicinity of the target sequence. While these studies provide a means for examining the impact of target sequence and structure independently, the predicted secondary structures for these transcripts are often not reflective of structures that form in full-length, native mRNAs where interactions can occur between relatively remote segments of the mRNAs. Here, using a combination of experimental results and analysis of a large dataset, we demonstrate that the accessibility of certain local target structures on the mRNA is an important determinant in the gene silencing ability of siRNAs. siRNAs targeting the enhanced green fluorescent protein were chosen using a minimal siRNA selection algorithm followed by classification based on the predicted minimum free energy structures of the target transcripts. Transfection into HeLa and HepG2 cells revealed that siRNAs targeting regions of the mRNA predicted to have unpaired 5’- and 3’-ends resulted in greater gene silencing than regions predicted to have other types of secondary structure. These results were confirmed by analysis of gene silencing data from previously published siRNAs, which showed that mRNA target regions unpaired at either the 5’-end or 3’-end were silenced, on average, ~10% more strongly than target regions unpaired in the center or primarily paired throughout. We found this effect to be independent of the structure of the siRNA guide strand. Taken together, these results suggest minimal requirements for nucleation of hybridization between the siRNA guide strand and mRNA and that both mRNA and guide strand structure should be considered when choosing candidate siRNAs. PMID

  11. The Large-Scale Distribution of Galaxies

    NASA Astrophysics Data System (ADS)

    Flin, Piotr

    A review of the large-scale structure of the Universe is given. A connection is made with the titanic work by Johannes Kepler in many areas of astronomy and cosmology. Special attention is paid to the spatial distribution of galaxies, voids and walls (the cellular structure of the Universe). Finally, the author concludes that the large-scale structure of the Universe can be observed on a much greater scale than was thought twenty years ago.

  12. An evolutionary theory of large-scale human warfare: Group-structured cultural selection.

    PubMed

    Zefferman, Matthew R; Mathew, Sarah

    2015-01-01

    When humans wage war, it is not unusual for battlefields to be strewn with dead warriors. These warriors typically were men in their reproductive prime who, had they not died in battle, might have gone on to father more children. Typically, they are also genetically unrelated to one another. We know of no other animal species in which reproductively capable, genetically unrelated individuals risk their lives in this manner. Because the immense private costs borne by individual warriors create benefits that are shared widely by others in their group, warfare is a stark evolutionary puzzle that is difficult to explain. Although several scholars have posited models of the evolution of human warfare, these models do not adequately explain how humans solve the problem of collective action in warfare at the evolutionarily novel scale of hundreds of genetically unrelated individuals. We propose that group-structured cultural selection explains this phenomenon. © 2015 Wiley Periodicals, Inc.

  13. Atomic scale chemical tomography of human bone

    NASA Astrophysics Data System (ADS)

    Langelier, Brian; Wang, Xiaoyue; Grandfield, Kathryn

    2017-01-01

    Human bone is a complex hierarchical material. Understanding bone structure and its corresponding composition at the nanometer scale is critical for elucidating mechanisms of biomineralization under healthy and pathological states. However, the three-dimensional structure and chemical nature of bone remain largely unexplored at the nanometer scale due to the challenges associated with characterizing both the structural and chemical integrity of bone simultaneously. Here, we use correlative transmission electron microscopy and atom probe tomography for the first time, to our knowledge, to reveal structures in human bone at the atomic level. This approach provides an overlaying chemical map of the organic and inorganic constituents of bone on its structure. This first use of atom probe tomography on human bone reveals local gradients, detection of the trace element Mg, and the co-localization of Na with the inorganic-organic interface of bone mineral and collagen fibrils, suggesting the important role of Na-rich organics in the structural connection between mineral and collagen. Our findings provide the first insights into the hierarchical organization and chemical heterogeneity in human bone in three dimensions at its smallest length scale: the atomic level. We demonstrate that atom probe tomography shows potential for new insights in biomineralization research on bone.

  14. Asynchronous Two-Level Checkpointing Scheme for Large-Scale Adjoints in the Spectral-Element Solver Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schanen, Michel; Marin, Oana; Zhang, Hong

    Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing had been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
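
    For context on the binomial memory checkpointing component mentioned above, the classical result behind Revolve-style schemes (Griewank and Walther) states that s checkpoints and at most r repeated forward sweeps suffice to adjoin C(s + r, r) time steps. The sketch below just evaluates that bound; it is background, not code from the Nek5000 scheme.

```python
# The classical binomial checkpointing bound underlying Revolve-style schemes:
# with s in-memory checkpoints and at most r repeated forward sweeps, one can
# adjoin up to C(s + r, r) time steps. Illustrative context only.
from math import comb

def max_adjoinable_steps(num_checkpoints: int, max_resweeps: int) -> int:
    """Maximum time steps reversible with the given memory/recomputation budget."""
    return comb(num_checkpoints + max_resweeps, max_resweeps)

# e.g. 10 checkpoints and 3 recomputation sweeps already cover 286 steps;
# doubling memory to 20 checkpoints covers 1,771.
for s in (10, 20):
    print(s, "checkpoints ->", max_adjoinable_steps(s, 3), "steps")
```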

  15. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  16. Defining conservation targets on a landscape-scale

    USGS Publications Warehouse

    Benscoter, A.M.; Romañach, Stephanie; Brandt, Laura A.

    2015-01-01

    Conservation planning, the process of deciding how to protect, conserve, enhance and(or) minimize loss of natural and cultural resources, is a fundamental process to achieve conservation success in a time of rapid environmental change. Conservation targets, the measurable expressions of desired resource conditions, are an important tool in biological planning to achieve effective outcomes. Conservation targets provide a focus for planning, design, conservation action, and collaborative monitoring of environmental trends to guide landscape-scale conservation to improve the quality and quantity of key ecological and cultural resources. It is essential to have an iterative and inclusive method to define conservation targets that is replicable and allows for the evaluation of the effectiveness of conservation targets over time. In this document, we describe a process that can be implemented to achieve landscape-scale conservation, which includes defining conservation targets. We also describe what has been accomplished to date (September 2015) through this process for the Peninsular Florida Landscape Conservation Cooperative (PFLCC).

  17. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  18. Large Scale Expansion of Human Umbilical Cord Cells in a Rotating Bed System Bioreactor for Cardiovascular Tissue Engineering Applications

    PubMed Central

    Reichardt, Anne; Polchow, Bianca; Shakibaei, Mehdi; Henrich, Wolfgang; Hetzer, Roland; Lueders, Cora

    2013-01-01

    Widespread use of human umbilical cord cells for cardiovascular tissue engineering requires production of large numbers of well-characterized cells under controlled conditions. In current research projects, the expansion of cells to be used to create a tissue construct is usually performed in static cell culture systems, which are often not satisfactory due to limitations in nutrient and oxygen supply. To overcome these limitations, dynamic cell expansion in bioreactor systems under controllable conditions could be an important tool, providing continuous perfusion for the generation of large numbers of viable pre-conditioned cells in a short time period. For this purpose, cells derived from human umbilical cord arteries were expanded in a rotating bed system bioreactor for up to 9 days. For a comparative study, cells were cultivated under static conditions in standard culture devices. Our results demonstrated that the microenvironment in the perfusion bioreactor was more favorable than that of the standard cell culture flasks. Data suggested that cells in the bioreactor expanded 39-fold (38.7 ± 6.1-fold) in comparison to statically cultured cells (31.8 ± 3.0-fold). Large-scale production of cells in the bioreactor resulted in more than 3 × 10⁸ cells from a single umbilical cord fragment within 9 days. Furthermore, cell doubling time was lower in the bioreactor system and production of extracellular matrix components was higher. With this study, we present an appropriate method to expand human umbilical cord artery derived cells with high cellular proliferation rates in a well-defined bioreactor system under GMP conditions. PMID:23847691

  19. Large-scale sequence and structural comparisons of human naive and antigen-experienced antibody repertoires.

    PubMed

    DeKosky, Brandon J; Lungu, Oana I; Park, Daechan; Johnson, Erik L; Charab, Wissam; Chrysostomou, Constantine; Kuroda, Daisuke; Ellington, Andrew D; Ippolito, Gregory C; Gray, Jeffrey J; Georgiou, George

    2016-05-10

    Elucidating how antigen exposure and selection shape the human antibody repertoire is fundamental to our understanding of B-cell immunity. We sequenced the paired heavy- and light-chain variable regions (VH and VL, respectively) from large populations of single B cells combined with computational modeling of antibody structures to evaluate sequence and structural features of human antibody repertoires at unprecedented depth. Analysis of a dataset comprising 55,000 antibody clusters from CD19(+)CD20(+)CD27(-) IgM-naive B cells, >120,000 antibody clusters from CD19(+)CD20(+)CD27(+) antigen-experienced B cells, and >2,000 RosettaAntibody-predicted structural models across three healthy donors led to a number of key findings: (i) VH and VL gene sequences pair in a combinatorial fashion without detectable pairing restrictions at the population level; (ii) certain VH:VL gene pairs were significantly enriched or depleted in the antigen-experienced repertoire relative to the naive repertoire; (iii) antigen selection increased antibody paratope net charge and solvent-accessible surface area; and (iv) public heavy-chain third complementarity-determining region (CDR-H3) antibodies in the antigen-experienced repertoire showed signs of convergent paired light-chain genetic signatures, including shared light-chain third complementarity-determining region (CDR-L3) amino acid sequences and/or Vκ,λ-Jκ,λ genes. The data reported here address several longstanding questions regarding antibody repertoire selection and development and provide a benchmark for future repertoire-scale analyses of antibody responses to vaccination and disease.
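
    Finding (ii) above, enrichment or depletion of particular VH:VL gene pairs, is the kind of question a 2x2 contingency test answers. The sketch below scores one hypothetical pair with Fisher's exact test; the counts are invented, and the paper's actual statistics may differ.

```python
# A sketch of how VH:VL pairing enrichment can be tested: a 2x2 contingency
# table of a given VH:VL gene pair vs. all other pairs in the naive and
# antigen-experienced repertoires, scored with Fisher's exact test.
# The counts are invented for illustration.
from scipy.stats import fisher_exact

pair_naive, other_naive = 120, 54_880               # hypothetical naive counts
pair_experienced, other_experienced = 510, 119_490  # hypothetical experienced counts

table = [[pair_experienced, other_experienced],
         [pair_naive, other_naive]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.2e}")
```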

  20. Coexistence between wildlife and humans at fine spatial scales.

    PubMed

    Carter, Neil H; Shrestha, Binoj K; Karki, Jhamak B; Pradhan, Narendra Man Babu; Liu, Jianguo

    2012-09-18

    Many wildlife species face imminent extinction because of human impacts, and therefore, a prevailing belief is that some wildlife species, particularly large carnivores and ungulates, cannot coexist with people at fine spatial scales (i.e., cannot regularly use the exact same point locations). This belief provides rationale for various conservation programs, such as resettling human communities outside protected areas. However, quantitative information on the capacity and mechanisms for wildlife to coexist with humans at fine spatial scales is scarce. Such information is vital, because the world is becoming increasingly crowded. Here, we provide empirical information about the capacity and mechanisms for tigers (a globally endangered species) to coexist with humans at fine spatial scales inside and outside Nepal's Chitwan National Park, a flagship protected area for imperiled wildlife. Information obtained from field cameras in 2010 and 2011 indicated that human presence (i.e., people on foot and vehicles) was ubiquitous and abundant throughout the study site; however, tiger density was also high. Surprisingly, even at a fine spatial scale (i.e., camera locations), tigers spatially overlapped with people on foot and vehicles in both years. However, in both years, tigers offset their temporal activity patterns to be much less active during the day when human activity peaked. In addition to temporal displacement, tiger-human coexistence was likely enhanced by abundant tiger prey and low levels of tiger poaching. Incorporating fine-scale spatial and temporal activity patterns into conservation plans can help address a major global challenge: meeting human needs while sustaining wildlife.

  1. Coexistence between wildlife and humans at fine spatial scales

    PubMed Central

    Carter, Neil H.; Shrestha, Binoj K.; Karki, Jhamak B.; Pradhan, Narendra Man Babu; Liu, Jianguo

    2012-01-01

    Many wildlife species face imminent extinction because of human impacts, and therefore, a prevailing belief is that some wildlife species, particularly large carnivores and ungulates, cannot coexist with people at fine spatial scales (i.e., cannot regularly use the exact same point locations). This belief provides rationale for various conservation programs, such as resettling human communities outside protected areas. However, quantitative information on the capacity and mechanisms for wildlife to coexist with humans at fine spatial scales is scarce. Such information is vital, because the world is becoming increasingly crowded. Here, we provide empirical information about the capacity and mechanisms for tigers (a globally endangered species) to coexist with humans at fine spatial scales inside and outside Nepal’s Chitwan National Park, a flagship protected area for imperiled wildlife. Information obtained from field cameras in 2010 and 2011 indicated that human presence (i.e., people on foot and vehicles) was ubiquitous and abundant throughout the study site; however, tiger density was also high. Surprisingly, even at a fine spatial scale (i.e., camera locations), tigers spatially overlapped with people on foot and vehicles in both years. However, in both years, tigers offset their temporal activity patterns to be much less active during the day when human activity peaked. In addition to temporal displacement, tiger–human coexistence was likely enhanced by abundant tiger prey and low levels of tiger poaching. Incorporating fine-scale spatial and temporal activity patterns into conservation plans can help address a major global challenge—meeting human needs while sustaining wildlife. PMID:22949642

  2. Automation of large scale transient protein expression in mammalian cells

    PubMed Central

    Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Yvonne Jones, E.; Radu Aricescu, A.

    2011-01-01

    Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making man-power a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those produced manually, with the added benefit of reproducibility, regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins, and often require rapid characterisation of a series of modified constructs. This automated method for large scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074

  3. Large-scale network integration in the human brain tracks temporal fluctuations in memory encoding performance.

    PubMed

    Keerativittayayut, Ruedeerat; Aoki, Ryuta; Sarabi, Mitra Taghizadeh; Jimura, Koji; Nakahara, Kiyoshi

    2018-06-18

    Although activation/deactivation of specific brain regions has been shown to be predictive of successful memory encoding, the relationship between time-varying large-scale brain networks and fluctuations of memory encoding performance remains unclear. Here we investigated time-varying functional connectivity patterns across the human brain in periods of 30-40 s, which have recently been implicated in various cognitive functions. During functional magnetic resonance imaging, participants performed a memory encoding task, and their performance was assessed with a subsequent surprise memory test. A graph analysis of functional connectivity patterns revealed that increased integration of the subcortical, default-mode, salience, and visual subnetworks with other subnetworks is a hallmark of successful memory encoding. Moreover, multivariate analysis using the graph metrics of integration reliably classified brain network states into periods of high (vs. low) memory encoding performance. Our findings suggest that a diverse set of brain systems dynamically interact to support successful memory encoding. © 2018, Keerativittayayut et al.
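
    As an illustration of what a graph metric of "integration" can look like, the sketch below computes the participation coefficient, which is high when a node's edges are spread across many subnetworks. The toy adjacency matrix and module labels are invented, and the paper's exact metrics may differ.

```python
# A sketch of one standard graph metric of network "integration": the
# participation coefficient P_i = 1 - sum_s (k_is / k_i)^2, which is high when
# node i's edges are spread across many subnetworks. Toy data only.
import numpy as np

# adjacency matrix of a toy 6-node functional network (symmetric, unweighted)
A = np.array([
    [0, 1, 1, 0, 0, 1],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [1, 0, 0, 1, 1, 0],
])
modules = np.array([0, 0, 0, 1, 1, 1])  # two hypothetical subnetworks

degrees = A.sum(axis=1)
participation = np.zeros(len(A))
for s in np.unique(modules):
    k_is = A[:, modules == s].sum(axis=1)   # edges from node i into module s
    participation += (k_is / degrees) ** 2
participation = 1.0 - participation

print(np.round(participation, 2))  # higher values = more cross-module integration
```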

  4. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming

    PubMed Central

    Moreau, Thomas; Evans, Amanda L.; Vasquez, Louella; Tijssen, Marloes R.; Yan, Ying; Trotter, Matthew W.; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M.; Pask, Dean C.; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H.; Pedersen, Roger A.; Ghevaert, Cedric

    2016-01-01

    The production of megakaryocytes (MKs)—the precursors of blood platelets—from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10⁵ mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology. PMID:27052461

  5. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    PubMed

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10(5) mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  6. Numerical dissipation vs. subgrid-scale modelling for large eddy simulation

    NASA Astrophysics Data System (ADS)

    Dairay, Thibault; Lamballais, Eric; Laizet, Sylvain; Vassilicos, John Christos

    2017-05-01

    This study presents an alternative way to perform large eddy simulation, based on a targeted numerical dissipation introduced by the discretization of the viscous term. It is shown that this regularisation technique is equivalent to the use of spectral vanishing viscosity. The flexibility of the method ensures high-order accuracy while controlling the level and spectral features of this purely numerical viscosity. A Pao-like spectral closure based on physical arguments is used to scale this numerical viscosity a priori. It is shown that this way of approaching large eddy simulation is more efficient and accurate than the use of the very popular Smagorinsky model in both its standard and dynamic versions. The main strength of being able to correctly calibrate numerical dissipation is the possibility to regularise the solution at the mesh scale. Thanks to this property, it is shown that the solution can be seen as numerically converged. Conversely, the two versions of the Smagorinsky model are found unable to ensure regularisation while showing a strong sensitivity to numerical errors. The originality of the present approach is that it can be viewed as implicit large eddy simulation, in the sense that the numerical error is the source of artificial dissipation, but also as explicit subgrid-scale modelling, because of the equivalence with spectral viscosity prescribed on a physical basis.

  7. Comparison of human observer and algorithmic target detection in nonurban forward-looking infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.

    2005-07-01

    We have performed an experiment that compares the performance of human observers with that of a robust algorithm for the detection of targets in difficult, nonurban forward-looking infrared imagery. Our purpose was to benchmark the comparison and document performance differences for future algorithm improvement. The scale-insensitive detection algorithm, used as a benchmark by the Night Vision Electronic Sensors Directorate for algorithm evaluation, employed a combination of contrastlike features to locate targets. Detection receiver operating characteristic curves and observer-confidence analyses were used to compare human and algorithmic responses and to gain insight into differences. The test database contained ground targets, in natural clutter, whose detectability, as judged by human observers, ranged from easy to very difficult. In general, as compared with human observers, the algorithm detected most of the same targets, but correlated confidence with correct detections poorly and produced many more false alarms at any useful level of performance. Though characterizing human performance was not the intent of this study, results suggest that previous observational experience was not a strong predictor of human performance, and that combining individual human observations by majority vote significantly reduced false-alarm rates.
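
    The detection ROC analysis named above can be sketched by sweeping a confidence threshold and tracing probability of detection against false-alarm rate. The scores below are synthetic stand-ins for detector confidences on targets and clutter.

```python
# A minimal sketch of a detection ROC analysis: sweep a threshold over detector
# confidences and trace true-positive rate against false alarms. Scores and
# labels are synthetic; the benchmark imagery is obviously not shown.
import numpy as np

rng = np.random.default_rng(0)
scores_target = rng.normal(1.0, 1.0, 200)    # detector confidence on targets
scores_clutter = rng.normal(0.0, 1.0, 2000)  # detector confidence on clutter

scores = np.concatenate([scores_target, scores_clutter])
labels = np.concatenate([np.ones(200), np.zeros(2000)])

for thr in (0.0, 0.5, 1.0, 1.5):
    detected = scores >= thr
    tpr = detected[labels == 1].mean()   # probability of detection
    far = detected[labels == 0].mean()   # false-alarm rate per sample
    print(f"threshold {thr:.1f}: Pd = {tpr:.2f}, FAR = {far:.2f}")
```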

  8. Exploring the cellular basis of human disease through a large-scale mapping of deleterious genes to cell types.

    PubMed

    Cornish, Alex J; Filippis, Ioannis; David, Alessia; Sternberg, Michael J E

    2015-09-01

    Each cell type found within the human body performs a diverse and unique set of functions, the disruption of which can lead to disease. However, there currently exists no systematic mapping between cell types and the diseases they can cause. In this study, we integrate protein-protein interaction data with high-quality cell-type-specific gene expression data from the FANTOM5 project to build the largest collection of cell-type-specific interactomes created to date. We develop a novel method, called gene set compactness (GSC), that contrasts the relative positions of disease-associated genes across 73 cell-type-specific interactomes to map genes associated with 196 diseases to the cell types they affect. We conduct text-mining of the PubMed database to produce an independent resource of disease-associated cell types, which we use to validate our method. The GSC method successfully identifies known disease-cell-type associations, as well as highlighting associations that warrant further study. This includes mast cells and multiple sclerosis, a cell population currently being targeted in a multiple sclerosis phase 2 clinical trial. Furthermore, we build a cell-type-based diseasome using the cell types identified as manifesting each disease, offering insight into diseases linked through etiology. The data set produced in this study represents the first large-scale mapping of diseases to the cell types in which they are manifested and will therefore be useful in the study of disease systems. Overall, we demonstrate that our approach links disease-associated genes to the phenotypes they produce, a key goal within systems medicine.
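
    The gene set compactness (GSC) idea above can be approximated in miniature: score how tightly a disease gene set clusters in each cell-type-specific interactome, for example by mean pairwise shortest-path distance, and rank cell types by that score. The toy graphs and gene names below are stand-ins, not FANTOM5-derived networks, and the published GSC statistic may differ in detail.

```python
# A sketch of the intuition behind gene set compactness: disease genes that sit
# close together in a cell-type-specific interactome implicate that cell type.
# The graphs and gene names are toy stand-ins.
from itertools import combinations
import networkx as nx

def compactness(graph: nx.Graph, genes: list) -> float:
    """Mean shortest-path distance over all pairs of disease genes (lower = tighter)."""
    pairs = list(combinations(genes, 2))
    dists = [nx.shortest_path_length(graph, u, v) for u, v in pairs]
    return sum(dists) / len(dists)

# two hypothetical cell-type interactomes over the same gene universe
g_mast = nx.Graph([("D1", "D2"), ("D2", "D3"), ("D3", "X"), ("X", "Y")])
g_tcell = nx.Graph([("D1", "X"), ("X", "D2"), ("D2", "Y"), ("Y", "D3")])

disease_genes = ["D1", "D2", "D3"]
for name, g in [("mast cell", g_mast), ("T cell", g_tcell)]:
    print(f"{name}: mean pairwise distance {compactness(g, disease_genes):.2f}")
```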

  9. Transition from large-scale to small-scale dynamo.

    PubMed

    Ponty, Y; Plunian, F

    2011-04-15

    The dynamo equations are solved numerically with a helical forcing corresponding to the Roberts flow. In the fully turbulent regime the flow behaves as a Roberts flow on long time scales, plus turbulent fluctuations at short time scales. The dynamo onset is controlled by the long time scales of the flow, in agreement with the former Karlsruhe experimental results. The dynamo mechanism is governed by a generalized α effect, which includes both the usual α effect and turbulent diffusion, plus all higher order effects. Beyond the onset we find that this generalized α effect scales as O(Rm(-1)), suggesting the takeover of small-scale dynamo action. This is confirmed by simulations in which dynamo occurs even if the large-scale field is artificially suppressed.

  10. Fast large-scale object retrieval with binary quantization

    NASA Astrophysics Data System (ADS)

    Zhou, Shifu; Zeng, Dan; Shen, Wei; Zhang, Zhijiang; Tian, Qi

    2015-11-01

    The objective of large-scale object retrieval systems is to search for images that contain the target object in an image database. Where state-of-the-art approaches rely on global image representations to conduct searches, we consider many boxes per image as candidates to search locally in a picture. In this paper, a feature quantization algorithm called binary quantization is proposed. In binary quantization, a scale-invariant feature transform (SIFT) feature is quantized into a descriptive and discriminative bit-vector, which allows itself to adapt to the classic inverted file structure for box indexing. The inverted file, which stores the bit-vector and box ID where the SIFT feature is located inside, is compact and can be loaded into the main memory for efficient box indexing. We evaluate our approach on available object retrieval datasets. Experimental results demonstrate that the proposed approach is fast and achieves excellent search quality. Therefore, the proposed approach is an improvement over state-of-the-art approaches for object retrieval.
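
    A minimal sketch of the indexing scheme described above: quantize each SIFT descriptor into a bit-vector and file (image, box) pairs under that code in an inverted file. The per-dimension median thresholding rule and the toy data are assumptions, not the paper's exact quantizer.

```python
# A sketch of binary quantization plus an inverted file: threshold each SIFT
# dimension against a per-dimension threshold to get a 128-bit code, then index
# (image_id, box_id) under that code. Thresholding rule and data are assumed.
from collections import defaultdict
import numpy as np

rng = np.random.default_rng(1)
medians = rng.random(128)              # per-dimension thresholds (learned offline)

def binarize(sift_descriptor: np.ndarray) -> bytes:
    """Quantize a 128-d SIFT descriptor into a hashable 128-bit code."""
    bits = (sift_descriptor > medians).astype(np.uint8)
    return np.packbits(bits).tobytes()  # 16-byte key

inverted_file = defaultdict(list)       # code -> [(image_id, box_id), ...]
for image_id in range(100):
    for box_id in range(5):
        feature = rng.random(128)       # stand-in for a real SIFT feature
        inverted_file[binarize(feature)].append((image_id, box_id))

# Exact-code lookup; real systems also probe neighboring codes, since exact
# 128-bit matches are sparse.
query = rng.random(128)
print("candidate boxes:", inverted_file.get(binarize(query), [])[:5])
```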

  11. Unravelling connections between river flow and large-scale climate: experiences from Europe

    NASA Astrophysics Data System (ADS)

    Hannah, D. M.; Kingston, D. G.; Lavers, D.; Stagge, J. H.; Tallaksen, L. M.

    2016-12-01

    The United Nations has identified better knowledge of large-scale water cycle processes as essential for socio-economic development and global water-food-energy security. In this context, and given the ever-growing concerns about climate change/variability and human impacts on hydrology, there is an urgent research need: (a) to quantify space-time variability in regional river flow, and (b) to improve hydroclimatological understanding of climate-flow connections as a basis for identifying current and future water-related issues. In this paper, we draw together studies undertaken at the pan-European scale: (1) to evaluate current methods for assessing space-time dynamics for different streamflow metrics (annual regimes, low flows and high flows) and for linking flow variability to atmospheric drivers (circulation indices, air masses, gridded climate fields and vapour flux); and (2) to propose a plan for future research connecting streamflow and atmospheric conditions in Europe and elsewhere. We believe this research makes a useful, unique contribution to the literature through a systematic inter-comparison of different streamflow metrics and atmospheric descriptors. In our findings, we highlight the need to consider appropriate atmospheric descriptors (dependent on the target flow metric and region of interest) and to develop analytical techniques that best characterise connections in the ocean-atmosphere-land surface process chain. We also highlight the need to consider not only atmospheric interactions, but also the role of river basin-scale terrestrial hydrological processes in modifying the climate signal response of river flows.

  12. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase-sized portable test stands assembled for demonstration of hybrids, which show the safety of hybrid rockets to audiences. These small show motors and small laboratory-scale motors can give comparative burn rate data for development of different fuel/oxidizer combinations; however, the questions that are always asked when hybrids are mentioned for large scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large scale motor testing is required to verify the hybrid motor at its true size. The necessity to conduct large-scale hybrid rocket motor tests to validate the burn rate from the small motors to application size has been documented in several places. Comparison of small scale hybrid data to that of larger scale data indicates that the fuel burn rate goes down with increasing port size, even with the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this is occurring would make a great paper or study or thesis, it is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, the larger port sizes reduce the interaction (radiation, mixing and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability and scaling concepts that went into the development of those large motors.

  13. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    PubMed

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  14. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels), along with large scale renewable energy targets (100GW solar, 60GW wind, and 10GW biomass energy by 2022), in the INDCs submitted under the Paris agreement. But large scale integration of renewable energy is a complex process which faces a number of challenges, such as capital intensiveness, matching intermittent generation to load with limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios: a base case scenario (no RE addition), an INDC scenario (100GW solar, 60GW wind, 10GW biomass) and a low-RE scenario (50GW solar, 30GW wind), created to analyze the implications of large scale integration of variable renewable energy. The results provide insights into the trade-offs and investment decisions involved in achieving mitigation targets. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.

  15. Water limited agriculture in Africa: Climate change sensitivity of large scale land investments

    NASA Astrophysics Data System (ADS)

    Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.

    2015-12-01

    The past few decades have seen unprecedented changes in the global agricultural system, with a dramatic increase in rates of food production fueled by escalating demand for food calories as a result of demographic growth, dietary changes and, more recently, new bioenergy policies. Food prices have become consistently higher and increasingly volatile, with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, roughly 38 million hectares have been acquired worldwide by large scale investors to date, 16 million of which are in Africa. More than 85% of large scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but also by the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate the irrigation water requirements of the various crops planted in the acquired land as an indicator of the pressure land investors are likely to place on the ("blue") water resources of target regions in Africa, and evaluate the sensitivity to climate change scenarios.

  16. Generation of Large-Scale Magnetic Fields by Small-Scale Dynamo in Shear Flows.

    PubMed

    Squire, J; Bhattacharjee, A

    2015-10-23

    We propose a new mechanism for a turbulent mean-field dynamo in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of a large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. Given the inevitable existence of nonhelical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help explain the generation of large-scale magnetic fields across a wide range of astrophysical objects.

  17. Large-scale functional models of visual cortex for remote sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brumby, Steven P; Kenyon, Garrett; Rasmussen, Craig E

    Neuroscience has revealed many properties of neurons and of the functional organization of visual cortex that are believed to be essential to human vision, but are missing in standard artificial neural networks. Equally important may be the sheer scale of visual cortex, requiring ~1 petaflop of computation. In a year, the retina delivers ~1 petapixel to the brain, leading to massively large opportunities for learning at many levels of the cortical system. We describe work at Los Alamos National Laboratory (LANL) to develop large-scale functional models of visual cortex on LANL's Roadrunner petaflop supercomputer. An initial run of a simple region V1 code achieved 1.144 petaflops during trials at the IBM facility in Poughkeepsie, NY (June 2008). Here, we present criteria for assessing when a set of learned local representations is 'complete', along with general criteria for assessing computer vision models based on their projected scaling behavior. Finally, we extend one class of biologically-inspired learning models to problems of remote sensing imagery.

  18. Cooperation, collective action, and the archeology of large-scale societies.

    PubMed

    Carballo, David M; Feinman, Gary M

    2016-11-01

    Archeologists investigating the emergence of large-scale societies in the past have renewed interest in examining the dynamics of cooperation as a means of understanding societal change and organizational variability within human groups over time. Unlike earlier approaches to these issues, which used models designated voluntaristic or managerial, contemporary research articulates more explicitly with frameworks for cooperation and collective action used in other fields, thereby facilitating empirical testing through better definition of the costs, benefits, and social mechanisms associated with success or failure in coordinated group action. Current scholarship is nevertheless bifurcated along lines of epistemology and scale, which is understandable but problematic for forging a broader, more transdisciplinary field of cooperation studies. Here, we point to some areas of potential overlap by reviewing archeological research that places the dynamics of social cooperation and competition in the foreground of the emergence of large-scale societies, which we define as those having larger populations, greater concentrations of political power, and higher degrees of social inequality. We focus on key issues involving the communal-resource management of subsistence and other economic goods, as well as the revenue flows that undergird political institutions. Drawing on archeological cases from across the globe, with greater detail from our area of expertise in Mesoamerica, we offer suggestions for strengthening analytical methods and generating more transdisciplinary research programs that address human societies across scalar and temporal spectra. © 2016 Wiley Periodicals, Inc.

  19. Large Scale Traffic Simulations

    DOT National Transportation Integrated Search

    1997-01-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computation speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between t...

  20. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is propagated down the chain. Second, the amount of reprocessing often needs to be minimized in order to optimize the usage of limited resources. To meet these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
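
    The first challenge named above, propagating reprocessing after an asynchronous human correction, is essentially dependency invalidation. The sketch below walks a hypothetical stage-dependency graph to collect exactly the outputs that need recomputation; the stage names are invented.

```python
# A sketch of the dependency-tracking problem described above: when a human
# correction lands asynchronously, only the outputs downstream of the corrected
# record should be recomputed. The pipeline stages and records are hypothetical.
from collections import defaultdict, deque

# stage dependency graph: corrected output -> downstream outputs derived from it
downstream = defaultdict(list, {
    "extract/doc42": ["transform/doc42"],
    "transform/doc42": ["aggregate/topic7"],
    "aggregate/topic7": ["report/weekly"],
})

def invalidated_by(correction: str) -> list:
    """Breadth-first walk collecting every output that needs recomputation."""
    seen, queue, order = {correction}, deque([correction]), []
    while queue:
        node = queue.popleft()
        for child in downstream[node]:
            if child not in seen:
                seen.add(child)
                queue.append(child)
                order.append(child)
    return order

# A human analyst corrects one extraction; only three outputs are reprocessed.
print(invalidated_by("extract/doc42"))
# ['transform/doc42', 'aggregate/topic7', 'report/weekly']
```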

  1. Modular cell-internalizing aptamer nanostructure enables targeted delivery of large functional RNAs in cancer cell lines.

    PubMed

    Porciani, David; Cardwell, Leah N; Tawiah, Kwaku D; Alam, Khalid K; Lange, Margaret J; Daniels, Mark A; Burke, Donald H

    2018-06-11

    Large RNAs and ribonucleoprotein complexes have powerful therapeutic potential, but effective cell-targeted delivery tools are limited. Aptamers that internalize into target cells can deliver siRNAs (<15 kDa, 19-21 nt/strand). We demonstrate a modular nanostructure for cellular delivery of large, functional RNA payloads (50-80 kDa, 175-250 nt) by aptamers that recognize multiple human B cell cancer lines and transferrin receptor-expressing cells. Fluorogenic RNA reporter payloads enable accelerated testing of platform designs and rapid evaluation of assembly and internalization. Modularity is demonstrated by swapping in different targeting and payload aptamers. Both modules internalize into leukemic B cell lines and remained colocalized within endosomes. Fluorescence from internalized RNA persists for ≥2 h, suggesting a sizable window for aptamer payloads to exert influence upon targeted cells. This demonstration of aptamer-mediated, cell-internalizing delivery of large RNAs with retention of functional structure raises the possibility of manipulating endosomes and cells by delivering large aptamers and regulatory RNAs.

  2. Generation of large-scale magnetic fields by small-scale dynamo in shear flows

    DOE PAGES

    Squire, J.; Bhattacharjee, A.

    2015-10-20

    We propose a new mechanism for a turbulent mean-field dynamo in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of a large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. Furthermore, given the inevitable existence of nonhelical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help explain the generation of large-scale magnetic fields across a wide range of astrophysical objects.
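
    For orientation, the mechanism summarized above sits within standard mean-field electrodynamics; a textbook-level form (not this paper's derivation) writes the mean induction equation and the turbulent electromotive force as

        \partial_t \bar{B} = \nabla \times (\bar{V} \times \bar{B}) + \nabla \times \mathcal{E} + \eta \nabla^2 \bar{B},
        \qquad
        \mathcal{E}_i = \alpha_{ij} \bar{B}_j - \eta_{ij} (\nabla \times \bar{B})_j ,

    where the "shear-current"-type dynamo described in the abstract corresponds to an off-diagonal component of the turbulent resistivity tensor \eta_{ij} that, in the presence of the velocity shear, feeds one mean-field component back into the other, with no net helicity (\alpha_{ij}) required.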

  3. Target Earth: evidence for large-scale impact events.

    PubMed

    Grieve, R A

    1997-05-30

    Unlike the Moon, the Earth has retained only a small sample of its population of impact structures. Currently, over 150 impact structures are known, and there are 15 instances of impact known from the stratigraphic record, some of which have been correlated with known impact structures. The terrestrial record is biased toward younger and larger structures on the stable cratonic areas of the crust because of the effects of constant surface renewal on the Earth. The high level of endogenic geologic activity also affects the morphology and morphometry of terrestrial impact structures, although the same general morphologic forms that occur on the other terrestrial planets can be observed. A terrestrial cratering rate of 5.6 ± 2.8 × 10⁻¹⁵ km⁻² a⁻¹ for structures ≥ 20 km in diameter can be derived, which is equivalent to that estimated from astronomical observations. Although there are claims to the contrary, the overall uncertainties in the ages of structures in the impact record preclude the determination of any periodicity in the record. Small terrestrial impact structures are the result of the impact of iron or stony-iron bodies, with weaker stony and icy bodies being crushed on atmospheric passage. At larger structures (>1 km), trace-element geochemistry suggests that approximately 50% of the impact flux is from chondritic bodies, but this may be a function of the signal:noise ratio of the meteoritic tracer elements. Evidence for impact in the stratigraphic record is both chemical and physical. Although currently small in number, there are indications that more evidence will be forthcoming with time. Such searches for evidence of impact have been stimulated by the chemical and physical evidence of the involvement of impact at the K/T boundary. There will, however, be problems in differentiating geochemically the signal of even relatively large impact events from the background cosmic flux of everyday meteoritic debris. Even with these biases and
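
    As a worked illustration of how such a rate is applied (round numbers assumed here, not figures from the paper): the expected number N of structures ≥ 20 km produced on a stable cratonic area A over a time span t is

        N = \Phi A t = (5.6 \times 10^{-15}\ \mathrm{km^{-2}\,a^{-1}}) \times (10^{7}\ \mathrm{km^{2}}) \times (10^{8}\ \mathrm{a}) \approx 5.6,

    i.e., only a few preserved large structures per craton per hundred million years, consistent with the sparseness of the terrestrial record described above.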

  4. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    NASA Astrophysics Data System (ADS)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowdsourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% on half of the test data, and about 78% on 19/20 of it, when tested on ~7,500,000 video frames. By the end of 2016, the dataset is expected to exceed 1 billion annotated video frames.
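
    A minimal sketch of the feature-plus-boosting stage described above (a toy stand-in for the paper's 1,000,000-feature pipeline, with placeholder data and scikit-learn instead of the authors' distributed trainer):

        # Two-rectangle Haar-like features via an integral image, fed to AdaBoost.
        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier

        def integral_image(img):
            # Zero-padded integral image: ii[y, x] = sum of img[:y, :x].
            ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
            ii[1:, 1:] = img.cumsum(0).cumsum(1)
            return ii

        def rect_sum(ii, y0, x0, y1, x1):
            # Sum of img[y0:y1, x0:x1] in O(1) from four integral-image lookups.
            return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

        def haar_features(win):
            # One horizontal and one vertical two-rectangle feature per window.
            ii = integral_image(win)
            h, w = win.shape
            horiz = rect_sum(ii, 0, 0, h, w // 2) - rect_sum(ii, 0, w // 2, h, w)
            vert = rect_sum(ii, 0, 0, h // 2, w) - rect_sum(ii, h // 2, 0, h, w)
            return [horiz, vert]

        rng = np.random.default_rng(0)
        X = np.array([haar_features(rng.random((24, 24))) for _ in range(200)])
        y = rng.integers(0, 2, 200)   # placeholder vehicle / non-vehicle labels

        # The default weak learner is a depth-1 decision stump, as in Viola-Jones.
        clf = AdaBoostClassifier(n_estimators=50).fit(X, y)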

  5. FDTD method for laser absorption in metals for large scale problems.

    PubMed

    Deng, Chun; Ki, Hyungson

    2013-10-21

    The FDTD method has been successfully used for many electromagnetic problems, but its application to laser material processing has been limited because even a several-millimeter domain requires a prohibitively large number of grid points. In this article, we present a novel FDTD method for simulating large-scale laser beam absorption problems, especially for metals, by enlarging the laser wavelength while maintaining the material's reflection characteristics. For validation purposes, the proposed method has been tested with in-house FDTD codes to simulate p-, s-, and circularly polarized 1.06 μm irradiation on Fe and Sn targets, and the simulation results are in good agreement with theoretical predictions.

  6. Automatic three-dimensional measurement of large-scale structure based on vision metrology.

    PubMed

    Zhu, Zhaokun; Guan, Banglei; Zhang, Xiaohu; Li, Daokui; Yu, Qifeng

    2014-01-01

    All relevant key techniques involved in photogrammetric vision metrology for fully automatic 3D measurement of large-scale structures are studied. A new kind of coded target consisting of circular retroreflective discs is designed, and corresponding detection and recognition algorithms based on blob detection and clustering are presented. A three-stage strategy starting with view clustering is then proposed to achieve automatic network orientation. For the matching of non-coded targets, the concept of a matching path is proposed, and matches for each non-coded target are found by determining the optimal matching path, based on a novel voting strategy, among all possible ones. Experiments on the fixed keel of an airship were conducted to verify the effectiveness and measuring accuracy of the proposed methods.
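
    A minimal sketch of the disc-detection stage (illustrative OpenCV parameters, not the authors' algorithm):

        import cv2

        gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder path

        params = cv2.SimpleBlobDetector_Params()
        params.blobColor = 255            # retroreflective discs appear bright
        params.filterByArea = True
        params.minArea = 20
        params.filterByCircularity = True
        params.minCircularity = 0.8       # keep near-circular blobs only
        detector = cv2.SimpleBlobDetector_create(params)

        keypoints = detector.detect(gray)
        centers = [kp.pt for kp in keypoints]
        # Clustering these centers into spatial groups then yields candidate
        # coded targets for the recognition step described in the abstract.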

  7. Toward Increasing Fairness in Score Scale Calibrations Employed in International Large-Scale Assessments

    ERIC Educational Resources Information Center

    Oliveri, Maria Elena; von Davier, Matthias

    2014-01-01

    In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…

  8. Large-scale sequence and structural comparisons of human naive and antigen-experienced antibody repertoires

    PubMed Central

    DeKosky, Brandon J.; Lungu, Oana I.; Park, Daechan; Johnson, Erik L.; Charab, Wissam; Chrysostomou, Constantine; Kuroda, Daisuke; Ellington, Andrew D.; Ippolito, Gregory C.; Gray, Jeffrey J.; Georgiou, George

    2016-01-01

    Elucidating how antigen exposure and selection shape the human antibody repertoire is fundamental to our understanding of B-cell immunity. We sequenced the paired heavy- and light-chain variable regions (VH and VL, respectively) from large populations of single B cells combined with computational modeling of antibody structures to evaluate sequence and structural features of human antibody repertoires at unprecedented depth. Analysis of a dataset comprising 55,000 antibody clusters from CD19+CD20+CD27− IgM-naive B cells, >120,000 antibody clusters from CD19+CD20+CD27+ antigen–experienced B cells, and >2,000 RosettaAntibody-predicted structural models across three healthy donors led to a number of key findings: (i) VH and VL gene sequences pair in a combinatorial fashion without detectable pairing restrictions at the population level; (ii) certain VH:VL gene pairs were significantly enriched or depleted in the antigen-experienced repertoire relative to the naive repertoire; (iii) antigen selection increased antibody paratope net charge and solvent-accessible surface area; and (iv) public heavy-chain third complementarity-determining region (CDR-H3) antibodies in the antigen-experienced repertoire showed signs of convergent paired light-chain genetic signatures, including shared light-chain third complementarity-determining region (CDR-L3) amino acid sequences and/or Vκ,λ–Jκ,λ genes. The data reported here address several longstanding questions regarding antibody repertoire selection and development and provide a benchmark for future repertoire-scale analyses of antibody responses to vaccination and disease. PMID:27114511

  9. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  10. Screening of missing proteins in the human liver proteome by improved MRM-approach-based targeted proteomics.

    PubMed

    Chen, Chen; Liu, Xiaohui; Zheng, Weimin; Zhang, Lei; Yao, Jun; Yang, Pengyuan

    2014-04-04

    To completely annotate the human genome, the task of identifying and characterizing proteins that currently lack mass spectrometry (MS) evidence is inevitable and urgent. In this study, as the first effort to screen missing proteins at large scale, we developed an approach based on SDS-PAGE followed by liquid chromatography-multiple reaction monitoring (LC-MRM) for screening those missing proteins with only a single peptide hit in the previous liver proteome data set. Proteins extracted from normal human liver were separated by SDS-PAGE and digested in gel slices, and the resulting digests were then subjected to LC-scheduled-MRM analysis. The MRM assays were developed using synthesized crude peptides for the target peptides. In total, the expression of 57 target proteins was confirmed from 185 MRM assays in normal human liver tissues. Among the 57 confirmed one-hit wonders, 50 proteins belong to the minimally redundant set in the PeptideAtlas database, and 7 proteins, involved in various biological processes, previously had no MS-based evidence at all. We conclude that our SDS-PAGE-MRM workflow can be a powerful approach to screen missing or poorly characterized proteins in different samples and to provide their quantity if detected. The MRM raw data have been uploaded to ISB/SRM Atlas/PASSEL (PXD000648).

  11. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems, such as power networks, communication networks, and economic or ecological systems, were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class of systems are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches for dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed are somewhat different from those reviewed elsewhere. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  12. Large-Scale 3D Printing: The Way Forward

    NASA Astrophysics Data System (ADS)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence, and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e., the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  13. Novel method to construct large-scale design space in lubrication process utilizing Bayesian estimation based on a small-scale design-of-experiment and small sets of large-scale manufacturing data.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-12-01

    A large-scale design space was constructed using a Bayesian estimation method with a small-scale design of experiments (DoE) and small sets of large-scale manufacturing data without enforcing a large-scale DoE. The small-scale DoE was conducted using various Froude numbers (X1) and blending times (X2) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y1), tablet hardness (Y2), and dissolution rate (Y3) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. The constant Froude number was applied as a scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on a large scale by Bayesian estimation using the large-scale results. Large-scale experiments under three additional sets of conditions showed that the corrected design space was more reliable than that on the small scale, even if there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale.
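
    The correction step can be pictured as a precision-weighted (conjugate normal) update at each design point; the sketch below is a simplified stand-in for the paper's Bayesian estimation, with invented numbers:

        import numpy as np

        # Prior from the small-scale DoE at one design point: mean, variance.
        mu_prior, var_prior = 78.0, 4.0        # e.g., tablet hardness, illustrative
        # Observations from the few large-scale runs at the same condition.
        y_large = np.array([74.5, 75.2, 74.9])
        var_obs = 1.0                          # assumed measurement variance

        # Conjugate normal-normal update (precision-weighted average).
        prec_post = 1.0 / var_prior + len(y_large) / var_obs
        mu_post = (mu_prior / var_prior + y_large.sum() / var_obs) / prec_post
        var_post = 1.0 / prec_post
        # Sweeping this update over the whole response-surface grid yields the
        # corrected large-scale design space with its updated reliability.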

  14. Large-scale production of lipoplexes with long shelf-life.

    PubMed

    Clement, Jule; Kiefer, Karin; Kimpfler, Andrea; Garidel, Patrick; Peschka-Süss, Regine

    2005-01-01

    The instability of lipoplex formulations is a major obstacle to overcome before their commercial application in gene therapy. In this study, a continuous mixing technique for the large-scale preparation of lipoplexes, followed by lyophilisation for increased stability and shelf-life, has been developed. Lipoplexes were analysed for transfection efficiency and cytotoxicity in human aorta smooth muscle cells (HASMC) and a rat smooth muscle cell line (A-10 SMC). Homogeneity of lipid/DNA products was investigated by photon correlation spectroscopy (PCS) and cryo-transmission electron microscopy (cryo-TEM). Studies were undertaken with DAC-30, a composition of 3β-[N-(N,N'-dimethylaminoethane)-carbamoyl]-cholesterol (DAC-Chol) and dioleylphosphatidylethanolamine (DOPE), and a green fluorescent protein (GFP)-expressing marker plasmid. The continuous mixing technique was compared to the small-scale preparation of lipoplexes by pipetting. Individual steps of the continuous mixing process were evaluated in order to optimise the manufacturing technique: lipid/plasmid ratio, composition of the transfection medium, pre-treatment of the lipid, size of the mixing device, mixing procedure, and the influence of the lyophilisation process. It could be shown that the method developed for production of lipoplexes on a large scale under sterile conditions led to lipoplexes with good transfection efficiencies combined with low cytotoxicity, improved characteristics, and long shelf-life.

  15. Large-scale virtual screening on public cloud resources with Apache Spark.

    PubMed

    Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola

    2017-01-01

    Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open-source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, by docking a publicly available target receptor against ~2.2 M compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and it is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
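
    The map-style parallelism described above can be sketched in a few lines of PySpark; this is a hedged illustration of the idea, not Spark-VS itself, and dock() is a hypothetical wrapper around an external docking tool:

        from pyspark.sql import SparkSession

        def dock(ligand_record):
            # Hypothetical wrapper: write the ligand to a temp file, invoke an
            # external docking binary, parse and return (ligand_id, score).
            return ("ligand-0", -7.3)          # placeholder result

        spark = SparkSession.builder.appName("docking-screen").getOrCreate()
        sc = spark.sparkContext

        # One ligand record per element; real pipelines pre-split SDF libraries
        # into whole-molecule chunks rather than relying on line-based input.
        ligands = sc.textFile("hdfs:///library/ligands.txt")  # illustrative path
        scores = ligands.map(dock)
        best = scores.takeOrdered(100, key=lambda kv: kv[1])  # lowest = best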

  16. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland, and in better interpreting projections of future climate at impact-relevant scales.

  17. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and is eventually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. Through the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which initially came into use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  19. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales, even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  20. Human seizures couple across spatial scales through travelling wave dynamics

    NASA Astrophysics Data System (ADS)

    Martinet, L.-E.; Fiddyment, G.; Madsen, J. R.; Eskandar, E. N.; Truccolo, W.; Eden, U. T.; Cash, S. S.; Kramer, M. A.

    2017-04-01

    Epilepsy, the propensity toward recurrent, unprovoked seizures, is a devastating disease affecting 65 million people worldwide. Understanding and treating this disease remains a challenge, as seizures manifest through mechanisms and features that span spatial and temporal scales. Here we address this challenge through the analysis and modelling of human brain voltage activity recorded simultaneously across microscopic and macroscopic spatial scales. We show that during seizures, large-scale neural populations spanning centimetres of cortex coordinate with small neural groups spanning cortical columns, and we provide evidence that rapidly propagating waves of activity underlie this increased inter-scale coupling. We develop a corresponding computational model to propose specific mechanisms, namely the effects of an increased extracellular potassium concentration diffusing in space, that support the observed spatiotemporal dynamics. Understanding the multi-scale, spatiotemporal dynamics of human seizures, and connecting these dynamics to specific biological mechanisms, promises new insights for treating this devastating disease.

  1. Ectopically tethered CP190 induces large-scale chromatin decondensation

    NASA Astrophysics Data System (ADS)

    Ahanger, Sajad H.; Günther, Katharina; Weth, Oliver; Bartkuhn, Marek; Bhonde, Ramesh R.; Shouche, Yogesh S.; Renkawitz, Rainer

    2014-01-01

    Insulator-mediated alteration in higher-order chromatin and/or nucleosome organization is an important aspect of epigenetic gene regulation. Recent studies have suggested a key role for CP190 in such processes. In this study, we analysed the effects of ectopically tethered insulator factors on chromatin structure and found that CP190 induces large-scale decondensation when targeted to a condensed lacO array in mammalian and Drosophila cells. In contrast, dCTCF alone is unable to cause such decondensation; however, when CP190 is present, dCTCF recruits it to the lacO array and mediates chromatin unfolding. The CP190-induced opening of chromatin may not be correlated with transcriptional activation, as binding of CP190 does not enhance luciferase activity in reporter assays. We propose that CP190 may mediate histone modification and chromatin-remodelling activity to induce an open chromatin state by its direct recruitment or targeting by a DNA-binding factor such as dCTCF.

  2. Detection and identification of human targets in radar data

    NASA Astrophysics Data System (ADS)

    Gürbüz, Sevgi Z.; Melvin, William L.; Williams, Douglas B.

    2007-04-01

    Radar offers unique advantages over other sensors, such as visual or seismic sensors, for human target detection. Many situations, especially military applications, prevent the placement of video cameras or the implanting of seismic sensors in the area being observed because of security or other threats. However, radar can operate far from potential targets and functions during daytime as well as nighttime, in virtually all weather conditions. In this paper, we examine the problem of human target detection and identification using single-channel, airborne, synthetic aperture radar (SAR). Human targets are differentiated from other detected slow-moving targets by analyzing the spectrogram of each potential target. Human spectrograms are distinctive and can be used not just to identify targets as human, but also to determine features of the observed human target, such as size, gender, action, and speed. A 12-point human model, together with kinematic equations of motion for each body part, is used to calculate the expected target return and spectrogram. A MATLAB simulation environment, including ground clutter and human and non-human targets, is developed for testing spectrogram-based detection and identification algorithms. Simulations show that spectrograms have some ability to detect and identify human targets in low noise. An example gender discrimination system correctly detected 83.97% of males and 91.11% of females. The problems and limitations of spectrogram-based methods in high-clutter environments are discussed, and the SNR loss inherent to spectrogram-based methods is quantified. An alternative detection and identification method that will serve as a basis for future work is proposed.
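
    A minimal sketch of the spectrogram stage (a synthetic single-channel return with an oscillating micro-Doppler limb component; all numbers illustrative):

        import numpy as np
        from scipy.signal import spectrogram

        fs = 2000.0                          # pulse repetition frequency, Hz
        t = np.arange(0, 2, 1 / fs)
        # Torso at a constant Doppler shift plus a weaker limb return whose
        # Doppler oscillates with the gait cycle (micro-Doppler signature).
        sig = np.exp(1j * 2 * np.pi * 100 * t) + 0.3 * np.exp(
            1j * (2 * np.pi * 100 * t + 40 * np.sin(2 * np.pi * 2 * t)))

        f, tt, S = spectrogram(sig, fs=fs, nperseg=128, noverlap=112,
                               return_onesided=False)
        # The periodic striations in |S| around the torso line are the cue used
        # to declare "human" and to estimate gait parameters such as stride rate.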

  3. Stochastic characterization of small-scale algorithms for human sensory processing

    NASA Astrophysics Data System (ADS)

    Neri, Peter

    2010-12-01

    Human sensory processing can be viewed as a functional H mapping a stimulus vector s into a decisional variable r. We currently have no direct access to r; rather, the human makes a decision based on r in order to drive subsequent behavior. It is this (typically binary) decision that we can measure. For example, there may be two external stimuli s[0] and s[1], mapped onto r[0] and r[1] by the sensory apparatus H; the human chooses the stimulus associated with the largest r. This kind of decisional transduction poses a major challenge for an accurate characterization of H. In this article, we explore a specific approach based on a behavioral variant of reverse correlation techniques, where the input s contains a target signal corrupted by a controlled noisy perturbation. The presence of the target signal poses an additional challenge because it distorts the otherwise unbiased nature of the noise source. We consider issues arising from both the decisional transducer and the target signal, their impact on system identification, and ways to handle them effectively for system characterizations that extend to second-order functional approximations with associated small-scale cascade models.
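
    The first-order estimator consistent with this setup is the classical classification image: average the noise fields conditioned on the observer's binary decision. A self-contained sketch with a simulated linear-template observer (all parameters invented):

        import numpy as np

        rng = np.random.default_rng(1)
        n_trials, dim = 5000, 64
        noise = rng.normal(size=(n_trials, dim))   # controlled noisy perturbation

        # Hypothetical observer: r = <noise, template> + internal noise; only
        # the binary decision resp is measurable, not r itself.
        template = np.exp(-0.5 * ((np.arange(dim) - 32) / 5.0) ** 2)
        resp = (noise @ template + rng.normal(size=n_trials) > 0).astype(int)

        # First-order classification image: mean noise on "yes" minus "no"
        # trials; up to scale it recovers the template, i.e. a first-order
        # approximation of the functional H.
        kernel = noise[resp == 1].mean(0) - noise[resp == 0].mean(0)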

  4. Integration and segregation of large-scale brain networks during short-term task automatization

    PubMed Central

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-01-01

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095

  5. Large-scale production of human pluripotent stem cell derived cardiomyocytes.

    PubMed

    Kempf, Henning; Andree, Birgit; Zweigerdt, Robert

    2016-01-15

    Regenerative medicine, including preclinical studies in large animal models and tissue engineering approaches as well as innovative assays for drug discovery, will require the constant supply of hPSC-derived cardiomyocytes and other functional progenies. Respective cell production processes must be robust, economically viable and ultimately GMP-compliant. Recent research has enabled transition of lab scale protocols for hPSC expansion and cardiomyogenic differentiation towards more controlled processing in industry-compatible culture platforms. Here, advanced strategies for the cultivation and differentiation of hPSCs will be reviewed by focusing on stirred bioreactor-based techniques for process upscaling. We will discuss how cardiomyocyte mass production might benefit from recent findings such as cell expansion at the cardiovascular progenitor state. Finally, remaining challenges will be highlighted, specifically regarding three dimensional (3D) hPSC suspension culture and critical safety issues ahead of clinical translation. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Sound production due to large-scale coherent structures

    NASA Technical Reports Server (NTRS)

    Gatski, T. B.

    1979-01-01

    The acoustic pressure fluctuations due to large-scale finite amplitude disturbances in a free turbulent shear flow are calculated. The flow is decomposed into three component scales; the mean motion, the large-scale wave-like disturbance, and the small-scale random turbulence. The effect of the large-scale structure on the flow is isolated by applying both a spatial and phase average on the governing differential equations and by initially taking the small-scale turbulence to be in energetic equilibrium with the mean flow. The subsequent temporal evolution of the flow is computed from global energetic rate equations for the different component scales. Lighthill's theory is then applied to the region with the flowfield as the source and an observer located outside the flowfield in a region of uniform velocity. Since the time history of all flow variables is known, a minimum of simplifying assumptions for the Lighthill stress tensor is required, including no far-field approximations. A phase average is used to isolate the pressure fluctuations due to the large-scale structure, and also to isolate the dynamic process responsible. Variation of mean square pressure with distance from the source is computed to determine the acoustic far-field location and decay rate, and, in addition, spectra at various acoustic field locations are computed and analyzed. Also included are the effects of varying the growth and decay of the large-scale disturbance on the sound produced.
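
    The acoustic analogy invoked here is Lighthill's; for orientation, its standard form (conventional notation, not the paper's phase-averaged variant) is

        \frac{\partial^2 \rho'}{\partial t^2} - c_0^2 \nabla^2 \rho' =
        \frac{\partial^2 T_{ij}}{\partial x_i \partial x_j},
        \qquad
        T_{ij} = \rho u_i u_j + (p' - c_0^2 \rho')\,\delta_{ij} - \tau_{ij},

    where \rho' and p' are the density and pressure perturbations, c_0 the ambient sound speed, and \tau_{ij} the viscous stress; the phase average described above isolates the part of T_{ij} carried by the large-scale wave component.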

  7. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, and provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach combining large-scale/local-scale correlation, empirical statistical downscaling, and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed and of the North Atlantic sea-level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that (i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals, and (ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e., for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach that integrates discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) from a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
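
    A minimal sketch of the scale-wise regression idea (PyWavelets, with placeholder file names; the paper's actual ESD model is more elaborate):

        import numpy as np
        import pywt

        y = np.loadtxt("streamflow.txt")     # monthly predictand, placeholder
        x = np.loadtxt("slp_index.txt")      # monthly large-scale predictor

        level = 4
        cy = pywt.wavedec(y, "db4", level=level)
        cx = pywt.wavedec(x, "db4", level=level)

        # Reconstruct each signal one scale at a time, fit a linear link per
        # scale, and sum the per-scale predictions into a downscaled series.
        rec = np.zeros_like(y)
        for k in range(len(cy)):
            py = [c if i == k else np.zeros_like(c) for i, c in enumerate(cy)]
            px = [c if i == k else np.zeros_like(c) for i, c in enumerate(cx)]
            yk = pywt.waverec(py, "db4")[: len(y)]
            xk = pywt.waverec(px, "db4")[: len(y)]
            a, b = np.polyfit(xk, yk, 1)     # scale-specific large/local link
            rec += a * xk + b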

  8. The Challenge of Large-Scale Literacy Improvement

    ERIC Educational Resources Information Center

    Levin, Ben

    2010-01-01

    This paper discusses the challenge of making large-scale improvements in literacy in schools across an entire education system. Despite growing interest and rhetoric, there are very few examples of sustained, large-scale change efforts around school-age literacy. The paper reviews 2 instances of such efforts, in England and Ontario. After…

  9. Cryogenic hydrogen fuel for controlled inertial confinement fusion (formation of reactor-scale cryogenic targets)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aleksandrova, I. V.; Koresheva, E. R., E-mail: elena.koresheva@gmail.com; Krokhin, O. N.

    2016-12-15

    In inertial fusion energy research, considerable attention has recently been focused on low-cost fabrication of a large number of targets by developing a specialized layering module of repeatable operation. The targets must be free-standing, or unmounted. Therefore, the development of a target factory for inertial confinement fusion (ICF) is based on methods that can ensure a cost-effective target production with high repeatability. Minimization of the amount of tritium (i.e., minimization of time and space at all production stages) is a necessary condition as well. Additionally, the cryogenic hydrogen fuel inside the targets must have a structure (ultrafine layers; the grain size should be scaled back to the nanometer range) that supports the fuel layer survivability under target injection and transport through the reactor chamber. To meet the above requirements, significant progress has been made at the Lebedev Physical Institute (LPI) in the technology developed on the basis of rapid fuel layering inside moving free-standing targets (FST), also referred to as the FST layering method. Owing to the research carried out at LPI, unique experience has been gained in the development of the FST-layering module for target fabrication with an ultrafine fuel layer, including a reactor-scale target design. This experience can be used for the development of the next-generation FST-layering module for construction of a prototype of a target factory for power laser facilities and inertial fusion power plants.

  10. Large-scale influences in near-wall turbulence.

    PubMed

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
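
    The modulation diagnostic described here can be sketched in a few lines: split the signal at a cutoff, take the Hilbert envelope of the small scales, and correlate its slow part with the large scales (cutoff and sampling rate are assumed values, not the paper's):

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        u = np.loadtxt("hotwire_u.txt")       # fluctuating velocity, placeholder
        fs, fc = 50e3, 100.0                  # sample rate and cutoff, Hz (assumed)

        b_lo, a_lo = butter(4, fc / (fs / 2), "low")
        u_large = filtfilt(b_lo, a_lo, u)     # large-scale motion
        u_small = u - u_large                 # small-scale remainder

        # Hilbert envelope of the small scales, low-passed to keep only its
        # slow variation, then correlated with the large-scale signal.
        env = filtfilt(b_lo, a_lo, np.abs(hilbert(u_small)))
        R_am = np.corrcoef(u_large, env)[0, 1]   # amplitude-modulation coefficient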

  11. Large-scale coupling dynamics of instructed reversal learning.

    PubMed

    Mohr, Holger; Wolfensteller, Uta; Ruge, Hannes

    2018-02-15

    The ability to rapidly learn from others by instruction is an important characteristic of human cognition. A recent study found that the rapid transfer from initial instructions to fluid behavior is supported by changes of functional connectivity between and within several large-scale brain networks, and particularly by the coupling of the dorsal attention network (DAN) with the cingulo-opercular network (CON). In the present study, we extended this approach to investigate how these brain networks interact when stimulus-response mappings are altered by novel instructions. We hypothesized that residual stimulus-response associations from initial practice might negatively impact the ability to implement novel instructions. Using functional imaging and large-scale connectivity analysis, we found that functional coupling between the CON and DAN was generally at a higher level during initial than reversal learning. Examining the learning-related connectivity dynamics between the CON and DAN in more detail by means of multivariate patterns analyses, we identified a specific subset of connections which showed a particularly high increase in connectivity during initial learning compared to reversal learning. This finding suggests that the CON-DAN connections can be separated into two functionally dissociable yet spatially intertwined subsystems supporting different aspects of short-term task automatization. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Generation of large-scale magnetic fields by small-scale dynamo in shear flows

    NASA Astrophysics Data System (ADS)

    Squire, Jonathan; Bhattacharjee, Amitava

    2015-11-01

    A new mechanism for turbulent mean-field dynamo is proposed, in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. The dynamo is studied using a variety of computational and analytic techniques, both when the magnetic fluctuations arise self-consistently through the small-scale dynamo and in lower Reynolds number regimes. Given the inevitable existence of non-helical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help to explain generation of large-scale magnetic fields across a wide range of astrophysical objects. This work was supported by a Procter Fellowship at Princeton University, and the US Department of Energy Grant DE-AC02-09-CH11466.

  13. Log-polar mapping-based scale space tracking with adaptive target response

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Kuai, Yangliu; Zhang, Ximing

    2017-05-01

    Correlation filter-based tracking has exhibited impressive robustness and accuracy in recent years. Standard correlation filter-based trackers are restricted to translation estimation and equipped with a fixed target response. These trackers produce inferior performance when encountering significant scale variation or appearance change. We propose a log-polar mapping-based scale space tracker with an adaptive target response. This tracker transforms the scale variation of the target in Cartesian space into a shift along the logarithmic axis in log-polar space. A one-dimensional scale correlation filter is learned online to estimate the shift along the logarithmic axis. With the log-polar representation, scale estimation is achieved accurately without a multiresolution pyramid. To achieve an adaptive target response, the variance of the Gaussian target response is computed from the response map and updated online with a learning-rate parameter. Our log-polar mapping-based scale correlation filter and adaptive target response can be combined with any correlation filter-based tracker. In addition, the scale correlation filter can be extended to a two-dimensional correlation filter to achieve joint estimation of scale variation and in-plane rotation. Experiments on the OTB50 benchmark demonstrate that our tracker achieves superior performance against state-of-the-art trackers.
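
    The core geometric trick can be shown with OpenCV's log-polar remap (an illustrative sketch with a placeholder image, not the authors' tracker):

        import cv2

        patch = cv2.imread("target_patch.png", cv2.IMREAD_GRAYSCALE)  # placeholder
        h, w = patch.shape
        center = (w / 2.0, h / 2.0)
        max_radius = min(w, h) / 2.0

        # In the log-polar output, scaling about the center becomes a pure
        # shift along the (logarithmic) radial axis, so a 1-D correlation
        # filter along that axis estimates the scale change as a shift.
        lp = cv2.warpPolar(patch, (w, h), center, max_radius,
                           cv2.WARP_POLAR_LOG + cv2.INTER_LINEAR)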

  14. International Halley Watch: Discipline specialists for large scale phenomena

    NASA Technical Reports Server (NTRS)

    Brandt, J. C.; Niedner, M. B., Jr.

    1986-01-01

    The largest-scale structures of comets, their tails, are extremely interesting from a physical point of view, and some of their properties are among the most spectacular displayed by comets. Because the tail is an important component of a comet, the Large-Scale Phenomena (L-SP) Discipline was created as one of eight observational disciplines in which Halley data would be encouraged and collected from around the world under the auspices of the International Halley Watch (IHW). The L-SP Discipline Specialist (DS) Team resides at NASA/Goddard Space Flight Center under the leadership of John C. Brandt, Malcolm B. Niedner, and their team of image-processing and computer specialists; Jurgen Rahe at NASA Headquarters completes the formal DS science staff. The team has adopted the study of disconnection events (DEs) as its principal science target, and it is because of the rapid changes that occur in connection with DEs that such extensive global coverage was deemed necessary to assemble a complete record.

  15. Single-trabecula building block for large-scale finite element models of cancellous bone.

    PubMed

    Dagan, D; Be'ery, M; Gefen, A

    2004-07-01

    Recent development of high-resolution imaging of cancellous bone allows finite element (FE) analysis of bone tissue stresses and strains in individual trabeculae. However, specimen-specific stress/strain analyses can include effects of anatomical variations and local damage that can bias the interpretation of the results from individual specimens with respect to large populations. This study developed a standard (generic) 'building-block' of a trabecula for large-scale FE models. Being parametric and based on statistics of dimensions of ovine trabeculae, this building block can be scaled for trabecular thickness and length and be used in commercial or custom-made FE codes to construct generic, large-scale FE models of bone, using less computer power than that currently required to reproduce the accurate micro-architecture of trabecular bone. Orthogonal lattices constructed with this building block, after it was scaled to trabeculae of the human proximal femur, provided apparent elastic moduli of approximately 150 MPa, in good agreement with experimental data for the stiffness of cancellous bone from this site. Likewise, lattices with thinner, osteoporotic-like trabeculae could predict a reduction of approximately 30% in the apparent elastic modulus, as reported in experimental studies of osteoporotic femora. Based on these comparisons, it is concluded that the single-trabecula element developed in the present study is well-suited for representing cancellous bone in large-scale generic FE simulations.

  16. Culture rather than genes provides greater scope for the evolution of large-scale human prosociality

    PubMed Central

    Bell, Adrian V.; Richerson, Peter J.; McElreath, Richard

    2009-01-01

    Whether competition among large groups played an important role in human social evolution is dependent on how variation, whether cultural or genetic, is maintained between groups. Comparisons between genetic and cultural differentiation between neighboring groups show how natural selection on large groups is more plausible on cultural rather than genetic variation. PMID:19822753

  17. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; Dinonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests, culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12 m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then

  18. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKI (public key infrastructure) architectures have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face numerous challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.

  19. Application of Large-Scale Aptamer-Based Proteomic Profiling to Planned Myocardial Infarctions.

    PubMed

    Jacob, Jaison; Ngo, Debby; Finkel, Nancy; Pitts, Rebecca; Gleim, Scott; Benson, Mark D; Keyes, Michelle J; Farrell, Laurie A; Morgan, Thomas; Jennings, Lori L; Gerszten, Robert E

    2018-03-20

    Emerging proteomic technologies using novel affinity-based reagents allow for efficient multiplexing with high-sample throughput. To identify early biomarkers of myocardial injury, we recently applied an aptamer-based proteomic profiling platform that measures 1129 proteins to samples from patients undergoing septal alcohol ablation for hypertrophic cardiomyopathy, a human model of planned myocardial injury. Here, we examined the scalability of this approach using a markedly expanded platform to study a far broader range of human proteins in the context of myocardial injury. We applied a highly multiplexed, expanded proteomic technique that uses single-stranded DNA aptamers to assay 4783 human proteins (4137 distinct human gene targets) to derivation and validation cohorts of planned myocardial injury, individuals with spontaneous myocardial infarction, and at-risk controls. We found 376 target proteins that significantly changed in the blood after planned myocardial injury in a derivation cohort (n=20; P < 1.05E-05, 1-way repeated measures analysis of variance, Bonferroni threshold). Two hundred forty-seven of these proteins were validated in an independent planned myocardial injury cohort (n=15; P < 1.33E-04, 1-way repeated measures analysis of variance); >90% were directionally consistent and reached nominal significance in the validation cohort. Among the validated proteins that were increased within 1 hour after planned myocardial injury, 29 were also elevated in patients with spontaneous myocardial infarction (n=63; P < 6.17E-04). Many of the novel markers identified in our study are intracellular proteins not previously identified in the peripheral circulation or have functional roles relevant to myocardial injury. For example, the cardiac LIM protein, cysteine- and glycine-rich protein 3, is thought to mediate cardiac mechanotransduction and stress responses, whereas the mitochondrial ATP synthase F0 subunit component is a vasoactive peptide on its release

  20. Analysis of calibration accuracy of cameras with different target sizes for large field of view

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Chai, Zhiwen; Long, Changyu; Deng, Huaxia; Ma, Mengchao; Zhong, Xiang; Yu, Huan

    2018-03-01

    Visual measurement plays an increasingly important role in the fields of aerospace, ship, and machinery manufacturing, and camera calibration for a large field of view is a critical part of visual measurement. A large-scale target is difficult to produce and its precision cannot be guaranteed, while a small target can be produced with high precision but yields only locally optimal solutions. It is therefore necessary to study the most suitable ratio of target size to camera field of view that still meets the calibration precision requirement for a wide field of view. In this paper, cameras are calibrated with a series of checkerboard and circular calibration targets of different dimensions. The ratios of target size to camera field of view are 9%, 18%, 27%, 36%, 45%, 54%, 63%, 72%, 81%, and 90%. The target is placed at different positions in the camera field to obtain the camera parameters for each position. The distribution curves of the mean reprojection error of the reconstructed feature points are then analyzed for the different ratios. The experimental data demonstrate that, as the ratio of target size to camera field of view increases, the calibration precision improves accordingly, and the mean reprojection error changes only slightly once the ratio exceeds 45%.
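
    The reprojection-error measurement underlying such a study can be sketched with OpenCV's standard calibration pipeline (placeholder paths and board geometry):

        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)                        # inner-corner grid, illustrative
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_pts, img_pts = [], []
        for fname in glob.glob("calib/*.png"):  # placeholder image set
            gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_pts.append(objp)
                img_pts.append(corners)

        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_pts, img_pts, gray.shape[::-1], None, None)
        # rms is the RMS reprojection error in pixels; repeating the run with
        # targets occupying 9%..90% of the field of view traces out a curve
        # like the one analyzed in the paper.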

  1. Distributed multimodal data fusion for large scale wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Ertin, Emre

    2006-05-01

    Sensor network technology has enabled new surveillance systems in which sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify, and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large-scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities; multiple modalities improve tracking by reducing the uncertainty in the track estimates and by resolving track-sensor data association problems. Our approach to the fusion problem with a large number of multimodal sensors is the construction of likelihood maps, which summarize the sensory data for the detection, tracking, and classification problem. A likelihood map presents the sensory information in a format that decision makers can readily interpret, and it is well suited to fusion with spatial prior information such as maps and data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution: the likelihood map transforms each sensor data stream into a spatio-temporal likelihood map ideally suited for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip-based algorithm and present simulation results.
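
    A minimal sketch of likelihood-map fusion on a grid (two illustrative sensor modalities; independent log-likelihoods simply add):

        import numpy as np

        ny, nx = 200, 200
        yy, xx = np.mgrid[0:ny, 0:nx]

        def loglik(cx, cy, sigma):
            # Illustrative per-sensor log-likelihood surface for "target at
            # cell (x, y)", with positional uncertainty sigma (in cells).
            return -((xx - cx) ** 2 + (yy - cy) ** 2) / (2.0 * sigma ** 2)

        # A coarse acoustic node and a finer seismic node observing one target.
        fused = loglik(120, 80, 25.0) + loglik(115, 85, 8.0)
        i, j = np.unravel_index(fused.argmax(), fused.shape)
        # (j, i) is the MAP target cell; a map-based prior or an imaging-sensor
        # likelihood enters the sum as just one more term.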

  2. Large-scale velocities and primordial non-Gaussianity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Fabian

    2010-09-15

    We study the peculiar velocities of density peaks in the presence of primordial non-Gaussianity. Rare, high-density peaks in the initial density field can be identified with tracers such as galaxies and clusters in the evolved matter distribution. The distribution of relative velocities of peaks is derived in the large-scale limit using two different approaches based on a local biasing scheme. Both approaches agree, and show that halos still stream with the dark matter locally as well as statistically, i.e. they do not acquire a velocity bias. Nonetheless, even a moderate degree of (not necessarily local) non-Gaussianity induces a significant skewness (≈0.1-0.2) in the relative velocity distribution, making it a potentially interesting probe of non-Gaussianity on intermediate to large scales. We also study two-point correlations in redshift space. The well-known Kaiser formula is still a good approximation on large scales, if the Gaussian halo bias is replaced with its (scale-dependent) non-Gaussian generalization. However, there are additional terms not encompassed by this simple formula which become relevant on smaller scales (k ≳ 0.01 h/Mpc). Depending on the allowed level of non-Gaussianity, these could be of relevance for future large spectroscopic surveys.
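
    For reference, the redshift-space relation referred to above can be written compactly: the Kaiser formula with the Gaussian bias b_G replaced by its scale-dependent non-Gaussian generalization. The proportionality quoted for Δb(k) is the standard local-f_NL result from the wider literature, not necessarily the exact expression used in this paper:

      P_s(k,\mu) = \left[\, b(k) + f\mu^2 \,\right]^2 P_m(k), \qquad
      b(k) = b_G + \Delta b(k), \qquad
      \Delta b(k) \propto \frac{f_{\mathrm{NL}}\,(b_G - 1)}{k^2}

    Here μ is the cosine of the angle to the line of sight, f is the linear growth rate, and P_m is the matter power spectrum.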

  3. Real-time evolution of a large-scale relativistic jet

    NASA Astrophysics Data System (ADS)

    Martí, Josep; Luque-Escamilla, Pedro L.; Romero, Gustavo E.; Sánchez-Sutil, Juan R.; Muñoz-Arjonilla, Álvaro J.

    2015-06-01

    Context. Astrophysical jets are ubiquitous in the Universe on all scales, but their large-scale dynamics and evolution in time are hard to observe since they usually develop at a very slow pace. Aims: We aim to obtain the first observational proof of the expected large-scale evolution and interaction with the environment in an astrophysical jet. Only jets from microquasars offer a chance to witness the real-time, full-jet evolution within a human lifetime, since they combine a "short", few parsec length with relativistic velocities. Methods: The methodology of this work is based on a systematic recalibraton of interferometric radio observations of microquasars available in public archives. In particular, radio observations of the microquasar GRS 1758-258 over less than two decades have provided the most striking results. Results: Significant morphological variations in the extended jet structure of GRS 1758-258 are reported here that were previously missed. Its northern radio lobe underwent a major morphological variation that rendered the hotspot undetectable in 2001 and reappeared again in the following years. The reported changes confirm the Galactic nature of the source. We tentatively interpret them in terms of the growth of instabilities in the jet flow. There is also evidence of surrounding cocoon. These results can provide a testbed for models accounting for the evolution of jets and their interaction with the environment.

  4. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea-background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the large-scale image is divided into small sub-blocks, and the 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and a clear characteristic difference between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm detects ship targets with a high recall rate and a low miss rate.
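
    The core computational step is block-wise 2-D spectral analysis. A minimal sketch is given below: the image is tiled into sub-blocks, each block is windowed and transformed with a 2-D FFT, and a per-block spectral statistic is retained. The specific statistic here (ratio of off-centre to near-DC energy) is an illustrative assumption, not the paper's exact model:

      # Block-wise 2-D short-time spectral analysis of a sea-surface image.
      import numpy as np

      def block_spectra(image, block=64):
          h, w = image.shape
          win = np.outer(np.hanning(block), np.hanning(block))  # 2-D taper
          stats = np.zeros((h // block, w // block))
          for i in range(h // block):
              for j in range(w // block):
                  tile = image[i*block:(i+1)*block, j*block:(j+1)*block] * win
                  spec = np.abs(np.fft.fftshift(np.fft.fft2(tile))) ** 2
                  c = block // 2
                  low = spec[c-4:c+4, c-4:c+4].sum()   # near-DC energy
                  stats[i, j] = (spec.sum() - low) / (low + 1e-12)
          return stats   # unusually high values flag candidate non-sea blocks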

  5. Large-Scale Culture and Genetic Modification of Human Natural Killer Cells for Cellular Therapy.

    PubMed

    Lapteva, Natalia; Parihar, Robin; Rollins, Lisa A; Gee, Adrian P; Rooney, Cliona M

    2016-01-01

    Recent advances in methods for the ex vivo expansion of human natural killer (NK) cells have facilitated the use of these powerful immune cells in clinical protocols. Further, the ability to genetically modify primary human NK cells following rapid expansion allows targeting and enhancement of their immune function. We have successfully adapted an expansion method for primary NK cells from peripheral blood mononuclear cells or from apheresis products in gas permeable rapid expansion devices (G-Rexes). Here, we describe an optimized protocol for rapid and robust NK cell expansion as well as a method for highly efficient retroviral transduction of these ex vivo expanded cells. These methodologies are good manufacturing practice (GMP) compliant and could be used for clinical-grade product manufacturing.

  6. Investigating large-scale brain dynamics using field potential recordings: analysis and interpretation.

    PubMed

    Pesaran, Bijan; Vinck, Martin; Einevoll, Gaute T; Sirota, Anton; Fries, Pascal; Siegel, Markus; Truccolo, Wilson; Schroeder, Charles E; Srinivasan, Ramesh

    2018-06-25

    New technologies to record electrical activity from the brain on a massive scale offer tremendous opportunities for discovery. Electrical measurements of large-scale brain dynamics, termed field potentials, are especially important to understanding and treating the human brain. Here, our goal is to provide best practices on how field potential recordings (electroencephalograms, magnetoencephalograms, electrocorticograms and local field potentials) can be analyzed to identify large-scale brain dynamics, and to highlight critical issues and limitations of interpretation in current work. We focus our discussion of analyses around the broad themes of activation, correlation, communication and coding. We provide recommendations for interpreting the data using forward and inverse models. The forward model describes how field potentials are generated by the activity of populations of neurons. The inverse model describes how to infer the activity of populations of neurons from field potential recordings. A recurring theme is the challenge of understanding how field potentials reflect neuronal population activity given the complexity of the underlying brain systems.
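
    The forward and inverse models discussed above can be summarized compactly. A minimal sketch: the linear forward model below is standard, and the minimum-norm expression is one common choice of inverse solution, not the only one the authors recommend:

      V = L\,s + \varepsilon \qquad \text{(forward: lead-field matrix } L \text{ maps population activity } s \text{ to recorded potentials } V\text{)}

      \hat{s} = L^{\top}\!\left( L L^{\top} + \lambda I \right)^{-1} V \qquad \text{(inverse: regularized minimum-norm estimate with regularization } \lambda\text{)}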

  7. Large-scale regions of antimatter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  8. Risk of large-scale fires in boreal forests of Finland under changing climate

    NASA Astrophysics Data System (ADS)

    Lehtonen, I.; Venäläinen, A.; Kämäräinen, M.; Peltola, H.; Gregow, H.

    2016-01-01

    The target of this work was to assess the impact of projected climate change on forest-fire activity in Finland, with special emphasis on large-scale fires. In addition, we were particularly interested in examining the inter-model variability of the projected change in fire danger. For this purpose, we utilized fire statistics covering the period 1996-2014, consisting of almost 20 000 forest fires, as well as daily meteorological data from five global climate models under the representative concentration pathway RCP4.5 and RCP8.5 scenarios. The model data were statistically downscaled onto a high-resolution grid using the quantile-mapping method before performing the analysis. In examining the relationship between weather and fire danger, we applied the Canadian fire weather index (FWI) system. Our results suggest that the number of large forest fires may double or even triple during the present century. This would increase the risk that some fires could develop into real conflagrations, which have become almost nonexistent in Finland thanks to active and efficient fire suppression. However, the results reveal substantial inter-model variability in the rate of the projected increase in forest-fire danger, emphasizing the large uncertainty in the climate change signal for fire activity. We moreover showed that the majority of large fires in Finland occur within a relatively short period in May and June due to human activities, and that the FWI correlates more poorly with fire activity during this time of year than later in summer, when lightning is a more important cause of fires.
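
    The statistical downscaling step, quantile mapping, can be stated in a few lines. A minimal empirical sketch under simplifying assumptions (operational schemes typically fit parametric distributions or correct season by season):

      # Empirical quantile mapping: a future model value is assigned the
      # observed value at the same quantile of the historical model climate.
      import numpy as np

      def quantile_map(model_hist, obs_hist, model_future):
          model_sorted = np.sort(model_hist)
          # CDF position of each future value within the historical model climate
          ranks = np.searchsorted(model_sorted, model_future) / len(model_sorted)
          ranks = np.clip(ranks, 0.0, 1.0)
          # Read the same quantiles off the observed distribution
          return np.quantile(np.sort(obs_hist), ranks)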

  9. Material Targets for Scaling All-Spin Logic

    NASA Astrophysics Data System (ADS)

    Manipatruni, Sasikanth; Nikonov, Dmitri E.; Young, Ian A.

    2016-01-01

    All-spin-logic devices are promising candidates to augment and complement beyond-CMOS integrated circuit computing due to nonvolatility, ultralow operating voltages, higher logical efficiency, and high density integration. However, the path to reach lower energy-delay product performance compared to CMOS transistors currently is not clear. We show that scaling and engineering the nanoscale magnetic materials and interfaces is the key to realizing spin-logic devices that can surpass the energy-delay performance of CMOS transistors. With validated stochastic nanomagnetic and vector spin-transport numerical models, we derive the target material and interface properties for the nanomagnets and channels. We identify promising directions for material engineering and discovery focusing on the systematic scaling of magnetic anisotropy (H_k) and saturation magnetization (M_s), the use of perpendicular magnetic anisotropy, and the interface spin-mixing conductance of the ferromagnet-spin-channel interface (G_mix). We provide systematic targets for scaling a spin-logic energy-delay product toward 2 aJ ns, comprehending the stochastic noise for nanomagnets.

  10. The Expanded Large Scale Gap Test

    DTIC Science & Technology

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. Only fragments of the abstract survive extraction; they mention reducing the spread in the LSGT 50% gap value and note that the worst charges are those with the highest or lowest densities or the largest re-pressed dimensions.

  11. An Objective Evaluation of Mass Scaling Techniques Utilizing Computational Human Body Finite Element Models.

    PubMed

    Davis, Matthew L; Scott Gayzik, F

    2016-10-01

    Biofidelity response corridors developed from post-mortem human subjects are commonly used in the design and validation of anthropomorphic test devices and computational human body models (HBMs). Typically, corridors are derived from a diverse pool of biomechanical data and later normalized to a target body habitus. The objective of this study was to use morphed computational HBMs to compare the ability of various scaling techniques to scale response data from a reference to a target anthropometry. HBMs are ideally suited for this type of study since they uphold the assumptions of equal density and modulus that are implicit in scaling method development. In total, six scaling procedures were evaluated, four from the literature (equal-stress equal-velocity, ESEV, and three variations of impulse momentum) and two which are introduced in the paper (ESEV using a ratio of effective masses, ESEV-EffMass, and a kinetic energy approach). In total, 24 simulations were performed, representing both pendulum and full body impacts for three representative HBMs. These simulations were quantitatively compared using the International Organization for Standardization (ISO) ISO-TS18571 standard. Based on these results, ESEV-EffMass achieved the highest overall similarity score (indicating that it is most proficient at scaling a reference response to a target). Additionally, ESEV was found to perform poorly for two degree-of-freedom (DOF) systems. However, the results also indicated that no single technique was clearly the most appropriate for all scenarios.
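
    For orientation, the classical ESEV relations can be summarized in a short sketch: with the length scale factor λ = (m_target/m_reference)^(1/3), velocities are preserved while time, force and acceleration scale with powers of λ. This is the textbook form; the paper's other variants (impulse momentum, ESEV-EffMass, kinetic energy) differ mainly in how the scale factor is chosen:

      # Classical equal-stress equal-velocity (ESEV) scale factors from a
      # reference and target body mass (assumes equal density and modulus).
      def esev_factors(mass_ref, mass_target):
          lam = (mass_target / mass_ref) ** (1.0 / 3.0)
          return {
              "length_time": lam,          # deflections and durations scale by lam
              "velocity": 1.0,             # unchanged by construction
              "acceleration": 1.0 / lam,
              "force": lam ** 2,           # equal stress over area ~ lam^2
          }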

  12. Gorilla and Orangutan Brains Conform to the Primate Cellular Scaling Rules: Implications for Human Evolution

    PubMed Central

    Herculano-Houzel, Suzana; Kaas, Jon H.

    2011-01-01

    Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they

  13. Gorilla and orangutan brains conform to the primate cellular scaling rules: implications for human evolution.

    PubMed

    Herculano-Houzel, Suzana; Kaas, Jon H

    2011-01-01

    Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they

  14. PPI4DOCK: large scale assessment of the use of homology models in free docking over more than 1000 realistic targets.

    PubMed

    Yu, Jinchao; Guerois, Raphaël

    2016-12-15

    Protein-protein docking methods are of great importance for understanding interactomes at the structural level. It has become increasingly appealing to use not only experimental structures but also homology models of unbound subunits as input for docking simulations. So far we are missing a large scale assessment of the success of rigid-body free docking methods on homology models. We explored how we could benefit from comparative modelling of unbound subunits to expand docking benchmark datasets. Starting from a collection of 3157 non-redundant, high X-ray resolution heterodimers, we developed the PPI4DOCK benchmark containing 1417 docking targets based on unbound homology models. Rigid-body docking by Zdock showed that for 1208 cases (85.2%), at least one correct decoy was generated, emphasizing the efficiency of rigid-body docking in generating correct assemblies. Overall, the PPI4DOCK benchmark contains a large set of realistic cases and provides new ground for assessing docking and scoring methodologies. Availability: Benchmark sets can be downloaded from http://biodev.cea.fr/interevol/ppi4dock/. Contact: guerois@cea.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large-scale systems are defined as systems requiring more than one decision maker for their control. Decentralized control and decomposition are discussed for large-scale dynamic systems, and information and many-person decision problems are analyzed.

  16. Genome-scale CRISPR-Cas9 knockout screening in human cells.

    PubMed

    Shalem, Ophir; Sanjana, Neville E; Hartenian, Ella; Shi, Xi; Scott, David A; Mikkelson, Tarjei; Heckl, Dirk; Ebert, Benjamin L; Root, David E; Doench, John G; Zhang, Feng

    2014-01-03

    The simplicity of programming the CRISPR (clustered regularly interspaced short palindromic repeats)-associated nuclease Cas9 to modify specific genomic loci suggests a new way to interrogate gene function on a genome-wide scale. We show that lentiviral delivery of a genome-scale CRISPR-Cas9 knockout (GeCKO) library targeting 18,080 genes with 64,751 unique guide sequences enables both negative and positive selection screening in human cells. First, we used the GeCKO library to identify genes essential for cell viability in cancer and pluripotent stem cells. Next, in a melanoma model, we screened for genes whose loss is involved in resistance to vemurafenib, a therapeutic RAF inhibitor. Our highest-ranking candidates include previously validated genes NF1 and MED12, as well as novel hits NF2, CUL3, TADA2B, and TADA1. We observe a high level of consistency between independent guide RNAs targeting the same gene and a high rate of hit confirmation, demonstrating the promise of genome-scale screening with Cas9.

  17. Large-scale conservation planning in a multinational marine environment: cost matters.

    PubMed

    Mazor, Tessa; Giakoumi, Sylvaine; Kark, Salit; Possingham, Hugh P

    2014-07-01

    Explicitly including cost in marine conservation planning is essential for achieving feasible and efficient conservation outcomes. Yet, spatial priorities for marine conservation are still often based solely on biodiversity hotspots, species richness, and/or cumulative threat maps. This study aims to provide an approach for including cost when planning large-scale Marine Protected Area (MPA) networks that span multiple countries. Here, we explore the incorporation of cost in the complex setting of the Mediterranean Sea. In order to include cost in conservation prioritization, we developed surrogates that account for revenue from multiple marine sectors: commercial fishing, noncommercial fishing, and aquaculture. Such revenue can translate into an opportunity cost for the implementation of an MPA network. Using the software Marxan, we set conservation targets to protect 10% of the distribution of 77 threatened marine species in the Mediterranean Sea. We compared nine scenarios of opportunity cost by calculating the area and cost required to meet our targets. We further compared our spatial priorities with those that are considered consensus areas by several proposed prioritization schemes in the Mediterranean Sea, none of which explicitly considers cost. We found that for less than 10% of the Sea's area, our conservation targets can be achieved while incurring opportunity costs of less than 1%. In marine systems, we reveal that area is a poor cost surrogate and that the most effective surrogates are those that account for multiple sectors or stakeholders. Furthermore, our results indicate that including cost can greatly influence the selection of spatial priorities for marine conservation of threatened species. Although there are known limitations in multinational large-scale planning, attempting to devise more systematic and rigorous planning methods is especially critical given that collaborative conservation action is on the rise and global financial crisis

  18. On large-scale dynamo action at high magnetic Reynolds number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaneo, F.; Tobias, S. M., E-mail: smt@maths.leeds.ac.uk

    2014-07-01

    We consider the generation of magnetic activity—dynamo waves—in the astrophysical limit of very large magnetic Reynolds number. We consider kinematic dynamo action for a system consisting of helical flow and large-scale shear. We demonstrate that large-scale dynamo waves persist at high Rm if the helical flow is characterized by a narrow band of spatial scales and the shear is large enough. However, for a wide band of scales the dynamo becomes small scale with a further increase of Rm, with dynamo waves re-emerging only if the shear is then increased. We show that at high Rm, the key effect of the shear is to suppress small-scale dynamo action, allowing large-scale dynamo action to be observed. We conjecture that this supports a general 'suppression principle'—large-scale dynamo action can only be observed if there is a mechanism that suppresses the small-scale fluctuations.

  19. Large-scale dynamos in rapidly rotating plane layer convection

    NASA Astrophysics Data System (ADS)

    Bushby, P. J.; Käpylä, P. J.; Masada, Y.; Brandenburg, A.; Favier, B.; Guervilly, C.; Käpylä, M. J.

    2018-05-01

    Context. Convectively driven flows play a crucial role in the dynamo processes that are responsible for producing magnetic activity in stars and planets. It is still not fully understood why many astrophysical magnetic fields have a significant large-scale component. Aims: Our aim is to investigate the dynamo properties of compressible convection in a rapidly rotating Cartesian domain, focusing upon a parameter regime in which the underlying hydrodynamic flow is known to be unstable to a large-scale vortex instability. Methods: The governing equations of three-dimensional non-linear magnetohydrodynamics (MHD) are solved numerically. Different numerical schemes are compared and we propose a possible benchmark case for other similar codes. Results: In keeping with previous related studies, we find that convection in this parameter regime can drive a large-scale dynamo. The components of the mean horizontal magnetic field oscillate, leading to a continuous overall rotation of the mean field. Whilst the large-scale vortex instability dominates the early evolution of the system, the large-scale vortex is suppressed by the magnetic field and makes a negligible contribution to the mean electromotive force that is responsible for driving the large-scale dynamo. The cycle period of the dynamo is comparable to the ohmic decay time, with longer cycles for dynamos in convective systems that are closer to onset. In these particular simulations, large-scale dynamo action is found only when vertical magnetic field boundary conditions are adopted at the upper and lower boundaries. Strongly modulated large-scale dynamos are found at higher Rayleigh numbers, with periods of reduced activity (grand minima-like events) occurring during transient phases in which the large-scale vortex temporarily re-establishes itself, before being suppressed again by the magnetic field.

  20. Large-scale anisotropy of the cosmic microwave background radiation

    NASA Technical Reports Server (NTRS)

    Silk, J.; Wilson, M. L.

    1981-01-01

    Inhomogeneities in the large-scale distribution of matter inevitably lead to the generation of large-scale anisotropy in the cosmic background radiation. The dipole, quadrupole, and higher order fluctuations expected in an Einstein-de Sitter cosmological model have been computed. The dipole and quadrupole anisotropies are comparable to the measured values, and impose important constraints on the allowable spectrum of large-scale matter density fluctuations. A significant dipole anisotropy is generated by the matter distribution on scales greater than approximately 100 Mpc. The large-scale anisotropy is insensitive to the ionization history of the universe since decoupling, and cannot easily be reconciled with a galaxy formation theory that is based on primordial adiabatic density fluctuations.

  1. Diversity and Relationships of Cocirculating Modern Human Rotaviruses Revealed Using Large-Scale Comparative Genomics

    PubMed Central

    McKell, Allison O.; Rippinger, Christine M.; McAllen, John K.; Akopov, Asmik; Kirkness, Ewen F.; Payne, Daniel C.; Edwards, Kathryn M.; Chappell, James D.; Patton, John T.

    2012-01-01

    Group A rotaviruses (RVs) are 11-segmented, double-stranded RNA viruses and are primary causes of gastroenteritis in young children. Despite their medical relevance, the genetic diversity of modern human RVs is poorly understood, and the impact of vaccine use on circulating strains remains unknown. In this study, we report the complete genome sequence analysis of 58 RVs isolated from children with severe diarrhea and/or vomiting at Vanderbilt University Medical Center (VUMC) in Nashville, TN, during the years spanning community vaccine implementation (2005 to 2009). The RVs analyzed include 36 G1P[8], 18 G3P[8], and 4 G12P[8] Wa-like genogroup 1 strains with VP6-VP1-VP2-VP3-NSP1-NSP2-NSP3-NSP4-NSP5/6 genotype constellations of I1-R1-C1-M1-A1-N1-T1-E1-H1. By constructing phylogenetic trees, we identified 2 to 5 subgenotype alleles for each gene. The results show evidence of intragenogroup gene reassortment among the cocirculating strains. However, several isolates from different seasons maintained identical allele constellations, consistent with the notion that certain RV clades persisted in the community. By comparing the genes of VUMC RVs to those of other archival and contemporary RV strains for which sequences are available, we defined phylogenetic lineages and verified that the diversity of the strains analyzed in this study reflects that seen in other regions of the world. Importantly, the VP4 and VP7 proteins encoded by VUMC RVs and other contemporary strains show amino acid changes in or near neutralization domains, which might reflect antigenic drift of the virus. Thus, this large-scale, comparative genomic study of modern human RVs provides significant insight into how this pathogen evolves during its spread in the community. PMID:22696651

  2. Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.

    PubMed

    Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A

    2017-03-07

    The lipid composition of human skin is essential for its function; however, the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative, high-throughput shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes, total quantification to the level of individual lipid molecules, high reproducibility and high-throughput capabilities. With this method we conducted a large lipidomic survey of 268 human SC samples, in which we investigated the relationship between sampling depth and lipid composition and lipidome variability in samples from 14 different sampling sites on the human body, and finally assessed the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome, as they diffuse into the topmost SC layers and form a gradient. Lipidomic variability with respect to sampling depth, site and subject is considerable and is mainly attributable to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.

  3. Large-scale manufacturing of GMP-compliant anti-EGFR targeted nanocarriers: production of doxorubicin-loaded anti-EGFR-immunoliposomes for a first-in-man clinical trial.

    PubMed

    Wicki, Andreas; Ritschard, Reto; Loesch, Uli; Deuster, Stefanie; Rochlitz, Christoph; Mamot, Christoph

    2015-04-30

    We describe the large-scale, GMP-compliant production process of doxorubicin-loaded and anti-EGFR-coated immunoliposomes (anti-EGFR-ILs-dox) used in a first-in-man, dose-escalation clinical trial. Ten batches of this nanoparticle were produced in clean room facilities. Stability data from the pre-GMP and the GMP batch indicate that the anti-EGFR-ILs-dox nanoparticle was stable for at least 18 months after release. Release criteria included visual inspection, sterility testing, as well as measurements of pH (5.0-7.0), doxorubicin HCl concentration (0.45-0.55 mg/ml), endotoxin concentration (<1.21 IU/ml), leakage (<10%), particle size (Z-average of Caelyx ± 20 nm), and particle uptake (absolute uptake: >0.50 ng doxorubicin/μg protein; uptake relative to PLD: >5-fold). All batches fulfilled the defined release criteria, indicating high reproducibility as well as batch-to-batch uniformity of the main physico-chemical features of the nanoparticles in the setting of the large-scale GMP process. In the clinical trial, 29 patients were treated with this nanoparticle between 2007 and 2010. Pharmacokinetic data of anti-EGFR-ILs-dox collected during the clinical study revealed stability of the nanocarrier in vivo. Thus, reliable and GMP-compliant production of anti-EGFR-targeted nanoparticles for clinical application is feasible. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Using stroboscopic flow imaging to validate large-scale computational fluid dynamics simulations

    NASA Astrophysics Data System (ADS)

    Laurence, Ted A.; Ly, Sonny; Fong, Erika; Shusteff, Maxim; Randles, Amanda; Gounley, John; Draeger, Erik

    2017-02-01

    The utility and accuracy of computational modeling often require direct validation against experimental measurements. The work presented here is motivated by taking a combined experimental and computational approach to determine the ability of large-scale computational fluid dynamics (CFD) simulations to understand and predict the dynamics of circulating tumor cells in clinically relevant environments. We use stroboscopic light-sheet fluorescence imaging to track the paths and measure the velocities of fluorescent microspheres throughout a human aorta model. Performed over complex, physiologically realistic 3D geometries, large data sets are acquired with microscopic resolution over macroscopic distances.

  5. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  6. Large-Scale Coronal Heating from the Solar Magnetic Network

    NASA Technical Reports Server (NTRS)

    Falconer, David A.; Moore, Ronald L.; Porter, Jason G.; Hathaway, David H.

    1999-01-01

    In Fe XII images from SOHO/EIT, the quiet solar corona shows structure on scales ranging from sub-supergranular (i.e., bright points and coronal network) to multi-supergranular. In Falconer et al. 1998 (Ap.J., 501, 386) we suppressed the large-scale background and found that the network-scale features are predominantly rooted in the magnetic network lanes at the boundaries of the supergranules. The emission of the coronal network and bright points contributes only about 5% of the entire quiet solar coronal Fe XII emission. Here we investigate the large-scale corona, the supergranular and larger-scale structure that we had previously treated as a background, and that emits 95% of the total Fe XII emission. We compare the dim and bright halves of the large-scale corona and find that the bright half is 1.5 times brighter than the dim half, has an order of magnitude greater area of bright point coverage, has three times brighter coronal network, and has about 1.5 times more magnetic flux than the dim half. These results suggest that the brightness of the large-scale corona is more closely related to the large-scale total magnetic flux than to bright point activity. We conclude that in the quiet sun: (1) Magnetic flux is modulated (concentrated/diluted) on size scales larger than supergranules. (2) The large-scale enhanced magnetic flux gives an enhanced, more active, magnetic network and an increased incidence of network bright point formation. (3) The heating of the large-scale corona is dominated by more widespread, but weaker, network activity than that which heats the bright points. This work was funded by the Solar Physics Branch of NASA's Office of Space Science through the SR&T Program and the SEC Guest Investigator Program.

  7. Large- and Very-Large-Scale Motions in Katabatic Flows Over Steep Slopes

    NASA Astrophysics Data System (ADS)

    Giometto, M. G.; Fang, J.; Salesky, S.; Parlange, M. B.

    2016-12-01

    Evidence of large- and very-large-scale motions populating the boundary layer in katabatic flows over steep slopes is presented via direct numerical simulations (DNSs). DNSs are performed at a modified Reynolds number (Rem = 967), considering four sloping angles (α = 60°, 70°, 80° and 90°). Large coherent structures prove to be strongly dependent on the inclination of the underlying surface. Spectra and co-spectra consistently show signatures of large-scale motions (LSMs), with streamwise extension on the order of the boundary layer thickness. A second low-wavenumber mode characterizes pre-multiplied spectra and co-spectra when the slope angle is below 70°, indicative of very-large-scale motions (VLSMs). In addition, conditional sampling and averaging shows how LSMs and VLSMs are induced by counter-rotating roll modes, in agreement with findings from canonical wall-bounded flows. VLSMs contribute to the stream-wise velocity variance and shear stress in the above-jet regions up to 30% and 45% respectively, whereas both LSMs and VLSMs are inactive in the near-wall regions.

  8. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
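
    A minimal sketch of the reduction-plus-clustering step: kernel PCA embeds the high-dimensional moisture-flux fields in a low-dimensional space, where clusters are then formed. Plain (unsupervised) KernelPCA from scikit-learn stands in here for the supervised variant used in the study, and the array shapes are illustrative assumptions:

      # Embed flux fields with kernel PCA, then cluster in the reduced space.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.cluster import KMeans

      flux_fields = np.random.rand(500, 1024)    # 500 days x flattened flux field (demo)
      embedding = KernelPCA(n_components=3, kernel="rbf").fit_transform(flux_fields)
      clusters = KMeans(n_clusters=4, n_init=10).fit_predict(embedding)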

  9. Generation of large-scale density fluctuations by buoyancy

    NASA Technical Reports Server (NTRS)

    Chasnov, J. R.; Rogallo, R. S.

    1990-01-01

    The generation of fluid motion from a state of rest by buoyancy forces acting on a homogeneous isotropic small-scale density field is considered. Nonlinear interactions between the generated fluid motion and the initial isotropic small-scale density field are found to create an anisotropic large-scale density field with spectrum proportional to k^4. This large-scale density field is observed to result in an increasing Reynolds number of the fluid turbulence in its final period of decay.

  10. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2009-09-30

    Only fragments of this report's abstract survive extraction: they reference the Fire Locating and Modeling of Burning Emissions (FLAMBE) project and other related parameters, note that plans to embed NAAPS inside NOGAPS may need to be put on hold, and state that AOD, FLAMBE and FAROP at FNMOC are supported by 6.4 funding from PMW-120 for "Large-scale Atmospheric Models" and "Small-scale Atmospheric Models".

  11. Genome-scale modeling of human metabolism - a systems biology approach.

    PubMed

    Mardinoglu, Adil; Gatto, Francesco; Nielsen, Jens

    2013-09-01

    Altered metabolism is linked to the appearance of various human diseases and a better understanding of disease-associated metabolic changes may lead to the identification of novel prognostic biomarkers and the development of new therapies. Genome-scale metabolic models (GEMs) have been employed for studying human metabolism in a systematic manner, as well as for understanding complex human diseases. In the past decade, such metabolic models - one of the fundamental aspects of systems biology - have started contributing to the understanding of the mechanistic relationship between genotype and phenotype. In this review, we focus on the construction of the Human Metabolic Reaction database, the generation of healthy cell type- and cancer-specific GEMs using different procedures, and the potential applications of these developments in the study of human metabolism and in the identification of metabolic changes associated with various disorders. We further examine how in silico genome-scale reconstructions can be employed to simulate metabolic flux distributions and how high-throughput omics data can be analyzed in a context-dependent fashion. Insights yielded from this mechanistic modeling approach can be used for identifying new therapeutic agents and drug targets as well as for the discovery of novel biomarkers. Finally, recent advancements in genome-scale modeling and the future challenge of developing a model of whole-body metabolism are presented. The emergent contribution of GEMs to personalized and translational medicine is also discussed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
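
    Flux distributions are typically simulated from a GEM by flux balance analysis: maximize an objective reaction subject to steady state (S v = 0) and flux bounds. A minimal sketch follows; the toy stoichiometric matrix and bounds are illustrative assumptions:

      # Flux balance analysis as a linear program: max c.v s.t. S v = 0, bounds.
      import numpy as np
      from scipy.optimize import linprog

      S = np.array([[1, -1, 0],     # toy network: uptake -> A -> biomass
                    [0, 1, -1]])
      bounds = [(0, 10), (0, 10), (0, 10)]
      c = np.zeros(3)
      c[2] = -1.0                   # linprog minimizes, so negate the objective
      res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
      optimal_flux = res.x          # flux distribution maximizing the biomass flux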

  12. Information Tailoring Enhancements for Large-Scale Social Data

    DTIC Science & Technology

    2016-06-15

    Intelligent Automation Incorporated, Progress Report No. 3: Information Tailoring Enhancements for Large-Scale Social Data. Only front-matter and table-of-contents fragments survive extraction, including the entries "Work Performed within This Reporting Period" and "Enhanced Named Entity Recognition (NER)".

  13. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations.

    PubMed

    Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S

    2014-12-09

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.
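
    A sketch in the spirit of CB-FS: a supervised classifier is trained to predict conformational-state (Markov-state) labels from candidate order parameters, and its feature importances rank which degrees of freedom separate the states. Random-forest importances stand in here for the paper's SML machinery, and the arrays are illustrative:

      # Rank order parameters by how well they discriminate Markov states.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      features = np.random.rand(2000, 50)           # frames x candidate order parameters (demo)
      state_labels = np.random.randint(0, 3, 2000)  # Markov-state label per frame (demo)
      clf = RandomForestClassifier(n_estimators=200).fit(features, state_labels)
      ranking = np.argsort(clf.feature_importances_)[::-1]  # most discriminative first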

  14. A large-scale peer teaching programme - acceptance and benefit.

    PubMed

    Schuetz, Elisabeth; Obirei, Barbara; Salat, Daniela; Scholz, Julia; Hann, Dagmar; Dethleffsen, Kathrin

    2017-08-01

    The involvement of students in the delivery of university teaching through peer-assisted learning formats is common practice. Publications on this topic have focused exclusively on strictly defined situations within the curriculum and selected target groups. This study, in contrast, presents and evaluates a large-scale, structured and quality-assured peer teaching programme offering diverse and targeted courses throughout the preclinical part of the medical curriculum. The programme consists of subject-specific and interdisciplinary tutorials that address all scientific, physiological and anatomical subjects of the preclinical curriculum, as well as tutorials whose content exceeds the formal curriculum. In the study year 2013/14, a total of 1,420 lessons were offered as part of the programme. Paper-based evaluations were conducted over the full range of courses. Acceptance and benefit of the programme were evaluated in a retrospective study covering the period 2012 to 2014. Usage of tutorials by students who commenced their studies in 2012/13 (n=959) was analysed from 2012 to 2014. Based on the results of 13 first assessments in the preclinical subjects anatomy, biochemistry and physiology, the students were assigned to one of five groups, which were then compared according to participation in the tutorials. To investigate the benefit of the tutorials, the results of biochemistry re-assessments of participants and non-participants in the years 2012 to 2014 (n=188, 172 and 204, respectively) were compared using Kolmogorov-Smirnov and chi-square tests as well as the effect size Cohen's d. Almost 70% of the students attended the voluntary additional programme during their preclinical studies. The students participating in the tutorials had achieved different levels of proficiency in the first assessments. The acceptance of different kinds of tutorials appears to correlate with their
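
    The comparison statistics named above are all standard. A minimal sketch with illustrative score arrays:

      # Kolmogorov-Smirnov and chi-square tests plus Cohen's d effect size.
      import numpy as np
      from scipy.stats import ks_2samp, chi2_contingency

      def cohens_d(a, b):
          pooled = np.sqrt(((len(a) - 1) * np.var(a, ddof=1) +
                            (len(b) - 1) * np.var(b, ddof=1)) / (len(a) + len(b) - 2))
          return (np.mean(a) - np.mean(b)) / pooled

      participants = np.array([68, 74, 71, 80, 77, 65])   # demo re-assessment scores
      others = np.array([61, 66, 70, 58, 72, 63])
      ks_stat, ks_p = ks_2samp(participants, others)
      chi2, chi_p, _, _ = chi2_contingency([[40, 12], [25, 30]])  # demo pass/fail table
      effect = cohens_d(participants, others)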

  15. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  16. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    PubMed

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization.
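
    The pipeline reduces to two steps: aggregate individual ties into a weighted graph over equal-area pixels, then partition it by modularity. A minimal sketch; greedy modularity maximization stands in for whichever optimizer the study used, and the edges are illustrative:

      # Modularity clustering of a weighted pixel-to-pixel interaction graph.
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      G = nx.Graph()
      # (pixel_a, pixel_b, number of ties between their users) - demo values
      for a, b, w in [("px1", "px2", 120), ("px2", "px3", 95), ("px1", "px3", 60),
                      ("px4", "px5", 80), ("px3", "px4", 5)]:
          G.add_edge(a, b, weight=w)
      communities = greedy_modularity_communities(G, weight="weight")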

  17. Analyzing large scale genomic data on the cloud with Sparkhit

    PubMed Central

    Huang, Liren; Krüger, Jan

    2018-01-01

    Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge on large scale genomic analytics. Existing tools use different distributed computational platforms to scale-out bioinformatics workloads. However, the scalability of these tools is not efficient. Moreover, they have heavy run time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, which includes the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29253074

  18. Large-scale topology and the default mode network in the mouse connectome

    PubMed Central

    Stafford, James M.; Jarrett, Benjamin R.; Miranda-Dominguez, Oscar; Mills, Brian D.; Cain, Nicholas; Mihalas, Stefan; Lahvis, Garet P.; Lattal, K. Matthew; Mitchell, Suzanne H.; David, Stephen V.; Fryer, John D.; Nigg, Joel T.; Fair, Damien A.

    2014-01-01

    Noninvasive functional imaging holds great promise for serving as a translational bridge between human and animal models of various neurological and psychiatric disorders. However, despite a depth of knowledge of the cellular and molecular underpinnings of atypical processes in mouse models, little is known about the large-scale functional architecture measured by functional brain imaging, limiting translation to human conditions. Here, we provide a robust processing pipeline to generate high-resolution, whole-brain resting-state functional connectivity MRI (rs-fcMRI) images in the mouse. Using a mesoscale structural connectome (i.e., an anterograde tracer mapping of axonal projections across the mouse CNS), we show that rs-fcMRI in the mouse has strong structural underpinnings, validating our procedures. We next directly show that large-scale network properties previously identified in primates are present in rodents, although they differ in several ways. Last, we examine the existence of the so-called default mode network (DMN)—a distributed functional brain system identified in primates as being highly important for social cognition and overall brain function and atypically functionally connected across a multitude of disorders. We show the presence of a potential DMN in the mouse brain both structurally and functionally. Together, these studies confirm the presence of basic network properties and functional networks of high translational importance in structural and functional systems in the mouse brain. This work clears the way for an important bridge measurement between human and rodent models, enabling us to make stronger conclusions about how regionally specific cellular and molecular manipulations in mice relate back to humans. PMID:25512496

  19. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are needed. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change and solid-Earth dynamic change. For the purpose of establishing a Moon-based Earth-observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth-science phenomena; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth-observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  20. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarez, Marcello; Baldauf, T.; Bond, J. Richard

    2014-12-15

    The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, from where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc (f_NL^eq), natural target levels of sensitivity are Δf_NL^(loc,eq) ≃ 1. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.

  1. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    NASA Astrophysics Data System (ADS)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  2. Large-scale interaction profiling of PDZ domains through proteomic peptide-phage display using human and viral phage peptidomes.

    PubMed

    Ivarsson, Ylva; Arnold, Roland; McLaughlin, Megan; Nim, Satra; Joshi, Rakesh; Ray, Debashish; Liu, Bernard; Teyra, Joan; Pawson, Tony; Moffat, Jason; Li, Shawn Shun-Cheng; Sidhu, Sachdev S; Kim, Philip M

    2014-02-18

    The human proteome contains a plethora of short linear motifs (SLiMs) that serve as binding interfaces for modular protein domains. Such interactions are crucial for signaling and other cellular processes, but are difficult to detect because of their low to moderate affinities. Here we developed a dedicated approach, proteomic peptide-phage display (ProP-PD), to identify domain-SLiM interactions. Specifically, we generated phage libraries containing all human and viral C-terminal peptides using custom oligonucleotide microarrays. With these libraries we screened the nine PSD-95/Dlg/ZO-1 (PDZ) domains of human Densin-180, Erbin, Scribble, and Disks large homolog 1 for peptide ligands. We identified several known and putative interactions potentially relevant to cellular signaling pathways and confirmed interactions between full-length Scribble and the target proteins β-PIX, plakophilin-4, and guanylate cyclase soluble subunit α-2 using colocalization and coimmunoprecipitation experiments. The affinities of recombinant Scribble PDZ domains and the synthetic peptides representing the C termini of these proteins were in the 1- to 40-μM range. Furthermore, we identified several well-established host-virus protein-protein interactions, and confirmed that PDZ domains of Scribble interact with the C terminus of Tax-1 of human T-cell leukemia virus with micromolar affinity. Previously unknown putative viral protein ligands for the PDZ domains of Scribble and Erbin were also identified. Thus, we demonstrate that our ProP-PD libraries are useful tools for probing PDZ domain interactions. The method can be extended to interrogate all potential eukaryotic, bacterial, and viral SLiMs and we suggest it will be a highly valuable approach for studying cellular and pathogen-host protein-protein interactions.

  3. Analysis of a large-scale weighted network of one-to-one human communication

    NASA Astrophysics Data System (ADS)

    Onnela, Jukka-Pekka; Saramäki, Jari; Hyvönen, Jörkki; Szabó, Gábor; Argollo de Menezes, M.; Kaski, Kimmo; Barabási, Albert-László; Kertész, János

    2007-06-01

    We construct a connected network of 3.9 million nodes from mobile phone call records, which can be regarded as a proxy for the underlying human communication network at the societal level. We assign two weights to each edge to reflect the strength of social interaction, namely the aggregate call duration and the cumulative number of calls placed between the individuals over a period of 18 weeks. We present a detailed analysis of this weighted network by examining its degree, strength, and weight distributions, as well as its topological assortativity and weighted assortativity, clustering and weighted clustering, together with correlations between these quantities. We give an account of motif intensity and coherence distributions and compare them to a randomized reference system. We also use the concept of link overlap to measure the number of common neighbours any two adjacent nodes have, which serves as a useful local measure for identifying the interconnectedness of communities. We report a positive correlation between the overlap and weight of a link, thus providing strong quantitative evidence for the weak ties hypothesis, a central concept in social network analysis. The percolation properties of the network are found to depend on the type and order of removed links, and they can help understand how the local structure of the network manifests itself at the global level. We hope that our results will contribute to modelling weighted large-scale social networks, and believe that the systematic approach followed here can be adopted to study other weighted networks.
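
    The link overlap mentioned above has a simple closed form: for an edge (i, j) with n_ij common neighbours and endpoint degrees k_i and k_j, O_ij = n_ij / ((k_i - 1) + (k_j - 1) - n_ij). A minimal sketch (using networkx, with a toy graph standing in for the 3.9-million-node call network):

      import networkx as nx

      def link_overlap(G, i, j):
          # O_ij = n_ij / ((k_i - 1) + (k_j - 1) - n_ij): the fraction of
          # common neighbours of the two endpoints (Onnela et al. 2007)
          common = len((set(G[i]) - {j}) & (set(G[j]) - {i}))
          denom = (G.degree(i) - 1) + (G.degree(j) - 1) - common
          return common / denom if denom > 0 else 0.0

      # toy example: a triangle plus a pendant node
      G = nx.Graph([("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")])
      for u, v in G.edges():
          print(u, v, link_overlap(G, u, v))

    On the weighted call graph one would additionally correlate O_ij with the edge weight w_ij to test the weak-ties hypothesis.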

  4. Large Scale Cross Drive Correlation Of Digital Media

    DTIC Science & Technology

    2016-03-01

    Naval Postgraduate School, Monterey, California. Thesis: Large Scale Cross-Drive Correlation of Digital Media, by Joseph Van Bruaene, March 2016. The ability to make large scale cross-drive correlations among a large corpus of digital media becomes increasingly important. We propose a

  5. Accommodations for English Language Learners Taking Large-Scale Assessments: A Meta-Analysis on Effectiveness and Validity

    ERIC Educational Resources Information Center

    Kieffer, Michael J.; Lesaux, Nonie K.; Rivera, Mabel; Francis, David J.

    2009-01-01

    Including English language learners (ELLs) in large-scale assessments raises questions about the validity of inferences based on their scores. Test accommodations for ELLs are intended to reduce the impact of limited English proficiency on the assessment of the target construct, most often mathematics or science proficiency. This meta-analysis…

  6. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  7. Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate

    PubMed Central

    Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.

    2015-01-01

    Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906

  8. Sediment transport dynamics in response to large-scale human intervention

    NASA Astrophysics Data System (ADS)

    Eelkema, Menno; Wang, Zheng Bing

    2010-05-01

    The Eastern Scheldt basin in the southwestern part of the Netherlands is an elongated tidal basin of approximately 50 km in length with an average tidal range of roughly 3 meters at the inlet. Before 1969 A.D., this basin was also connected to two more tidal basins to the north through several narrow, yet deep channels. These connections were closed off with dams in the nineteen-sixties in response to the catastrophic flooding of 1953. In the inlet of the Eastern Scheldt a storm-surge barrier was built in order to safeguard against flooding during storms while retaining a part of the tidal influence inside the basin during normal conditions. This barrier was finalized in 1986. The construction of the back-barrier dams in 1965 and 1969 had a significant impact on the tidal hydrodynamics and sediment transport (Van den Berg, 1986). The effects of these interventions were still ongoing when the hydrodynamic regime was altered again by the construction of the storm-surge barrier between 1983 and 1986. This research aims to describe the hydrodynamic and morphodynamic evolution of the Eastern Scheldt between 1953 and 1983, before construction of the storm-surge barrier had started. An analysis is made of the manner in which the back-barrier dams changed the tidal flow through the basin, and how these altered hydrodynamics influenced the sediment transport and morphology. This analysis consists first of all of a description of the observed hydrodynamic and bathymetric changes. Second, these observations are used as input for a process-based hydrodynamic model (Delft3D), which is applied in order to gain more insight into the changes in sediment transport patterns. The model is used to simulate the situations before and after the closures of the connections between the Eastern Scheldt and the basins north of it. In the decades before 1965, the Eastern Scheldt exported

  9. Large-scale environments of narrow-line Seyfert 1 galaxies

    NASA Astrophysics Data System (ADS)

    Järvelä, E.; Lähteenmäki, A.; Lietzen, H.; Poudel, A.; Heinämäki, P.; Einasto, M.

    2017-09-01

    Studying large-scale environments of narrow-line Seyfert 1 (NLS1) galaxies gives a new perspective on their properties, particularly their radio loudness. The large-scale environment is believed to have an impact on the evolution and intrinsic properties of galaxies; however, NLS1 sources have not been studied in this context before. We have a large and diverse sample of 1341 NLS1 galaxies and three separate environment data sets constructed using the Sloan Digital Sky Survey. We use various statistical methods to investigate how the properties of NLS1 galaxies are connected to the large-scale environment, and compare the large-scale environments of NLS1 galaxies with other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to study how they are related. NLS1 galaxies reside in less dense environments than any of the comparison samples, thus confirming their young age. The average large-scale environment density and environmental distribution of NLS1 sources is clearly different compared to BLS1 galaxies, thus it is improbable that they could be the parent population of NLS1 galaxies and unified by orientation. Within the NLS1 class there is a trend of increasing radio loudness with increasing large-scale environment density, indicating that the large-scale environment affects their intrinsic properties. Our results suggest that the NLS1 class of sources is not homogeneous, and furthermore, that a considerable fraction of them are misclassified. We further support a published proposal to replace the traditional classification into radio-loud, and radio-quiet or radio-silent sources with a division into jetted and non-jetted sources.

  10. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    PubMed

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Double inflation - A possible resolution of the large-scale structure problem

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Villumsen, Jens V.; Vittorio, Nicola; Silk, Joseph; Juszkiewicz, Roman

    1987-01-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Omega = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of about 100 Mpc, while the small-scale structure over less than about 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations.

  12. A robust close-range photogrammetric target extraction algorithm for size and type variant targets

    NASA Astrophysics Data System (ADS)

    Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert

    2016-05-01

    The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.
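
    The abstract does not spell out the algorithm's internals, but the core idea of partitioning a bimodal intensity distribution can be sketched as an Otsu-style threshold (maximizing between-class variance) followed by connected-component centroid extraction; the frame below is a random stand-in and all sizes are illustrative:

      import numpy as np
      from scipy import ndimage

      def otsu_threshold(img, nbins=256):
          # threshold separating a bimodal intensity distribution by
          # maximizing the between-class variance w0*w1*(mu0 - mu1)^2
          hist, edges = np.histogram(img, bins=nbins)
          hist = hist.astype(float) / hist.sum()
          centers = 0.5 * (edges[:-1] + edges[1:])
          w0 = np.cumsum(hist)                    # class-0 weight
          w1 = 1.0 - w0                           # class-1 weight
          csum = np.cumsum(hist * centers)
          mu0 = csum / np.maximum(w0, 1e-12)      # class-0 mean
          mu1 = (csum[-1] - csum) / np.maximum(w1, 1e-12)   # class-1 mean
          between = w0 * w1 * (mu0 - mu1) ** 2
          return centers[np.argmax(between)]

      rng = np.random.default_rng(0)
      frame = rng.random((480, 640))              # stand-in for a video frame
      mask = frame > otsu_threshold(frame)
      labels, n = ndimage.label(mask)             # connected regions of interest
      centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))

    For real photogrammetric targets the histogram is strongly bimodal, so the threshold cleanly separates the target pattern from background and the ROI centroids mark the acquisition points.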

  13. Measuring the Large-scale Solar Magnetic Field

    NASA Astrophysics Data System (ADS)

    Hoeksema, J. T.; Scherrer, P. H.; Peterson, E.; Svalgaard, L.

    2017-12-01

    The Sun's large-scale magnetic field is important for determining global structure of the corona and for quantifying the evolution of the polar field, which is sometimes used for predicting the strength of the next solar cycle. Having confidence in the determination of the large-scale magnetic field of the Sun is difficult because the field is often near the detection limit, various observing methods all measure something a little different, and various systematic effects can be very important. We compare resolved and unresolved observations of the large-scale magnetic field from the Wilcox Solar Observatory, Heliseismic and Magnetic Imager (HMI), Michelson Doppler Imager (MDI), and Solis. Cross comparison does not enable us to establish an absolute calibration, but it does allow us to discover and compensate for instrument problems, such as the sensitivity decrease seen in the WSO measurements in late 2016 and early 2017.

  14. Allometric scaling for predicting human clearance of bisphenol A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collet, Séverine H., E-mail: s.collet@envt.fr; Picard-Hagen, Nicole, E-mail: n.hagen-picard@envt.fr; Lacroix, Marlène Z., E-mail: m.lacroix@envt.fr

    The investigation of interspecies differences in bisphenol A (BPA) pharmacokinetics (PK) may be useful for translating findings from animal studies to humans, identifying major processes involved in BPA clearance mechanisms, and predicting BPA PK parameters in man. For the first time, a large range of species in terms of body weight, from 0.02 kg (mice) to 495 kg (horses), was used to predict BPA clearance in man by an allometric approach. BPA PK was evaluated after intravenous administration of BPA in horses, sheep, pigs, dogs, rats and mice. A non-compartmental analysis was used to estimate plasma clearance and steady-state volume of distribution and predict BPA PK parameters in humans from allometric scaling. In all the species investigated, BPA plasma clearance was high and of the same order of magnitude as their respective hepatic blood flow. By allometric scaling, the human clearance was estimated to be 1.79 L/min (equivalent to 25.6 mL/kg/min) with a 95% prediction interval of 0.36 to 8.83 L/min. Our results support the hypothesis that there are highly efficient hepatic mechanisms of BPA clearance in man. - Highlights: • Allometric scaling was used to predict BPA pharmacokinetic parameters in humans. • In all species, BPA plasma clearance approached hepatic blood flow. • Human BPA clearance was estimated to be 1.79 L/min.
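
    The allometric approach reduces to fitting CL = a·BW^b, a straight line in log-log space, and reading off the prediction at human body weight. A minimal sketch (the body weights and clearances below are placeholders, not the study's data):

      import numpy as np

      # Illustrative only: body weights (kg) and plasma clearances (L/min)
      bw = np.array([0.02, 0.25, 2.5, 10.0, 60.0, 495.0])   # mouse ... horse
      cl = np.array([0.0004, 0.004, 0.03, 0.1, 0.5, 3.0])

      # allometry CL = a * BW^b, fitted as a line in log-log space
      b, log_a = np.polyfit(np.log(bw), np.log(cl), 1)
      a = np.exp(log_a)

      cl_human = a * 70.0 ** b    # extrapolate to a 70 kg human
      print(f"CL = {a:.4f} * BW^{b:.2f}; predicted human CL ~ {cl_human:.2f} L/min")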

  15. Gas Generators and Their Potential to Support Human-Scale HIADS (Hypersonic Inflatable Aerodynamic Decelerators)

    NASA Technical Reports Server (NTRS)

    Bodkin, Richard J.; Cheatwood, F. M.; Dillman, Robert A; Dinonno, John M.; Hughes, Stephen J.; Lucy, Melvin H.

    2016-01-01

    As HIAD technology progresses from 3-m diameter experimental scale to as large as 20-m diameter for human Mars entry, the mass penalties of carrying compressed gas have led the HIAD team to research current state-of-the-art gas generator approaches. Summarized below are several technologies identified in this survey, along with some of the pros and cons with respect to supporting large-scale HIAD applications.

  16. Spectral fingerprints of large-scale neuronal interactions.

    PubMed

    Siegel, Markus; Donner, Tobias H; Engel, Andreas K

    2012-01-11

    Cognition results from interactions among functionally specialized but widely distributed brain regions; however, neuroscience has so far largely focused on characterizing the function of individual brain regions and neurons therein. Here we discuss recent studies that have instead investigated the interactions between brain regions during cognitive processes by assessing correlations between neuronal oscillations in different regions of the primate cerebral cortex. These studies have opened a new window onto the large-scale circuit mechanisms underlying sensorimotor decision-making and top-down attention. We propose that frequency-specific neuronal correlations in large-scale cortical networks may be 'fingerprints' of canonical neuronal computations underlying cognitive processes.
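
    Frequency-specific correlations of this kind are commonly quantified with the magnitude-squared coherence between two signals. A minimal sketch with synthetic data (the sampling rate, the 20 Hz shared rhythm and the noise levels are arbitrary stand-ins for recordings from two regions):

      import numpy as np
      from scipy.signal import coherence

      fs = 1000.0
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(0)
      common = np.sin(2 * np.pi * 20 * t)          # shared oscillation
      x = common + rng.standard_normal(t.size)     # "region 1"
      y = 0.8 * common + rng.standard_normal(t.size)   # "region 2"

      f, Cxy = coherence(x, y, fs=fs, nperseg=1024)    # magnitude-squared coherence
      print(f"peak coherence {Cxy.max():.2f} at {f[np.argmax(Cxy)]:.1f} Hz")

    The resulting peak in Cxy at the shared frequency is the kind of frequency-specific "fingerprint" the authors describe.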

  17. A unified large/small-scale dynamo in helical turbulence

    NASA Astrophysics Data System (ADS)

    Bhat, Pallavi; Subramanian, Kandaswamy; Brandenburg, Axel

    2016-09-01

    We use high resolution direct numerical simulations (DNS) to show that helical turbulence can generate significant large-scale fields even in the presence of strong small-scale dynamo action. During the kinematic stage, the unified large/small-scale dynamo grows fields with a shape-invariant eigenfunction, with most power peaked at small scales or large k, as in Subramanian & Brandenburg. Nevertheless, the large-scale field can be clearly detected as an excess power at small k in the negatively polarized component of the energy spectrum for a forcing with positively polarized waves. Its strength B̄, relative to the total rms field B_rms, decreases with increasing magnetic Reynolds number, Re_M. However, as the Lorentz force becomes important, the field generated by the unified dynamo orders itself by saturating on successively larger scales. The magnetic integral scale for the positively polarized waves, characterizing the small-scale field, increases significantly from the kinematic stage to saturation. This implies that the small-scale field becomes as coherent as possible for a given forcing scale, which averts the Re_M-dependent quenching of B̄/B_rms. These results are obtained for 1024^3 DNS with magnetic Prandtl numbers of Pr_M = 0.1 and 10. For Pr_M = 0.1, B̄/B_rms grows from about 0.04 to about 0.4 at saturation, aided in the final stages by helicity dissipation. For Pr_M = 10, B̄/B_rms grows from much less than 0.01 to values of the order of 0.2. Our results confirm that there is a unified large/small-scale dynamo in helical turbulence.

  18. A Magnetic Bead-Integrated Chip for the Large Scale Manufacture of Normalized esiRNAs

    PubMed Central

    Wang, Zhao; Huang, Huang; Zhang, Hanshuo; Sun, Changhong; Hao, Yang; Yang, Junyu; Fan, Yu; Xi, Jianzhong Jeff

    2012-01-01

    The chemically-synthesized siRNA duplex has become a powerful and widely used tool for RNAi loss-of-function studies, but suffers from a high off-target effect problem. Recently, endoribonuclease-prepared siRNA (esiRNA) has been shown to be an attractive alternative due to its lower off-target effect and cost effectiveness. However, the current manufacturing method for esiRNA is complicated, mainly in regards to purification and normalization on a large scale. In this study, we present a magnetic bead-integrated chip that can immobilize amplification or transcription products on beads and accomplish transcription, digestion, normalization and purification in a robust and convenient manner. This chip is equipped to manufacture ready-to-use esiRNAs on a large scale. Silencing specificity and efficiency of these esiRNAs were validated at the transcriptional, translational and functional levels. Manufacture of several normalized esiRNAs in a single well, including those silencing PARP1 and BRCA1, was successfully achieved, and the esiRNAs were subsequently utilized to effectively investigate their synergistic effect on cell viability. A small esiRNA library targeting 68 tyrosine kinase genes was constructed for a loss-of-function study, and four genes were identified in regulating the migration capability of HeLa cells. We believe that this approach provides a more robust and cost-effective choice for manufacturing esiRNAs than current approaches, and therefore these heterogeneous RNA strands may have utility in most intensive and extensive applications. PMID:22761791

  19. Modulation of Small-scale Turbulence Structure by Large-scale Motions in the Absence of Direct Energy Transfer.

    NASA Astrophysics Data System (ADS)

    Brasseur, James G.; Juneja, Anurag

    1996-11-01

    Previous DNS studies indicate that small-scale structure can be directly altered through "distant" dynamical interactions by energetic forcing of the large scales. To remove the possibility of stimulating energy transfer between the large- and small-scale motions in these long-range interactions, we here perturb the large-scale structure without altering its energy content by suddenly altering only the phases of large-scale Fourier modes. Scale-dependent changes in turbulence structure appear as a nonzero difference field between two simulations from identical initial conditions of isotropic decaying turbulence, one perturbed and one unperturbed. We find that the large-scale phase perturbations leave the evolution of the energy spectrum virtually unchanged relative to the unperturbed turbulence. The difference field, on the other hand, is strongly affected by the perturbation. Most importantly, the time scale τ characterizing the change in turbulence structure at spatial scale r shortly after initiating a change in large-scale structure decreases with decreasing turbulence scale r. Thus, structural information is transferred directly from the large- to the smallest-scale motions in the absence of direct energy transfer, a long-range effect which cannot be explained by a linear mechanism such as rapid distortion theory. * Supported by ARO grant DAAL03-92-G-0117
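
    The perturbation protocol, changing the phases of large-scale Fourier modes while leaving every modal amplitude (and hence the energy spectrum) untouched, can be illustrated in one dimension; the field, cutoff wavenumber and sizes below are arbitrary stand-ins:

      import numpy as np

      rng = np.random.default_rng(0)
      u = rng.standard_normal(256)          # stand-in for one velocity component
      U = np.fft.rfft(u)
      k = np.arange(U.size)
      kc = 8                                # modes with k < kc count as "large scale"

      # replace phases of large-scale modes, keeping every |amplitude| fixed
      phases = rng.uniform(0.0, 2.0 * np.pi, U.size)
      U_pert = np.where(k < kc, np.abs(U) * np.exp(1j * phases), U)
      U_pert[0] = U[0]                      # keep the mean unchanged (and real)
      u_pert = np.fft.irfft(U_pert, n=u.size)

      # the energy spectrum is untouched mode by mode
      assert np.allclose(np.abs(U_pert), np.abs(U))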

  20. A Functional Model for Management of Large Scale Assessments.

    ERIC Educational Resources Information Center

    Banta, Trudy W.; And Others

    This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…

  1. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  2. A fast boosting-based screening method for large-scale association study in complex traits with genetic heterogeneity.

    PubMed

    Wang, Lu-Yong; Fasulo, D

    2006-01-01

    Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust way of screening SNPs at the scale of thousands. However, for larger-scale data, such as Affymetrix Human Mapping 100K GeneChip data, a faster screening method is required for whole-genome association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.
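
    As a rough illustration of boosting-based screening (the paper's exact learner is not specified in this abstract; scikit-learn's gradient boosting stands in), SNPs can be ranked by feature importance on simulated genotypes:

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier

      # Hypothetical sketch: X holds genotypes (0/1/2 minor-allele counts)
      # for many SNPs; y is case/control status. All data are simulated.
      rng = np.random.default_rng(1)
      n_samples, n_snps = 500, 2000
      X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)
      y = (X[:, 10] + X[:, 20] + rng.normal(0, 1, n_samples) > 2.5).astype(int)

      # shallow boosted trees can capture joint effects that per-SNP
      # univariate tests miss under genetic heterogeneity
      clf = GradientBoostingClassifier(n_estimators=100, max_depth=2).fit(X, y)
      top = np.argsort(clf.feature_importances_)[::-1][:50]  # candidate SNPs
      print("top-ranked SNP indices:", top[:10])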

  3. The ecological future of the North American bison: Conceiving long-term, large-scale conservation of a species

    USGS Publications Warehouse

    Sanderson, E.W.; Redford, Kent; Weber, Bill; Aune, K.; Baldes, Dick; Berger, J.; Carter, Dave; Curtin, C.; Derr, James N.; Dobrott, S.J.; Fearn, Eva; Fleener, Craig; Forrest, Steven C.; Gerlach, Craig; Gates, C. Cormack; Gross, J.E.; Gogan, P.; Grassel, Shaun M.; Hilty, Jodi A.; Jensen, Marv; Kunkel, Kyran; Lammers, Duane; List, R.; Minkowski, Karen; Olson, Tom; Pague, Chris; Robertson, Paul B.; Stephenson, Bob

    2008-01-01

    Many wide-ranging mammal species have experienced significant declines over the last 200 years; restoring these species will require long-term, large-scale recovery efforts. We highlight 5 attributes of a recent range-wide vision-setting exercise for ecological recovery of the North American bison (Bison bison) that are broadly applicable to other species and restoration targets. The result of the exercise, the “Vermejo Statement” on bison restoration, is explicitly (1) large scale, (2) long term, (3) inclusive, (4) fulfilling of different values, and (5) ambitious. It reads, in part, “Over the next century, the ecological recovery of the North American bison will occur when multiple large herds move freely across extensive landscapes within all major habitats of their historic range, interacting in ecologically significant ways with the fullest possible set of other native species, and inspiring, sustaining and connecting human cultures.” We refined the vision into a scorecard that illustrates how individual bison herds can contribute to the vision. We also developed a set of maps and analyzed the current and potential future distributions of bison on the basis of expert assessment. Although more than 500,000 bison exist in North America today, we estimated they occupy <1% of their historical range and in no place express the full range of ecological and social values of previous times. By formulating an inclusive, affirmative, and specific vision through consultation with a wide range of stakeholders, we hope to provide a foundation for conservation of bison, and other wide-ranging species, over the next 100 years.

  4. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  5. The Human Kinome Targeted by FDA Approved Multi-Target Drugs and Combination Products: A Comparative Study from the Drug-Target Interaction Network Perspective.

    PubMed

    Li, Ying Hong; Wang, Pan Pan; Li, Xiao Xu; Yu, Chun Yan; Yang, Hong; Zhou, Jin; Xue, Wei Wei; Tan, Jun; Zhu, Feng

    2016-01-01

    The human kinome is one of the most productive classes of drug targets, and there is an emerging need to treat complex diseases by means of polypharmacology (multi-target drugs and combination products). However, the advantages of multi-target drugs and combination products are still under debate. A comparative analysis between FDA-approved multi-target drugs and combination products targeting the human kinome was conducted by mapping targets onto the phylogenetic tree of the human kinome. The network-medicine approach of illustrating drug-target interactions was applied to identify popular targets of multi-target drugs and combination products. As identified, the multi-target drugs tended to inhibit target pairs in the human kinome, especially in the receptor tyrosine kinase family, while the combination products were able to act against targets of distant homology. This finding suggests choosing combination products as the better solution for designing drugs aimed at targets of distant homology. Moreover, sub-networks of drug-target interactions in specific diseases were generated, and mechanisms shared by multi-target drugs and combination products were identified. In conclusion, this study performed an analysis between approved multi-target drugs and combination products against the human kinome, which could assist the discovery of next-generation polypharmacology.
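
    The network-medicine step can be pictured as a bipartite drug-target graph in which "popular" targets are simply high-degree protein nodes. A minimal sketch (the drugs, proteins and edges below are invented for illustration, not the curated FDA data):

      import networkx as nx

      edges = [("drugA", "EGFR"), ("drugA", "ERBB2"),      # multi-target drug
               ("comboB1", "EGFR"), ("comboB2", "MTOR")]   # combination product
      G = nx.Graph(edges)

      # popular targets are the high-degree protein nodes
      proteins = {target for _, target in edges}
      popular = sorted(proteins, key=G.degree, reverse=True)
      print([(p, G.degree(p)) for p in popular])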

  6. Seismic safety in conducting large-scale blasts

    NASA Astrophysics Data System (ADS)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  7. Large scale RNAi screen in Tribolium reveals novel target genes for pest control and the proteasome as prime target.

    PubMed

    Ulrich, Julia; Dao, Van Anh; Majumdar, Upalparna; Schmitt-Engel, Christian; Schwirz, Jonas; Schultheis, Dorothea; Ströhlein, Nadi; Troelenberg, Nicole; Grossmann, Daniela; Richter, Tobias; Dönitz, Jürgen; Gerischer, Lizzy; Leboulle, Gérard; Vilcinskas, Andreas; Stanke, Mario; Bucher, Gregor

    2015-09-03

    Insect pest control is challenged by insecticide resistance and negative impacts on ecology and health. One promising pest-specific alternative is the generation of transgenic plants, which express double-stranded RNAs targeting essential genes of a pest species. Upon feeding, the dsRNA induces gene silencing in the pest, resulting in its death. However, the identification of efficient RNAi target genes remains a major challenge, as genomic tools and breeding capacity are limited in most pest insects, impeding whole-animal high-throughput screening. We use the red flour beetle Tribolium castaneum as a screening platform in order to identify the most efficient RNAi target genes. From about 5,000 randomly screened genes of the iBeetle RNAi screen we identify 11 novel and highly efficient RNAi targets. Our data allowed us to determine GO term combinations that are predictive for efficient RNAi target genes, with proteasomal genes being most predictive. Finally, we show that RNAi target genes do not appear to act synergistically and that protein sequence conservation does not correlate with the number of potential off-target sites. Our results will aid the identification of RNAi target genes in many pest species by providing a manageable number of excellent candidate genes to be tested and the proteasome as prime target. Further, the identified GO term combinations will help to identify efficient target genes from organ-specific transcriptomes. Our off-target analysis is relevant for the sequence selection used in transgenic plants.

  8. Potential for geophysical experiments in large scale tests.

    USGS Publications Warehouse

    Dieterich, J.H.

    1981-01-01

    Potential research applications for large-specimen geophysical experiments include measurements of scale dependence of physical parameters and examination of interactions with heterogeneities, especially flaws such as cracks. In addition, increased specimen size provides opportunities for improved recording resolution and greater control of experimental variables. Large-scale experiments using a special purpose low stress (100 MPa)

  9. Application of industrial scale genomics to discovery of therapeutic targets in heart failure.

    PubMed

    Mehraban, F; Tomlinson, J E

    2001-12-01

    In recent years intense activity in both academic and industrial sectors has provided a wealth of information on the human genome with an associated impressive increase in the number of novel gene sequences deposited in sequence data repositories and patent applications. This genomic industrial revolution has transformed the way in which drug target discovery is now approached. In this article we discuss how various differential gene expression (DGE) technologies are being utilized for cardiovascular disease (CVD) drug target discovery. Other approaches such as sequencing cDNA from cardiovascular derived tissues and cells coupled with bioinformatic sequence analysis are used with the aim of identifying novel gene sequences that may be exploited towards target discovery. Additional leverage from gene sequence information is obtained through identification of polymorphisms that may confer disease susceptibility and/or affect drug responsiveness. Pharmacogenomic studies are described wherein gene expression-based techniques are used to evaluate drug response and/or efficacy. Industrial-scale genomics supports and addresses not only novel target gene discovery but also the burgeoning issues in pharmaceutical and clinical cardiovascular medicine relative to polymorphic gene responses.

  10. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
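
    Schematically, ecological diffusion places the motility μ(x) inside the Laplacian, and (under the standard two-scale expansion, as assumed here) the homogenized large-scale equation keeps the same form with μ replaced by the harmonic mean of the small-scale motility:

      \frac{\partial u}{\partial t} = \nabla^2 \left[ \mu(\mathbf{x})\, u \right]
      \quad \longrightarrow \quad
      \frac{\partial c}{\partial t} = \nabla^2 \left[ \bar{\mu}\, c \right],
      \qquad \bar{\mu} = \left\langle \mu^{-1} \right\rangle^{-1},

    where c is the large-scale averaged density; locally u ≈ (μ̄/μ)c, so individuals accumulate where motility is low.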

  11. A large scale membrane-binding protein conformational change that initiates at small length scales

    NASA Astrophysics Data System (ADS)

    Grandpre, Trevor; Andorf, Matthew; Chakravarthy, Srinivas; Lamb, Robert; Poor, Taylor; Landahl, Eric

    2013-03-01

    The fusion (F) protein of parainfluenza virus 5 (PIV5) is a membrane-bound, homotrimeric glycoprotein located on the surface of PIV5 viral envelopes. Upon being triggered by the receptor-binding protein (HN), F undergoes a greater than 100 Å ATP-independent refolding event. This refolding event results in the insertion of a hydrophobic fusion peptide into the membrane of the target cell, followed by desolvation and a subsequent fusion event as the two membranes are brought together. Isothermal calorimetry and hydrophobic dye incorporation experiments indicate that the soluble construct of the F protein undergoes a conformational rearrangement event at around 55 deg C. We present the results of an initial Time-Resolved Small-Angle X-Ray Scattering (TR-SAXS) study of this large-scale, entropically driven conformational change using a temperature jump. Although the measured radius of gyration of this protein changes on a 110-second timescale, we find that the x-ray scattering intensity at higher angles (corresponding to smaller length scales in the protein) changes nearly an order of magnitude faster. We believe this may be a signature of entropically driven conformational change.
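
    For context, the radius of gyration R_g in a SAXS experiment is read off the low-angle (Guinier) regime of the scattering curve, while larger scattering vectors q probe smaller length scales; the standard Guinier approximation (valid for q R_g ≲ 1.3) is

      I(q) \approx I(0)\, e^{-q^2 R_g^2 / 3}, \qquad \ln I(q) = \ln I(0) - \frac{R_g^2}{3}\, q^2,

    so a linear fit of ln I versus q^2 at small q yields R_g, while the faster-changing high-q intensity tracks the smaller-scale rearrangements described above.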

  12. Epidemiology of forest malaria in central Vietnam: a large scale cross-sectional survey.

    PubMed

    Erhart, Annette; Ngo, Duc Thang; Phan, Van Ky; Ta, Thi Tinh; Van Overmeir, Chantal; Speybroeck, Niko; Obsomer, Valerie; Le, Xuan Hung; Le, Khanh Thuan; Coosemans, Marc; D'alessandro, Umberto

    2005-12-08

    In Vietnam, a large proportion of all malaria cases and deaths occurs in the central mountainous and forested part of the country. Indeed, forest malaria, despite intensive control activities, is still a major problem which raises several questions about its dynamics. A large-scale malaria morbidity survey to measure malaria endemicity and identify important risk factors was carried out in 43 villages situated in a forested area of Ninh Thuan province, south central Vietnam. Four thousand three hundred and six randomly selected individuals, aged 10-60 years, participated in the survey. Rag Lays (86%), traditionally living in the forest and practising "slash and burn" cultivation, represented the most common ethnic group. The overall parasite rate was 13.3% (range [0-42.3]) while Plasmodium falciparum seroprevalence was 25.5% (range [2.1-75.6]). Mapping of these two variables showed a patchy distribution, suggesting that risk factors other than remoteness and forest proximity modulated the human-vector interactions. This was confirmed by the results of the multivariate-adjusted analysis, showing that forest work was a significant risk factor for malaria infection, further increased by staying in the forest overnight (OR = 2.86; 95%CI [1.62; 5.07]). Rag Lays had a higher risk of malaria infection, which was inversely related to education level and socio-economic status. Women were less at risk than men (OR = 0.71; 95%CI [0.59; 0.86]), a possible consequence of different behaviour. This study confirms that malaria endemicity is still relatively high in this area and that the dynamics of transmission is constantly modulated by the behaviour of both humans and vectors. A well-targeted intervention reducing the "vector/forest worker" interaction, based on long-lasting insecticidal material, could be appropriate in this environment.
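
    For reference, the odds ratios and confidence intervals quoted above follow the standard 2x2-table estimates, with a, b the exposed cases and controls and c, d the unexposed cases and controls:

      OR = \frac{a\,d}{b\,c}, \qquad SE(\ln OR) = \sqrt{ \frac{1}{a} + \frac{1}{b} + \frac{1}{c} + \frac{1}{d} }, \qquad 95\%\ CI = \exp\left( \ln OR \pm 1.96\, SE \right).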

  13. Epidemiology of forest malaria in central Vietnam: a large scale cross-sectional survey

    PubMed Central

    Erhart, Annette; Thang, Ngo Duc; Van Ky, Phan; Tinh, Ta Thi; Van Overmeir, Chantal; Speybroeck, Niko; Obsomer, Valerie; Hung, Le Xuan; Thuan, Le Khanh; Coosemans, Marc; D'alessandro, Umberto

    2005-01-01

    In Vietnam, a large proportion of all malaria cases and deaths occurs in the central mountainous and forested part of the country. Indeed, forest malaria, despite intensive control activities, is still a major problem which raises several questions about its dynamics. A large-scale malaria morbidity survey to measure malaria endemicity and identify important risk factors was carried out in 43 villages situated in a forested area of Ninh Thuan province, south central Vietnam. Four thousand three hundred and six randomly selected individuals, aged 10–60 years, participated in the survey. Rag Lays (86%), traditionally living in the forest and practising "slash and burn" cultivation, represented the most common ethnic group. The overall parasite rate was 13.3% (range [0–42.3]) while Plasmodium falciparum seroprevalence was 25.5% (range [2.1–75.6]). Mapping of these two variables showed a patchy distribution, suggesting that risk factors other than remoteness and forest proximity modulated the human-vector interactions. This was confirmed by the results of the multivariate-adjusted analysis, showing that forest work was a significant risk factor for malaria infection, further increased by staying in the forest overnight (OR = 2.86; 95%CI [1.62; 5.07]). Rag Lays had a higher risk of malaria infection, which was inversely related to education level and socio-economic status. Women were less at risk than men (OR = 0.71; 95%CI [0.59; 0.86]), a possible consequence of different behaviour. This study confirms that malaria endemicity is still relatively high in this area and that the dynamics of transmission is constantly modulated by the behaviour of both humans and vectors. A well-targeted intervention reducing the "vector/forest worker" interaction, based on long-lasting insecticidal material, could be appropriate in this environment. PMID:16336671

  14. The large-scale organization of metabolic networks

    NASA Astrophysics Data System (ADS)

    Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L.

    2000-10-01

    In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.

  15. A Scaled Framework for CRISPR Editing of Human Pluripotent Stem Cells to Study Psychiatric Disease.

    PubMed

    Hazelbaker, Dane Z; Beccard, Amanda; Bara, Anne M; Dabkowski, Nicole; Messana, Angelica; Mazzucato, Patrizia; Lam, Daisy; Manning, Danielle; Eggan, Kevin; Barrett, Lindy E

    2017-10-10

    Scaling of CRISPR-Cas9 technology in human pluripotent stem cells (hPSCs) represents an important step for modeling complex disease and developing drug screens in human cells. However, variables affecting the scaling efficiency of gene editing in hPSCs remain poorly understood. Here, we report a standardized CRISPR-Cas9 approach, with robust benchmarking at each step, to successfully target and genotype a set of psychiatric disease-implicated genes in hPSCs and provide a resource of edited hPSC lines for six of these genes. We found that transcriptional state and nucleosome positioning around targeted loci was not correlated with editing efficiency. However, editing frequencies varied between different hPSC lines and correlated with genomic stability, underscoring the need for careful cell line selection and unbiased assessments of genomic integrity. Together, our step-by-step quantification and in-depth analyses provide an experimental roadmap for scaling Cas9-mediated editing in hPSCs to study psychiatric disease, with broader applicability for other polygenic diseases. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  16. Large-scale structure of randomly jammed spheres

    NASA Astrophysics Data System (ADS)

    Ikeda, Atsushi; Berthier, Ludovic; Parisi, Giorgio

    2017-05-01

    We numerically analyze the density field of three-dimensional randomly jammed packings of monodisperse soft frictionless spherical particles, paying special attention to fluctuations occurring at large length scales. We study in detail the two-point static structure factor at low wave vectors in Fourier space. We also analyze the nature of the density field in real space by studying the large-distance behavior of the two-point pair correlation function, of density fluctuations in subsystems of increasing sizes, and of the direct correlation function. We show that such real space analysis can be greatly improved by introducing a coarse-grained density field to disentangle genuine large-scale correlations from purely local effects. Our results confirm that both Fourier and real space signatures of vanishing density fluctuations at large scale are absent, indicating that randomly jammed packings are not hyperuniform. In addition, we establish that the pair correlation function displays a surprisingly complex structure at large distances, which is however not compatible with the long-range negative correlation of hyperuniform systems but fully compatible with an analytic form for the structure factor. This implies that the direct correlation function is short ranged, as we also demonstrate directly. Our results reveal that density fluctuations in jammed packings do not follow the behavior expected for random hyperuniform materials, but display instead a more complex behavior.
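
    The two-point static structure factor analyzed here is estimated directly from the N particle positions r_j; the standard estimator is

      S(\mathbf{k}) = \frac{1}{N} \left\langle \left| \sum_{j=1}^{N} e^{-i \mathbf{k} \cdot \mathbf{r}_j} \right|^2 \right\rangle,

    and hyperuniformity would require S(k) → 0 as |k| → 0; the finding above is that this limit does not vanish for randomly jammed packings.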

  17. A CRITICAL ASSESSMENT OF BIODOSIMETRY METHODS FOR LARGE-SCALE INCIDENTS

    PubMed Central

    Swartz, Harold M.; Flood, Ann Barry; Gougelet, Robert M.; Rea, Michael E.; Nicolalde, Roberto J.; Williams, Benjamin B.

    2014-01-01

    Recognition is growing regarding the possibility that terrorism or large-scale accidents could result in potential radiation exposure of hundreds of thousands of people and that the present guidelines for evaluation after such an event are seriously deficient. Therefore, there is a great and urgent need for after-the-fact biodosimetric methods to estimate radiation dose. To accomplish this goal, the dose estimates must be at the individual level, timely, accurate, and plausibly obtained in large-scale disasters. This paper evaluates current biodosimetry methods, focusing on their strengths and weaknesses in estimating human radiation exposure in large-scale disasters at three stages. First, the authors evaluate biodosimetry's ability to determine which individuals did not receive a significant exposure so they can be removed from the acute response system. Second, biodosimetry's capacity to classify those initially assessed as needing further evaluation into treatment-level categories is assessed. Third, biodosimetry's ability to guide treatment, both short- and long-term, is reviewed. The authors compare biodosimetric methods that are based on physical vs. biological parameters and evaluate the features of current dosimeters (capacity, speed and ease of getting information, and accuracy) to determine which are most useful in meeting patients' needs at each of the different stages. Results indicate that the biodosimetry methods differ in their applicability to the three different stages, and that combining physical and biological techniques may sometimes be most effective. In conclusion, biodosimetry techniques have different properties, and knowledge of their properties for meeting the different needs for different stages will result in their most effective use in a nuclear disaster mass-casualty event. PMID:20065671

  18. Spatial Fingerprints of Community Structure in Human Interaction Network for an Extensive Set of Large-Scale Regions

    PubMed Central

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329
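
    A minimal sketch of the pipeline described, aggregating individual ties into a weighted graph over equal-area geographical pixels and then applying modularity clustering (networkx's greedy modularity maximization stands in for the paper's exact method; all node IDs and weights are invented):

      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      # nodes are geographical pixels; weights count aggregated ties
      ties = [((0, 0), (0, 1), 12), ((0, 1), (1, 1), 9),
              ((5, 5), (5, 6), 15), ((5, 6), (6, 6), 7),
              ((0, 0), (5, 5), 1)]          # weak long-range link
      G = nx.Graph()
      for a, b, w in ties:
          if G.has_edge(a, b):
              G[a][b]["weight"] += w        # aggregate repeated ties
          else:
              G.add_edge(a, b, weight=w)

      # modularity clustering partitions the pixel graph into communities
      communities = greedy_modularity_communities(G, weight="weight")
      for i, com in enumerate(communities):
          print(f"community {i}: {sorted(com)}")

    Projecting the resulting communities back onto the map is what reveals the spatially cohesive regions discussed above.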

  19. SiGN: large-scale gene network estimation environment for high performance computing.

    PubMed

    Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/ .
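
    As an illustration of one model family listed above, a graphical Gaussian model can be estimated with the graphical lasso, reading network edges off the nonzero entries of the estimated precision (inverse covariance) matrix; the expression data below are simulated and the regularization strength is arbitrary:

      import numpy as np
      from sklearn.covariance import GraphicalLasso

      rng = np.random.default_rng(0)
      n_samples, n_genes = 200, 20
      X = rng.standard_normal((n_samples, n_genes))
      X[:, 1] += 0.8 * X[:, 0]          # induce one gene-gene dependency

      model = GraphicalLasso(alpha=0.1).fit(X)
      prec = model.precision_
      edges = [(i, j) for i in range(n_genes) for j in range(i + 1, n_genes)
               if abs(prec[i, j]) > 1e-6]
      print("estimated edges:", edges)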

  20. A Novel Architecture of Large-scale Communication in IOT

    NASA Astrophysics Data System (ADS)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have given a detailed account of the large-scale communication architecture of the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  1. Gravitational lenses and large scale structure

    NASA Technical Reports Server (NTRS)

    Turner, Edwin L.

    1987-01-01

    Four possible statistical tests of the large scale distribution of cosmic material are described. Each is based on gravitational lensing effects. The current observational status of these tests is also summarized.

  2. Large-Scale 1:1 Computing Initiatives: An Open Access Database

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; McLeod, Scott; Flora, Kevin; Sauers, Nick J.; Kannan, Sathiamoorthy; Sincar, Mehmet

    2013-01-01

    This article details the spread and scope of large-scale 1:1 computing initiatives around the world. What follows is a review of the existing literature around 1:1 programs followed by a description of the large-scale 1:1 database. Main findings include: 1) the XO and the Classmate PC dominate large-scale 1:1 initiatives; 2) if professional…

  3. A new way to protect privacy in large-scale genome-wide association studies.

    PubMed

    Kamm, Liina; Bogdanov, Dan; Laur, Sven; Vilo, Jaak

    2013-04-01

    Increased availability of various genotyping techniques has initiated a race for finding genetic markers that can be used in diagnostics and personalized medicine. Although many genetic risk factors are known, key causes of common diseases with complex heritage patterns are still unknown. Identification of such complex traits requires a targeted study over a large collection of data. Ideally, such studies bring together data from many biobanks. However, data aggregation on such a large scale raises many privacy issues. We show how to conduct such studies without violating privacy of individual donors and without leaking the data to third parties. The presented solution has provable security guarantees. Supplementary data are available at Bioinformatics online.
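
    A standard building block behind such privacy-preserving computation is additive secret sharing: each value is split into random shares so that no single party learns anything, while sums can still be computed share-wise. A minimal sketch (illustrative only, not the paper's exact protocol):

      import secrets

      P = 2**61 - 1  # a public prime modulus

      def share(x, n=3):
          # split integer x into n additive shares mod P; any n-1 shares
          # are uniformly random and reveal nothing about x
          parts = [secrets.randbelow(P) for _ in range(n - 1)]
          parts.append((x - sum(parts)) % P)
          return parts

      def reconstruct(parts):
          return sum(parts) % P

      genotype = 2                       # e.g., a minor-allele count
      shares = share(genotype)
      assert reconstruct(shares) == genotype
      # share-wise addition of two secrets yields shares of their sum,
      # enabling aggregate statistics without exposing individual genotypes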

  4. Biogrid--a microfluidic device for large-scale enzyme-free dissociation of stem cell aggregates.

    PubMed

    Wallman, Lars; Åkesson, Elisabet; Ceric, Dario; Andersson, Per Henrik; Day, Kelly; Hovatta, Outi; Falci, Scott; Laurell, Thomas; Sundström, Erik

    2011-10-07

    Culturing stem cells as free-floating aggregates in suspension facilitates large-scale production of cells in closed systems for clinical use. To comply with GMP standards, the use of substances such as proteolytic enzymes should be avoided. Instead of enzymatic dissociation, the growing cell aggregates may be mechanically cut at passage, but available methods are not compatible with large-scale cell production, and hence translation into the clinic becomes a severe bottleneck. We have developed the Biogrid device, which consists of an array of micrometer-scale knife edges, micro-fabricated in silicon, and a manifold in which the microgrid is placed across the central fluid channel. By connecting one side of the Biogrid to a syringe or a pump and the other side to the cell culture, the culture medium with suspended cell aggregates can be aspirated, forcing the aggregates through the microgrid, and ejected back to the cell culture container. Large aggregates are thereby dissociated into smaller fragments while small aggregates pass through the microgrid unaffected. As proof-of-concept, we demonstrate that the Biogrid device can be successfully used for repeated passage of human neural stem/progenitor cells cultured as so-called neurospheres, as well as for passage of suspension cultures of human embryonic stem cells. We also show that human neural stem/progenitor cells tolerate transient pressure changes far exceeding those that will occur in a fluidic system incorporating the Biogrid microgrids. Thus, by using the Biogrid device it is possible to mechanically passage large quantities of cells in suspension cultures in closed fluidic systems, without the use of proteolytic enzymes.

  5. A review of sensing technologies for small and large-scale touch panels

    NASA Astrophysics Data System (ADS)

    Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna

    2017-06-01

    A touch panel is an input device for human computer interaction. It consists of a network of sensors, a sampling circuit and a microcontroller for detecting and locating a touch input. Touch input can come from either a finger or a stylus, depending upon the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with their fingers instead of traditional input devices like the keyboard and mouse. Touch sensing technology is not new. At the time of this writing, various technologies are available in the market, and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.

  6. A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics.

    PubMed

    Halloran, John T; Rocke, David M

    2018-05-04

    Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches that necessitate the careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires only about a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to only about a fifth of the original runtime. Importantly, these speedups only affect the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under Apache license at bitbucket.org/jthalloran/percolator_upgrade.
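
    For illustration, a minimal sketch of the target-decoy rescoring idea using scikit-learn's LinearSVC in place of Percolator's l2-SVM-MFN/TRON engine (synthetic features; not the Percolator codebase):

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        # Hypothetical PSM features (e.g., XCorr, deltaCn, mass error);
        # label 1 = target match, -1 = decoy match.
        X = np.vstack([rng.normal(0.5, 1.0, (1000, 3)),
                       rng.normal(-0.5, 1.0, (1000, 3))])
        y = np.array([1] * 1000 + [-1] * 1000)

        # An L2-regularized linear SVM with squared hinge loss, loosely
        # analogous to the solvers the paper optimizes.
        clf = LinearSVC(loss="squared_hinge", C=1.0).fit(X, y)
        rescored = clf.decision_function(X)  # recalibrated PSM scores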

  7. Adverse drug reaction prediction using scores produced by large-scale drug-protein target docking on high-performance computing machines.

    PubMed

    LaBute, Montiago X; Zhang, Xiaohua; Lenderman, Jason; Bennion, Brian J; Wong, Sergio E; Lightstone, Felice C

    2014-01-01

    Late-stage or post-market identification of adverse drug reactions (ADRs) is a significant public health issue and a source of major economic liability for drug development. Thus, reliable in silico screening of drug candidates for possible ADRs would be advantageous. In this work, we introduce a computational approach that predicts ADRs by combining the results of molecular docking with known ADR information from DrugBank and SIDER. We employed a recently parallelized version of AutoDock Vina (VinaLC) to dock 906 small molecule drugs to a virtual panel of 409 DrugBank protein targets. L1-regularized logistic regression models were trained on the resulting docking scores of a 560-compound subset from the initial 906 compounds to predict 85 side effects, grouped into 10 ADR phenotype groups. Only 21% (87 out of 409) of the drug-protein binding features involve known targets of the drug subset, providing a significant probe of off-target effects. As a control, associations of this drug subset with the 555 annotated targets of these compounds, as reported in DrugBank, were used as features to train a separate group of models. The Vina off-target models and the DrugBank on-target models yielded comparable median areas under the receiver operating characteristic curve (AUCs) during 10-fold cross-validation (0.60-0.69 and 0.61-0.74, respectively). Evidence was found in the PubMed literature to support several putative ADR-protein associations identified by our analysis. Among them, several associations between neoplasm-related ADRs and known tumor suppressor and tumor invasiveness marker proteins were found. A dual role for interstitial collagenase in both neoplasms and aneurysm formation was also identified. These associations all involve off-target proteins and could not have been found using available drug/on-target interaction data. This study illustrates a path forward to comprehensive ADR virtual screening that can potentially scale with increasing number
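
    A toy sketch of the modeling step as described, L1-regularized logistic regression over docking-score features with 10-fold cross-validated AUC (random placeholder data; not the authors' code):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        docking_scores = rng.normal(size=(560, 409))  # drugs x targets (placeholder)
        has_adr = rng.integers(0, 2, size=560)        # one ADR phenotype label per drug

        # The L1 penalty drives most target weights to zero, leaving a sparse
        # set of putative (off-)target associations for each ADR group.
        model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        aucs = cross_val_score(model, docking_scores, has_adr, cv=10, scoring="roc_auc")
        print(f"median AUC: {np.median(aucs):.2f}")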

  8. Wedge measures parallax separations...on large-scale 70-mm

    Treesearch

    Steven L. Wert; Richard J. Myhre

    1967-01-01

    A new parallax wedge (range: 1.5 to 2 inches) has been designed for use with large-scale 70-mm aerial photographs. The narrow separation of the wedge allows the user to measure small parallax separations that are characteristic of large-scale photographs.

  9. MiR-21 is enriched in the RNA-induced silencing complex and targets COL4A1 in human granulosa cell lines.

    PubMed

    Mase, Yuri; Ishibashi, Osamu; Ishikawa, Tomoko; Takizawa, Takami; Kiguchi, Kazushige; Ohba, Takashi; Katabuchi, Hidetaka; Takeshita, Toshiyuki; Takizawa, Toshihiro

    2012-10-01

    MicroRNAs (miRNAs) are noncoding small RNAs that play important roles in a variety of physiological and pathological events. In this study, we performed large-scale profiling of EIF2C2-bound miRNAs in 3 human granulosa-derived cell lines (ie, KGN, HSOGT, and GC1a) by high-throughput sequencing and found that miR-21 accounted for more than 80% of EIF2C2-bound miRNAs, suggesting that it was enriched in the RNA-induced silencing complex (RISC) and played a functional role in human granulosa cell (GC) lines. We also found high expression levels of miR-21 in primary human GCs. Assuming that miR-21 target mRNAs are enriched in RISC, we performed cDNA cloning of EIF2C2-bound mRNAs in KGN cells. We identified COL4A1 mRNA as a miR-21 target in the GC lines. These data suggest that miR-21 is involved in the regulation of the synthesis of COL4A1, a component of the basement membrane surrounding the GC layer and granulosa-embedded extracellular structure.

  10. Techniques for Large-Scale Bacterial Genome Manipulation and Characterization of the Mutants with Respect to In Silico Metabolic Reconstructions.

    PubMed

    diCenzo, George C; Finan, Turlough M

    2018-01-01

    The rate at which all genes within a bacterial genome can be identified far exceeds the ability to characterize these genes. To assist in associating genes with cellular functions, a large-scale bacterial genome deletion approach can be employed to rapidly screen tens to thousands of genes for desired phenotypes. Here, we provide a detailed protocol for the generation of deletions of large segments of bacterial genomes that relies on the activity of a site-specific recombinase. In this procedure, two recombinase recognition target sequences are introduced into known positions of a bacterial genome through single cross-over plasmid integration. Subsequent expression of the site-specific recombinase mediates recombination between the two target sequences, resulting in the excision of the intervening region and its loss from the genome. We further illustrate how this deletion system can be readily adapted to function as a large-scale in vivo cloning procedure, in which the region excised from the genome is captured as a replicative plasmid. We next provide a procedure for the metabolic analysis of bacterial large-scale genome deletion mutants using the Biolog Phenotype MicroArray™ system. Finally, a pipeline is described, and a sample Matlab script is provided, for the integration of the obtained data with a draft metabolic reconstruction for the refinement of the reactions and gene-protein-reaction relationships in a metabolic reconstruction.
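
    As a rough illustration of the final integration step, a sketch of testing a large genome deletion against a draft metabolic reconstruction, assuming the COBRApy toolbox in Python rather than the authors' MATLAB pipeline (file name and gene IDs are hypothetical):

        import cobra

        # Load a draft metabolic reconstruction (hypothetical file name).
        model = cobra.io.read_sbml_model("draft_reconstruction.xml")

        # Knock out every gene lost in a large-scale deletion, then ask whether
        # the model still predicts growth; disagreement with the observed Biolog
        # phenotype flags gene-protein-reaction relationships to refine.
        with model:  # model changes are reverted when the block exits
            for gene_id in ["g0101", "g0102", "g0103"]:  # hypothetical deleted genes
                model.genes.get_by_id(gene_id).knock_out()
            predicted_growth = model.slim_optimize()

        print("growth predicted" if predicted_growth > 1e-6 else "no growth predicted")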

  11. Targeted carbon conservation at national scales with high-resolution monitoring

    PubMed Central

    Asner, Gregory P.; Knapp, David E.; Martin, Roberta E.; Tupayachi, Raul; Anderson, Christopher B.; Mascaro, Joseph; Sinca, Felipe; Chadwick, K. Dana; Higgins, Mark; Farfan, William; Llactayo, William; Silman, Miles R.

    2014-01-01

    Terrestrial carbon conservation can provide critical environmental, social, and climate benefits. Yet, the geographically complex mosaic of threats to, and opportunities for, conserving carbon in landscapes remains largely unresolved at national scales. Using a new high-resolution carbon mapping approach applied to Perú, a megadiverse country undergoing rapid land use change, we found that at least 0.8 Pg of aboveground carbon stocks are at imminent risk of emission from land use activities. Map-based information on the natural controls over carbon density, as well as current ecosystem threats and protections, revealed three biogeographically explicit strategies that fully offset forthcoming land-use emissions. High-resolution carbon mapping affords targeted interventions to reduce greenhouse gas emissions in rapidly developing tropical nations. PMID:25385593

  12. Targeted carbon conservation at national scales with high-resolution monitoring.

    PubMed

    Asner, Gregory P; Knapp, David E; Martin, Roberta E; Tupayachi, Raul; Anderson, Christopher B; Mascaro, Joseph; Sinca, Felipe; Chadwick, K Dana; Higgins, Mark; Farfan, William; Llactayo, William; Silman, Miles R

    2014-11-25

    Terrestrial carbon conservation can provide critical environmental, social, and climate benefits. Yet, the geographically complex mosaic of threats to, and opportunities for, conserving carbon in landscapes remains largely unresolved at national scales. Using a new high-resolution carbon mapping approach applied to Perú, a megadiverse country undergoing rapid land use change, we found that at least 0.8 Pg of aboveground carbon stocks are at imminent risk of emission from land use activities. Map-based information on the natural controls over carbon density, as well as current ecosystem threats and protections, revealed three biogeographically explicit strategies that fully offset forthcoming land-use emissions. High-resolution carbon mapping affords targeted interventions to reduce greenhouse gas emissions in rapidly developing tropical nations.

  13. Quasi-Experimental Evaluation of the Effectiveness of a Large-Scale Readmission Reduction Program.

    PubMed

    Jenq, Grace Y; Doyle, Margaret M; Belton, Beverly M; Herrin, Jeph; Horwitz, Leora I

    2016-05-01

    Feasibility, effectiveness, and sustainability of large-scale readmission reduction efforts are uncertain. The Greater New Haven Coalition for Safe Transitions and Readmission Reductions was funded by the Center for Medicare & Medicaid Services (CMS) to reduce readmissions among all discharged Medicare fee-for-service (FFS) patients. To evaluate whether overall Medicare FFS readmissions were reduced through an intervention applied to high-risk discharge patients. This quasi-experimental evaluation took place at an urban academic medical center. Target discharge patients were older than 64 years with Medicare FFS insurance, residing in nearby zip codes, and discharged alive to home or facility and not against medical advice or to hospice; control discharge patients were older than 54 years with the same zip codes and discharge disposition but without Medicare FFS insurance if older than 64 years. High-risk target discharge patients were selectively enrolled in the program. Personalized transitional care, including education, medication reconciliation, follow-up telephone calls, and linkage to community resources. We measured the 30-day unplanned same-hospital readmission rates in the baseline period (May 1, 2011, through April 30, 2012) and intervention period (October 1, 2012, through May 31, 2014). We enrolled 10 621 (58.3%) of 18 223 target discharge patients (73.9% of discharge patients screened as high risk) and included all target discharge patients in the analysis. The mean (SD) age of the target discharge patients was 79.7 (8.8) years. The adjusted readmission rate decreased from 21.5% to 19.5% in the target population and from 21.1% to 21.0% in the control population, a relative reduction of 9.3%. The number needed to treat to avoid 1 readmission was 50. In a difference-in-differences analysis using a logistic regression model, the odds of readmission in the target population decreased significantly more than that of the control population in the
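
    For readers unfamiliar with the difference-in-differences design used here, a minimal sketch in which the target-by-period interaction carries the estimate (synthetic data and an assumed effect size; statsmodels in Python, not the study's analysis code):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 5000
        df = pd.DataFrame({"target": rng.integers(0, 2, n),  # 1 = target population
                           "post": rng.integers(0, 2, n)})   # 1 = intervention period
        logit_p = -1.3 - 0.12 * (df["target"] * df["post"])  # assumed true effect
        df["readmit"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        # The coefficient on target:post is the difference-in-differences
        # estimate of the intervention's effect on the odds of readmission.
        fit = smf.logit("readmit ~ target * post", data=df).fit(disp=0)
        print(fit.params["target:post"])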

  14. Large-scale evidence of dependency length minimization in 37 languages

    PubMed Central

    Futrell, Richard; Mahowald, Kyle; Gibson, Edward

    2015-01-01

    Explaining the variation between human languages and the constraints on that variation is a core goal of linguistics. In the last 20 y, it has been claimed that many striking universals of cross-linguistic variation follow from a hypothetical principle that dependency length—the distance between syntactically related words in a sentence—is minimized. Various models of human sentence production and comprehension predict that long dependencies are difficult or inefficient to process; minimizing dependency length thus enables effective communication without incurring processing difficulty. However, despite widespread application of this idea in theoretical, empirical, and practical work, there is not yet large-scale evidence that dependency length is actually minimized in real utterances across many languages; previous work has focused either on a small number of languages or on limited kinds of data about each language. Here, using parsed corpora of 37 diverse languages, we show that overall dependency lengths for all languages are shorter than conservative random baselines. The results strongly suggest that dependency length minimization is a universal quantitative property of human languages and support explanations of linguistic variation in terms of general properties of human information processing. PMID:26240370
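
    To make the quantity concrete, a small sketch (illustrative; not the authors' pipeline) computing the total dependency length of one parsed sentence and a random-linearization baseline:

        import random

        def total_dependency_length(heads):
            """Sum of head-dependent distances; heads[i] is the 1-based index
            of word i+1's syntactic head, with 0 marking the root."""
            return sum(abs((i + 1) - h) for i, h in enumerate(heads) if h != 0)

        def random_linearization_length(heads, trials=1000):
            """Mean dependency length when the same tree is ordered at random."""
            n, total = len(heads), 0
            for _ in range(trials):
                pos = list(range(n))
                random.shuffle(pos)  # pos[i] = random linear slot of word i+1
                total += sum(abs(pos[i] - pos[h - 1])
                             for i, h in enumerate(heads) if h != 0)
            return total / trials

        # "John threw out the trash" under a hypothetical parse: threw is the
        # root; John, out, and trash attach to threw; the attaches to trash.
        heads = [2, 0, 2, 5, 2]
        print(total_dependency_length(heads))      # 6
        print(random_linearization_length(heads))  # typically larger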

  15. Monitoring Great Ape and Elephant Abundance at Large Spatial Scales: Measuring Effectiveness of a Conservation Landscape

    PubMed Central

    Stokes, Emma J.; Strindberg, Samantha; Bakabana, Parfait C.; Elkan, Paul W.; Iyenguet, Fortuné C.; Madzoké, Bola; Malanda, Guy Aimé F.; Mowawa, Brice S.; Moukoumbou, Calixte; Ouakabadio, Franck K.; Rainey, Hugo J.

    2010-01-01

    Protected areas are fundamental to biodiversity conservation, but there is growing recognition of the need to extend beyond protected areas to meet the ecological requirements of species at larger scales. Landscape-scale conservation requires an evaluation of management impact on biodiversity under different land-use strategies; this is challenging and there exist few empirical studies. In a conservation landscape in northern Republic of Congo we demonstrate the application of a large-scale monitoring program designed to evaluate the impact of conservation interventions on three globally threatened species: western gorillas, chimpanzees and forest elephants, under three land-use types: integral protection, commercial logging, and community-based natural resource management. We applied distance-sampling methods to examine species abundance across different land-use types under varying degrees of management and human disturbance. We found no clear trends in abundance between land-use types. However, units with interventions designed to reduce poaching and protect habitats - irrespective of land-use type - harboured all three species at consistently higher abundance than a neighbouring logging concession undergoing no wildlife management. We applied Generalized-Additive Models to evaluate a priori predictions of species response to different landscape processes. Our results indicate that, given adequate protection from poaching, elephants and gorillas can profit from herbaceous vegetation in recently logged forests and maintain access to ecologically important resources located outside of protected areas. However, proximity to the single integrally protected area in the landscape maintained an overriding positive influence on elephant abundance, and logging roads – even subject to anti-poaching controls - were exploited by elephant poachers and had a major negative influence on elephant distribution. Chimpanzees show a clear preference for unlogged or more mature

  16. Monitoring great ape and elephant abundance at large spatial scales: measuring effectiveness of a conservation landscape.

    PubMed

    Stokes, Emma J; Strindberg, Samantha; Bakabana, Parfait C; Elkan, Paul W; Iyenguet, Fortuné C; Madzoké, Bola; Malanda, Guy Aimé F; Mowawa, Brice S; Moukoumbou, Calixte; Ouakabadio, Franck K; Rainey, Hugo J

    2010-04-23

    Protected areas are fundamental to biodiversity conservation, but there is growing recognition of the need to extend beyond protected areas to meet the ecological requirements of species at larger scales. Landscape-scale conservation requires an evaluation of management impact on biodiversity under different land-use strategies; this is challenging and there exist few empirical studies. In a conservation landscape in northern Republic of Congo we demonstrate the application of a large-scale monitoring program designed to evaluate the impact of conservation interventions on three globally threatened species: western gorillas, chimpanzees and forest elephants, under three land-use types: integral protection, commercial logging, and community-based natural resource management. We applied distance-sampling methods to examine species abundance across different land-use types under varying degrees of management and human disturbance. We found no clear trends in abundance between land-use types. However, units with interventions designed to reduce poaching and protect habitats--irrespective of land-use type--harboured all three species at consistently higher abundance than a neighbouring logging concession undergoing no wildlife management. We applied Generalized-Additive Models to evaluate a priori predictions of species response to different landscape processes. Our results indicate that, given adequate protection from poaching, elephants and gorillas can profit from herbaceous vegetation in recently logged forests and maintain access to ecologically important resources located outside of protected areas. However, proximity to the single integrally protected area in the landscape maintained an overriding positive influence on elephant abundance, and logging roads--even subject to anti-poaching controls--were exploited by elephant poachers and had a major negative influence on elephant distribution. Chimpanzees show a clear preference for unlogged or more mature forests

  17. Viral Organization of Human Proteins

    PubMed Central

    Wuchty, Stefan; Siwo, Geoffrey; Ferdig, Michael T.

    2010-01-01

    Although maps of intracellular interactions are increasingly well characterized, little is known about large-scale maps of host-pathogen protein interactions. The investigation of host-pathogen interactions can reveal features of pathogenesis and provide a foundation for the development of drugs and disease prevention strategies. A compilation of experimentally verified interactions between HIV-1 and human proteins and a set of HIV-dependency factors (HDF) allowed insights into the topology and intricate interplay between viral and host proteins on a large scale. We found that targeted and HDF proteins appear predominantly in rich-clubs, groups of human proteins that are strongly intertwined among each other. These assemblies of proteins may serve as an infection gateway, allowing the virus to take control of the human host by reaching protein pathways and diversified cellular functions in a pronounced and focused way. Particular transcription factors and protein kinases facilitate indirect interactions between HDFs and viral proteins. Discerning the entanglement of directly targeted and indirectly interacting proteins may uncover molecular and functional sites that can provide novel perspectives on the progression of HIV infection and highlight new avenues to fight this virus. PMID:20827298

  18. The Mechanism of Gene Targeting in Human Somatic Cells

    PubMed Central

    Kan, Yinan; Ruis, Brian; Lin, Sherry; Hendrickson, Eric A.

    2014-01-01

    Gene targeting in human somatic cells is of importance because it can be used to either delineate the loss-of-function phenotype of a gene or correct a mutated gene back to wild-type. Both of these outcomes require a form of DNA double-strand break (DSB) repair known as homologous recombination (HR). The mechanism of HR leading to gene targeting, however, is not well understood in human cells. Here, we demonstrate that a two-end, ends-out HR intermediate is valid for human gene targeting. Furthermore, the resolution step of this intermediate occurs via the classic DSB repair model of HR while synthesis-dependent strand annealing and Holliday Junction dissolution are, at best, minor pathways. Moreover, and in contrast to other systems, the positions of Holliday Junction resolution are evenly distributed along the homology arms of the targeting vector. Most unexpectedly, we demonstrate that when a meganuclease is used to introduce a chromosomal DSB to augment gene targeting, the mechanism of gene targeting is inverted to an ends-in process. Finally, we demonstrate that the anti-recombination activity of mismatch repair is a significant impediment to gene targeting. These observations significantly advance our understanding of HR and gene targeting in human cells. PMID:24699519

  19. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    USDA-ARS?s Scientific Manuscript database

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air-quality in such small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, ...

  20. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  1. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task because of a number of conceptual and methodological reasons. Especially, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
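
    The SARW algorithm itself is not detailed in this abstract. As background, a toy sketch of one of the baselines it is compared against, the Metropolis-Hastings random walk (MHRW), which corrects the plain random walk's bias toward high-degree nodes (hypothetical toy graph):

        import random

        def mhrw_sample(neighbors, start, n_samples):
            """Metropolis-Hastings random walk over an adjacency-list graph;
            its stationary distribution is uniform over nodes."""
            node, sample = start, []
            while len(sample) < n_samples:
                candidate = random.choice(neighbors[node])
                # Accept the move with probability min(1, deg(node)/deg(candidate)).
                if random.random() <= len(neighbors[node]) / len(neighbors[candidate]):
                    node = candidate
                sample.append(node)
            return sample

        g = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2], 4: [2]}  # toy adjacency lists
        print(mhrw_sample(g, start=1, n_samples=10))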

  2. Space-Time Controls on Carbon Sequestration Over Large-Scale Amazon Basin

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.; Cooper, Harry J.; Gu, Jiujing; Grose, Andrew; Norman, John; daRocha, Humberto R.; Starr, David O. (Technical Monitor)

    2002-01-01

    A major research focus of the LBA Ecology Program is an assessment of the carbon budget and the carbon sequestering capacity of the large scale forest-pasture system that dominates the Amazonia landscape, and its time-space heterogeneity manifest in carbon fluxes across the large scale Amazon basin ecosystem. Quantification of these processes requires a combination of in situ measurements, remotely sensed measurements from space, and a realistically forced hydrometeorological model coupled to a carbon assimilation model, capable of simulating details within the surface energy and water budgets along with the principal modes of photosynthesis and respiration. Here we describe the results of an investigation concerning the space-time controls of carbon sources and sinks distributed over the large scale Amazon basin. The results are derived from a carbon-water-energy budget retrieval system for the large scale Amazon basin, which uses a coupled carbon assimilation-hydrometeorological model as an integrating system, forced by both in situ meteorological measurements and remotely sensed radiation fluxes and precipitation retrievals derived from a combination of GOES, SSM/I, TOMS, and TRMM satellite measurements. We briefly discuss validation of (a) retrieved surface radiation fluxes and precipitation, based on 30-min averaged surface measurements taken at Ji-Parana in Rondonia and Manaus in Amazonas, and (b) modeled carbon fluxes, based on tower CO2 flux measurements taken at Reserva Jaru, Manaus, and Fazenda Nossa Senhora. The space-time controls on carbon sequestration are partitioned into sets of factors classified by: (1) above canopy meteorology, (2) incoming surface radiation, (3) precipitation interception, and (4) indigenous stomatal processes varied over the different land covers of pristine rainforest, partially and fully logged rainforests, and pasture lands. These are the principal meteorological, thermodynamical, hydrological, and biophysical

  3. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for National digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered (3.75 minutes of longitude and latitude in geographic extent) 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion, and shadow. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the near-future national large-scale digital orthophoto deployment and for the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing, (2) spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) algorithm development for generation of DTM-based and DBM-based orthophotos, (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  4. Long-Gradient Separations Coupled with Selected Reaction Monitoring for Highly Sensitive, Large Scale Targeted Protein Quantification in a Single Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Fillmore, Thomas L.; Gao, Yuqian

    2013-10-01

    Long-gradient separations coupled to tandem MS were recently demonstrated to provide a deep proteome coverage for global proteomics; however, such long-gradient separations have not been explored for targeted proteomics. Herein, we investigate the potential performance of the long-gradient separations coupled with selected reaction monitoring (LG-SRM) for targeted protein quantification. Direct comparison of LG-SRM (5 h gradient) and conventional LC-SRM (45 min gradient) showed that the long-gradient separations significantly reduced background interference levels and provided an 8- to 100-fold improvement in LOQ for target proteins in human female serum. Based on at least one surrogate peptide per protein, an LOQ of 10 ng/mL was achieved for the two spiked proteins in non-depleted human serum. The LG-SRM detection of seven out of eight endogenous plasma proteins expressed at ng/mL or sub-ng/mL levels in clinical patient sera was also demonstrated. A correlation coefficient of >0.99 was observed for the results of LG-SRM and ELISA measurements for prostate-specific antigen (PSA) in selected patient sera. Further enhancement of LG-SRM sensitivity was achieved by applying front-end IgY14 immunoaffinity depletion. Besides improved sensitivity, LG-SRM offers at least 3 times higher multiplexing capacity than conventional LC-SRM due to ~3-fold increase in average peak widths for a 300-min gradient compared to a 45-min gradient. Therefore, LG-SRM holds great potential for bridging the gap between global and targeted proteomics due to its advantages in both sensitivity and multiplexing capacity.

  5. Large-scale structure in superfluid Chaplygin gas cosmology

    NASA Astrophysics Data System (ADS)

    Yang, Rongjia

    2014-03-01

    We investigate the growth of the large-scale structure in the superfluid Chaplygin gas (SCG) model. Both linear and nonlinear growth, such as σ8 and the skewness S3, are discussed. We find the growth factor of SCG reduces to the Einstein-de Sitter case at early times while it differs from the cosmological constant model (ΛCDM) case in the large a limit. We also find there will be more structure growth on large scales in the SCG scenario than in ΛCDM, and that the variations of σ8 and S3 between SCG and ΛCDM cannot be discriminated.
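
    For reference, the standard linear growth equation and σ8 definition that such analyses build on (generic definitions in LaTeX notation, not formulas quoted from the SCG paper):

        \delta''(a) + \left( \frac{3}{a} + \frac{E'(a)}{E(a)} \right) \delta'(a)
          - \frac{3\,\Omega_{m,0}}{2\,a^{5} E^{2}(a)}\, \delta(a) = 0,
          \qquad E(a) \equiv H(a)/H_{0},

        \sigma_{8}^{2} = \frac{1}{2\pi^{2}} \int_{0}^{\infty} k^{2} P(k)\,
          W^{2}(kR)\, \mathrm{d}k,
          \qquad W(x) = \frac{3(\sin x - x\cos x)}{x^{3}}, \quad R = 8\,h^{-1}\,\mathrm{Mpc}.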

  6. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.

  7. Large Scale Anthropogenic Reduction of Forest Cover in Last Glacial Maximum Europe

    PubMed Central

    Pfeiffer, Mirjam; Kolen, Jan C. A.; Davis, Basil A. S.

    2016-01-01

    Reconstructions of the vegetation of Europe during the Last Glacial Maximum (LGM) are an enigma. Pollen-based analyses have suggested that Europe was largely covered by steppe and tundra, and forests persisted only in small refugia. Climate-vegetation model simulations on the other hand have consistently suggested that broad areas of Europe would have been suitable for forest, even in the depths of the last glaciation. Here we reconcile models with data by demonstrating that the highly mobile groups of hunter-gatherers that inhabited Europe at the LGM could have substantially reduced forest cover through the ignition of wildfires. Similar to hunter-gatherers of the more recent past, Upper Paleolithic humans were masters of the use of fire, and preferred inhabiting semi-open landscapes to facilitate foraging, hunting and travel. Incorporating human agency into a dynamic vegetation-fire model and simulating forest cover shows that even small increases in wildfire frequency over natural background levels resulted in large changes in the forested area of Europe, in part because trees were already stressed by low atmospheric CO2 concentrations and the cold, dry, and highly variable climate. Our results suggest that the impact of humans on the glacial landscape of Europe may be one of the earliest large-scale anthropogenic modifications of the earth system. PMID:27902716

  8. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
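
    As a toy illustration of combining user-defined criteria (the report's actual optimization algorithm is not described in this abstract; layers, weights, and grid are hypothetical), a plain weighted-sum suitability score:

        import numpy as np

        rng = np.random.default_rng(4)
        # Hypothetical criterion rasters, each normalized to [0, 1] on a common grid.
        criteria = {"solar_resource": rng.random((50, 50)),
                    "grid_proximity": rng.random((50, 50)),
                    "land_cover": rng.random((50, 50))}
        weights = {"solar_resource": 0.5, "grid_proximity": 0.3, "land_cover": 0.2}
        exclusion = rng.random((50, 50)) < 0.1  # e.g., protected areas

        score = sum(w * criteria[k] for k, w in weights.items())
        score[exclusion] = np.nan               # excluded cells drop out entirely
        best_cell = np.unravel_index(np.nanargmax(score), score.shape)
        print(best_cell)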

  9. Critical Issues in Large-Scale Assessment: A Resource Guide.

    ERIC Educational Resources Information Center

    Redfield, Doris

    The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…

  10. RAAS inhibitors and cardiovascular protection in large scale trials.

    PubMed

    von Lueder, Thomas G; Krum, Henry

    2013-04-01

    Hypertension, coronary artery disease and heart failure affect over half of the adult population in most Western societies, and are prime causes of CV morbidity and mortality. With the ever-increasing worldwide prevalence of CV disease due to ageing and the "diabetes" pandemic, guideline groups have recognized the importance of achieving cardioprotection in affected individuals as well as in those at risk for future CV events. The renin-angiotensin-aldosterone system (RAAS) is the most important system controlling blood pressure (BP), cardiovascular and renal function in man. As our understanding of the crucial role of RAAS in the pathogenesis of most, if not all, CV disease has expanded over the past decades, so has the development of drugs targeting its individual components. Angiotensin-converting enzyme inhibitors (ACEi), Ang-II receptor blockers (ARB), and mineralocorticoid receptor antagonists (MRA) have been evaluated in large clinical trials for their potential to mediate cardioprotection, singly or in combination. Direct renin inhibitors are currently under scrutiny, as well as novel dual-acting RAAS-blocking agents. Herein, we review the evidence generated from large-scale clinical trials of cardioprotection achieved through RAAS-blockade.

  11. State of the Art in Large-Scale Soil Moisture Monitoring

    NASA Technical Reports Server (NTRS)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  12. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, with no prior research in related fields domestically or abroad. The existing Industry 2.0 (or partly Industry 3.0) mode of production must be transformed from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". This transformation involves a great number of decisions on both the management and technology sides, such as workshop structure evolution, development of intelligent equipment, and changes in the business model; together these amount to a reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the transformation.

  13. Nonlinear Generation of shear flows and large scale magnetic fields by small scale turbulence in the ionosphere

    NASA Astrophysics Data System (ADS)

    Aburjania, G.

    2009-04-01

    EGU General Assembly 2009, abstract EGU2009-233.

  14. A simulation study demonstrating the importance of large-scale trailing vortices in wake steering

    DOE PAGES

    Fleming, Paul; Annoni, Jennifer; Churchfield, Matthew; ...

    2018-05-14

    In this article, we investigate the role of flow structures generated in wind farm control through yaw misalignment. A pair of counter-rotating vortices are shown to be important in deforming the shape of the wake and in explaining the asymmetry of wake steering in oppositely signed yaw angles. We motivate the development of new physics for control-oriented engineering models of wind farm control, which include the effects of these large-scale flow structures. Such a new model would improve the predictability of control-oriented models. Results presented in this paper indicate that wind farm control strategies, based on new control-oriented models with new physics, that target total flow control over wake redirection may be different, and perhaps more effective, than current approaches. We propose that wind farm control and wake steering should be thought of as the generation of large-scale flow structures, which will aid in the improved performance of wind farms.

  15. A simulation study demonstrating the importance of large-scale trailing vortices in wake steering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleming, Paul; Annoni, Jennifer; Churchfield, Matthew

    In this article, we investigate the role of flow structures generated in wind farm control through yaw misalignment. A pair of counter-rotating vortices are shown to be important in deforming the shape of the wake and in explaining the asymmetry of wake steering in oppositely signed yaw angles. We motivate the development of new physics for control-oriented engineering models of wind farm control, which include the effects of these large-scale flow structures. Such a new model would improve the predictability of control-oriented models. Results presented in this paper indicate that wind farm control strategies, based on new control-oriented models with new physics, that target total flow control over wake redirection may be different, and perhaps more effective, than current approaches. We propose that wind farm control and wake steering should be thought of as the generation of large-scale flow structures, which will aid in the improved performance of wind farms.

  16. Can global hydrological models reproduce large scale river flood regimes?

    NASA Astrophysics Data System (ADS)

    Eisner, Stephanie; Flörke, Martina

    2013-04-01

    River flooding remains one of the most severe natural hazards. On the one hand, major flood events pose a serious threat to human well-being, causing deaths and considerable economic damage. On the other hand, the periodic occurrence of flood pulses is crucial to maintain the functioning of riverine floodplains and wetlands, and to preserve the ecosystem services the latter provide. In many regions, river floods reveal a distinct seasonality, i.e. they occur at a particular time during the year. This seasonality is related to regionally dominant flood-generating processes which can be expressed in river flood types. While in data-rich regions (esp. Europe and North America) the analysis of flood regimes can be based on observed river discharge time series, these data are sparse or lacking in many other regions of the world. This knowledge gap can be filled by global modeling approaches. However, to date most global modeling studies have focused on mean annual or monthly water availability and its change over time, while simulating discharge extremes, both floods and droughts, remains a challenge for large scale hydrological models. This study will explore the ability of the global hydrological model WaterGAP3 to simulate the large scale patterns of river flood regimes, represented by seasonal pattern and the dominant flood type. WaterGAP3 simulates the global terrestrial water balance on a 5 arc minute spatial grid (excluding Greenland and Antarctica) at a daily time step. The model accounts for human interference on river flow, i.e. water abstraction for various purposes, e.g. irrigation, and flow regulation by large dams and reservoirs. Our analysis will provide insight into the general ability of global hydrological models to reproduce river flood regimes and thus will promote the creation of a global map of river flood regimes to provide a spatially inclusive and comprehensive picture. Understanding present-day flood regimes can support both flood risk

  17. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  18. Large-Scale Coherent Vortex Formation in Two-Dimensional Turbulence

    NASA Astrophysics Data System (ADS)

    Orlov, A. V.; Brazhnikov, M. Yu.; Levchenko, A. A.

    2018-04-01

    The evolution of a vortex flow excited by an electromagnetic technique in a thin layer of a conducting liquid was studied experimentally. Small-scale vortices, excited at the pumping scale, merge with time due to the nonlinear interaction and produce large-scale structures—the inverse energy cascade is formed. The dependence of the energy spectrum in the developed inverse cascade is well described by the Kraichnan law k^(-5/3). At large scales, the inverse cascade is limited by cell sizes, and a large-scale coherent vortex flow is formed, which occupies almost the entire area of the experimental cell. The radial profile of the azimuthal velocity of the coherent vortex immediately after the pumping was switched off has been established for the first time. Inside the vortex core, the azimuthal velocity grows linearly along a radius and reaches a constant value outside the core, which agrees well with the theoretical prediction.
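
    For reference, the Kraichnan inverse-cascade spectrum cited above, in LaTeX notation (standard form; the constant and flux are not quoted from this abstract):

        E(k) = C\, \varepsilon^{2/3}\, k^{-5/3},

    where ε is the energy flux through the cascade and C is the Kolmogorov-Kraichnan constant; in two dimensions, energy flows toward small k (large scales), while enstrophy cascades toward large k with E(k) ∝ k^{-3}.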

  19. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
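
    A minimal sketch of the core idea, an M-estimator that downweights outlying observations, using statsmodels' Huber norm on synthetic data (not the RPBI pipeline or the study's code):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 200
        x = rng.normal(size=n)
        y = 2.0 * x + rng.normal(size=n)
        y[:10] += 15  # a handful of artifact-driven outliers

        X = sm.add_constant(x)
        ols = sm.OLS(y, X).fit()
        rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber M-estimator
        print(ols.params[1], rlm.params[1])  # the robust slope stays near 2.0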

  20. Intensive agriculture erodes β-diversity at large scales.

    PubMed

    Karp, Daniel S; Rominger, Andrew J; Zook, Jim; Ranganathan, Jai; Ehrlich, Paul R; Daily, Gretchen C

    2012-09-01

    Biodiversity is declining from unprecedented land conversions that replace diverse, low-intensity agriculture with vast expanses under homogeneous, intensive production. Despite documented losses of species richness, consequences for β-diversity, changes in community composition between sites, are largely unknown, especially in the tropics. Using a 10-year data set on Costa Rican birds, we find that low-intensity agriculture sustained β-diversity across large scales on a par with forest. In high-intensity agriculture, low local (α) diversity inflated β-diversity as a statistical artefact. Therefore, at small spatial scales, intensive agriculture appeared to retain β-diversity. Unlike in forest or low-intensity systems, however, high-intensity agriculture also homogenised vegetation structure over large distances, thereby decoupling the fundamental ecological pattern of bird communities changing with geographical distance. This ~40% decline in species turnover indicates a significant decline in β-diversity at large spatial scales. These findings point the way towards multi-functional agricultural systems that maintain agricultural productivity while simultaneously conserving biodiversity. © 2012 Blackwell Publishing Ltd/CNRS.
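
    For concreteness, Whittaker's classic multiplicative β (γ-diversity divided by mean α-diversity) on toy data; note that this uncorrected measure is exactly the one the abstract warns is inflated as a statistical artefact when local α is low:

        def whittaker_beta(site_species):
            """Whittaker's beta: regional richness over mean local richness.
            `site_species` is a list of species sets, one per site."""
            gamma = len(set().union(*site_species))
            mean_alpha = sum(len(s) for s in site_species) / len(site_species)
            return gamma / mean_alpha

        forest = [{"a", "b", "c"}, {"c", "d", "e"}, {"a", "e", "f"}]   # high turnover
        intensive = [{"a", "b"}, {"a", "b"}, {"a", "b"}]               # homogenized
        print(whittaker_beta(forest), whittaker_beta(intensive))       # 2.0 vs 1.0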

  1. Tools for Large-Scale Mobile Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierma, Michael

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  2. Advanced Algorithms and High-Performance Testbed for Large-Scale Site Characterization and Subsurface Target Detecting Using Airborne Ground Penetrating SAR

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Collier, James B.; Citak, Ari

    1997-01-01

    A team of the US Army Corps of Engineers (Omaha District and Engineering and Support Center, Huntsville), the Jet Propulsion Laboratory (JPL), Stanford Research Institute (SRI), and Montgomery Watson is currently in the process of planning and conducting the largest ever survey at the Former Buckley Field (60,000 acres), in Colorado, by using SRI airborne, ground penetrating, Synthetic Aperture Radar (SAR). The purpose of this survey is the detection of surface and subsurface Unexploded Ordnance (UXO) and, in a broader sense, site characterization for identification of contaminated as well as clear areas. In preparation for such a large-scale survey, JPL has been developing advanced algorithms and a high-performance testbed for processing the massive amount of SAR data expected from this site. Two key requirements of this project are the accuracy (in terms of UXO detection) and speed of SAR data processing. The first key feature of this testbed is a large degree of automation and a minimal need for human perception in the processing, to achieve an acceptable processing rate of several hundred acres per day. For accurate UXO detection, novel algorithms have been developed and implemented. These algorithms analyze dual-polarized (HH and VV) SAR data. They are based on the correlation of HH and VV SAR data and involve a rather large set of parameters for accurate detection of UXO. For each specific site, this set of parameters can be optimized by using ground truth data (i.e., known surface and subsurface UXOs). In this paper, we discuss these algorithms and their successful application for detection of surface and subsurface anti-tank mines using a data set from Yuma Proving Ground, AZ, acquired by SRI SAR.

  3. Stability of large-scale systems.

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.

    1972-01-01

    The purpose of this paper is to present the results obtained in stability study of large-scale systems based upon the comparison principle and vector Liapunov functions. The exposition is essentially self-contained, with emphasis on recent innovations which utilize explicit information about the system structure. This provides a natural foundation for the stability theory of dynamic systems under structural perturbations.

  4. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
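
    The paper's model is two-dimensional and unstructured; purely to illustrate the Godunov-type finite-volume idea it builds on, a one-dimensional sketch with a Rusanov flux and a crude wet/dry threshold (all parameters hypothetical):

        import numpy as np

        g = 9.81

        def rusanov_step(h, hu, dx, dt):
            """One explicit finite-volume step for the 1D shallow-water
            equations; boundary cells are simply held fixed."""
            eps = 1e-8                       # crude wet/dry threshold
            u = hu / np.maximum(h, eps)
            flux_h, flux_hu = hu, hu * u + 0.5 * g * h**2
            c = np.abs(u) + np.sqrt(g * np.maximum(h, 0.0))  # wave speeds
            a = np.maximum(c[:-1], c[1:])    # interface dissipation coefficient
            Fh = 0.5 * (flux_h[:-1] + flux_h[1:]) - 0.5 * a * (h[1:] - h[:-1])
            Fhu = 0.5 * (flux_hu[:-1] + flux_hu[1:]) - 0.5 * a * (hu[1:] - hu[:-1])
            h[1:-1] -= dt / dx * (Fh[1:] - Fh[:-1])
            hu[1:-1] -= dt / dx * (Fhu[1:] - Fhu[:-1])
            return h, hu

        # Dam-break test: 2 m of water upstream, 0.5 m downstream.
        n, dx, dt = 200, 1.0, 0.05           # dt respects the CFL condition here
        h = np.where(np.arange(n) < n // 2, 2.0, 0.5)
        hu = np.zeros(n)
        for _ in range(100):
            h, hu = rusanov_step(h, hu, dx, dt)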

  5. Bioinformatics by Example: From Sequence to Target

    NASA Astrophysics Data System (ADS)

    Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj

    2002-12-01

    With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The presence of these additional bioinformatics tools stirs great anxiety among experimental researchers (as well as pedagogues), since they are now faced with a wider and deeper body of knowledge spanning differing disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets those individuals who are interested in using computational methods in their teaching or research. By analyzing a real-life, pharmaceutical, multicomponent, target-based example, the reader will experience this fascinating new discipline.

  6. Human immune cell targeting of protein nanoparticles - caveospheres

    NASA Astrophysics Data System (ADS)

    Glass, Joshua J.; Yuen, Daniel; Rae, James; Johnston, Angus P. R.; Parton, Robert G.; Kent, Stephen J.; de Rose, Robert

    2016-04-01

    Nanotechnology has the power to transform vaccine and drug delivery through protection of payloads from both metabolism and off-target effects, while facilitating specific delivery of cargo to immune cells. However, evaluation of immune cell nanoparticle targeting is conventionally restricted to monocultured cell line models. We generated human caveolin-1 nanoparticles, termed caveospheres, which were efficiently functionalized with monoclonal antibodies. Using this platform, we investigated CD4+ T cell and CD20+ B cell targeting within physiological mixtures of primary human blood immune cells using flow cytometry, imaging flow cytometry, and confocal microscopy. Antibody functionalization enhanced caveosphere binding to targeted immune cells (6.6- to 43.9-fold) within mixed populations and in the presence of protein-containing fluids. Moreover, targeting caveospheres to CCR5 enabled caveosphere internalization by non-phagocytic CD4+ T cells, an important therapeutic target for HIV treatment. This efficient and flexible system of immune cell-targeted caveosphere nanoparticles holds promise for the development of advanced immunotherapeutics and vaccines.

  7. Large scale anomalies in the microwave background: causation and correlation.

    PubMed

    Aslanyan, Grigor; Easther, Richard

    2013-12-27

    Most treatments of large scale anomalies in the microwave sky are a posteriori, with unquantified look-elsewhere effects. We contrast these with physical models of specific inhomogeneities in the early Universe which can generate these apparent anomalies. Physical models predict correlations between candidate anomalies and the corresponding signals in polarization and large scale structure, reducing the impact of cosmic variance. We compute the apparent spatial curvature associated with large-scale inhomogeneities and show that it is typically small, allowing for a self-consistent analysis. As an illustrative example we show that a single large plane-wave inhomogeneity can contribute to low-ℓ mode alignment and odd-even asymmetry in the power spectra, and that the best-fit model accounts for a significant part of the claimed odd-even asymmetry. We argue that this approach can be generalized to provide a more quantitative assessment of potential large scale anomalies in the Universe.

  8. Humans and seasonal climate variability threaten large-bodied coral reef fish with small ranges.

    PubMed

    Mellin, C; Mouillot, D; Kulbicki, M; McClanahan, T R; Vigliola, L; Bradshaw, C J A; Brainard, R E; Chabanet, P; Edgar, G J; Fordham, D A; Friedlander, A M; Parravicini, V; Sequeira, A M M; Stuart-Smith, R D; Wantiez, L; Caley, M J

    2016-02-03

    Coral reefs are among the most species-rich and threatened ecosystems on Earth, yet the extent to which human stressors determine species occurrences, compared with biogeography or environmental conditions, remains largely unknown. With ever-increasing human-mediated disturbances on these ecosystems, an important question is not only how many species can inhabit local communities, but also which biological traits determine species that can persist (or not) above particular disturbance thresholds. Here we show that human pressure and seasonal climate variability are disproportionately and negatively associated with the occurrence of large-bodied and geographically small-ranging fishes within local coral reef communities. These species are 67% less likely to occur where human impact and temperature seasonality exceed critical thresholds, such as in the marine biodiversity hotspot: the Coral Triangle. Our results identify the most sensitive species and critical thresholds of human and climatic stressors, providing opportunity for targeted conservation intervention to prevent local extinctions.

  9. Humans and seasonal climate variability threaten large-bodied coral reef fish with small ranges

    PubMed Central

    Mellin, C.; Mouillot, D.; Kulbicki, M.; McClanahan, T. R.; Vigliola, L.; Bradshaw, C. J. A.; Brainard, R. E.; Chabanet, P.; Edgar, G. J.; Fordham, D. A.; Friedlander, A. M.; Parravicini, V.; Sequeira, A. M. M.; Stuart-Smith, R. D.; Wantiez, L.; Caley, M. J.

    2016-01-01

    Coral reefs are among the most species-rich and threatened ecosystems on Earth, yet the extent to which human stressors determine species occurrences, compared with biogeography or environmental conditions, remains largely unknown. With ever-increasing human-mediated disturbances on these ecosystems, an important question is not only how many species can inhabit local communities, but also which biological traits determine species that can persist (or not) above particular disturbance thresholds. Here we show that human pressure and seasonal climate variability are disproportionately and negatively associated with the occurrence of large-bodied and geographically small-ranging fishes within local coral reef communities. These species are 67% less likely to occur where human impact and temperature seasonality exceed critical thresholds, such as in the marine biodiversity hotspot: the Coral Triangle. Our results identify the most sensitive species and critical thresholds of human and climatic stressors, providing opportunity for targeted conservation intervention to prevent local extinctions. PMID:26839155

  10. Effects of biasing on the galaxy power spectrum at large scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beltran Jimenez, Jose; Durrer, Ruth

    2011-05-15

    In this paper we study the effect of biasing on the power spectrum at large scales. We show that even though nonlinear biasing does introduce a white noise contribution on large scales, the P(k) ∝ k^n behavior of the matter power spectrum on large scales may still be visible and above the white noise for about one decade. We show that the Kaiser biasing scheme, which leads to linear bias of the correlation function on large scales, also generates a linear bias of the power spectrum on rather small scales. This is a consequence of the divergence on small scales of the pure Harrison-Zeldovich spectrum. However, biasing becomes k-dependent if we damp the underlying power spectrum on small scales. We also discuss the effect of biasing on the baryon acoustic oscillations.
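
    As a small numerical sketch of the abstract's central point (all amplitudes hypothetical, not taken from the paper): a constant white-noise term added by nonlinear biasing can still leave the P(k) ∝ k^n behavior visible over roughly a decade in k.

      import numpy as np

      n = 1.0           # large-scale spectral index, P(k) ~ k^n
      A = 1.0e4         # matter power spectrum amplitude (arbitrary units)
      b = 1.2           # linear bias factor
      N_white = 1.0e2   # white-noise term from nonlinear biasing

      k = np.logspace(-4, -1, 400)        # wavenumbers (arbitrary units)
      P_gal = b**2 * A * k**n + N_white   # biased galaxy power spectrum

      visible = k[b**2 * A * k**n > N_white]  # where the k^n part dominates
      print(f"k^n term dominates for k > {visible[0]:.1e} "
            f"(a span of {visible[-1] / visible[0]:.0f}x in k)")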

  11. Proteome-scale human interactomics

    PubMed Central

    Luck, Katja; Sheynkman, Gloria M.; Zhang, Ivy; Vidal, Marc

    2017-01-01

    Cellular functions are mediated by complex interactome networks of physical, biochemical, and functional interactions between DNA sequences, RNA molecules, proteins, lipids, and small metabolites. A thorough understanding of cellular organization requires accurate and relatively complete models of interactome networks at proteome-scale. The recent publication of four human protein-protein interaction (PPI) maps represents a technological breakthrough and an unprecedented resource for the scientific community, heralding a new era of proteome-scale human interactomics. Our knowledge gained from these and complementary studies provides fresh insights into the opportunities and challenges when analyzing systematically generated interactome data, defines a clear roadmap towards the generation of a first reference interactome, and reveals new perspectives on the organization of cellular life. PMID:28284537

  12. Proteome-Scale Human Interactomics.

    PubMed

    Luck, Katja; Sheynkman, Gloria M; Zhang, Ivy; Vidal, Marc

    2017-05-01

    Cellular functions are mediated by complex interactome networks of physical, biochemical, and functional interactions between DNA sequences, RNA molecules, proteins, lipids, and small metabolites. A thorough understanding of cellular organization requires accurate and relatively complete models of interactome networks at proteome scale. The recent publication of four human protein-protein interaction (PPI) maps represents a technological breakthrough and an unprecedented resource for the scientific community, heralding a new era of proteome-scale human interactomics. Our knowledge gained from these and complementary studies provides fresh insights into the opportunities and challenges when analyzing systematically generated interactome data, defines a clear roadmap towards the generation of a first reference interactome, and reveals new perspectives on the organization of cellular life. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Very large scale monoclonal antibody purification: the case for conventional unit operations.

    PubMed

    Kelley, Brian

    2007-01-01

    Technology development initiatives targeted for monoclonal antibody purification may be motivated by manufacturing limitations and are often aimed at solving current and future process bottlenecks. A subject under debate in many biotechnology companies is whether conventional unit operations such as chromatography will eventually become limiting for the production of recombinant protein therapeutics. An evaluation of the potential limitations of process chromatography and filtration using today's commercially available resins and membranes was conducted for a conceptual process scaled to produce 10 tons of monoclonal antibody per year from a single manufacturing plant, a scale representing one of the world's largest single-plant capacities for cGMP protein production. The process employs a simple, efficient purification train using only two chromatographic and two ultrafiltration steps, modeled after a platform antibody purification train that has generated 10 kg batches in clinical production. Based on analyses of cost of goods and the production capacity of this very large scale purification process, it is unlikely that non-conventional downstream unit operations would be needed to replace conventional chromatographic and filtration separation steps, at least for recombinant antibodies.

  14. Rare variation facilitates inferences of fine-scale population structure in humans.

    PubMed

    O'Connor, Timothy D; Fu, Wenqing; Mychaleckyj, Josyf C; Logsdon, Benjamin; Auer, Paul; Carlson, Christopher S; Leal, Suzanne M; Smith, Joshua D; Rieder, Mark J; Bamshad, Michael J; Nickerson, Deborah A; Akey, Joshua M

    2015-03-01

    Understanding the genetic structure of human populations has important implications for the design and interpretation of disease mapping studies and reconstructing human evolutionary history. To date, inferences of human population structure have primarily been made with common variants. However, recent large-scale resequencing studies have shown an abundance of rare variation in humans, which may be particularly useful for making inferences of fine-scale population structure. To this end, we used an information theory framework and extensive coalescent simulations to rigorously quantify the informativeness of rare and common variation to detect signatures of fine-scale population structure. We show that rare variation affords unique insights into patterns of recent population structure. Furthermore, to empirically assess our theoretical findings, we analyzed high-coverage exome sequences in 6,515 European and African American individuals. As predicted, rare variants are more informative than common polymorphisms in revealing a distinct cluster of European-American individuals, and subsequent analyses demonstrate that these individuals are likely of Ashkenazi Jewish ancestry. Our results provide new insights into the population structure using rare variation, which will be an important factor to account for in rare variant association studies. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
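
    One standard information-theoretic statistic of the kind invoked here is Rosenberg's informativeness for assignment (In); the sketch below is an illustration only, not necessarily the authors' exact framework, and compares a rare, population-private variant with a common shared one.

      import numpy as np

      def informativeness(p):
          """Rosenberg's In for one biallelic site; p holds the allele
          frequency in each of K populations. Higher values mean the site
          is more informative about population membership."""
          p = np.asarray(p, dtype=float)
          K = len(p)
          In = 0.0
          for q in (p, 1.0 - p):          # both alleles of the site
              mean_q = q.mean()
              if mean_q > 0:
                  In -= mean_q * np.log(mean_q)
              qq = q[q > 0]
              In += np.sum(qq * np.log(qq)) / K
          return In

      print(informativeness([0.02, 0.00]))  # rare, private to population 1
      print(informativeness([0.45, 0.55]))  # common, shared across both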

  15. The role of large—scale BECCS in the pursuit of the 1.5°C target: an Earth system model perspective

    NASA Astrophysics Data System (ADS)

    Muri, Helene

    2018-04-01

    The increasing awareness of the many damaging aspects of climate change has prompted research into ways of reducing and reversing the anthropogenic increase in carbon concentrations in the atmosphere. Most emission scenarios stabilizing climate at low levels, such as the 1.5 °C target outlined by the Paris Agreement, require large-scale deployment of Bio-Energy with Carbon Capture and Storage (BECCS). Here, the potential of large-scale BECCS deployment in contributing towards the 1.5 °C global warming target is evaluated using an Earth system model, together with the associated climate responses and carbon cycle feedbacks. The geographical location of the bioenergy feedstock is shown to be key to the success of such measures in the context of temperature targets. In one scenario, although net negative emissions were reached sooner (by ~6 years) and scaled up, land-use change emissions and reductions in forest carbon sinks outweighed these gains. Re-cultivating the mid-latitudes, on the other hand, was found to be beneficial, contributing towards the 1.5 °C target, though only by -0.1 °C and -54 Gt C in avoided emissions. Obstacles remain related to competition for land with nature preservation and food security, as well as the technological availability of CCS.

  16. Engineered human skin substitutes undergo large-scale genomic reprogramming and normal skin-like maturation after transplantation to athymic mice.

    PubMed

    Klingenberg, Jennifer M; McFarland, Kevin L; Friedman, Aaron J; Boyce, Steven T; Aronow, Bruce J; Supp, Dorothy M

    2010-02-01

    Bioengineered skin substitutes can facilitate wound closure in severely burned patients, but deficiencies limit their outcomes compared with native skin autografts. To identify gene programs associated with their in vivo capabilities and limitations, we extended previous gene expression profile analyses to now compare engineered skin after in vivo grafting with both in vitro maturation and normal human skin. Cultured skin substitutes were grafted on full-thickness wounds in athymic mice, and biopsy samples for microarray analyses were collected at multiple in vitro and in vivo time points. Over 10,000 transcripts exhibited large-scale expression pattern differences during in vitro and in vivo maturation. Using hierarchical clustering, 11 different expression profile clusters were partitioned on the basis of differential sample type and temporal stage-specific activation or repression. Analyses show that the wound environment exerts a massive influence on gene expression in skin substitutes. For example, in vivo-healed skin substitutes gained the expression of many native skin-expressed genes, including those associated with epidermal barrier and multiple categories of cell-cell and cell-basement membrane adhesion. In contrast, immunological, trichogenic, and endothelial gene programs were largely lacking. These analyses suggest important areas for guiding further improvement of engineered skin for both increased homology with native skin and enhanced wound healing.

  17. Targeted and genome-scale methylomics reveals gene body signatures in human cell lines

    PubMed Central

    Ball, Madeleine Price; Li, Jin Billy; Gao, Yuan; Lee, Je-Hyuk; LeProust, Emily; Park, In-Hyun; Xie, Bin; Daley, George Q.; Church, George M.

    2012-01-01

    Cytosine methylation, an epigenetic modification of DNA, is a target of growing interest for developing high throughput profiling technologies. Here we introduce two new, complementary techniques for cytosine methylation profiling utilizing next generation sequencing technology: bisulfite padlock probes (BSPPs) and methyl sensitive cut counting (MSCC). In the first method, we designed a set of ~10,000 BSPPs distributed over the ENCODE pilot project regions to take advantage of existing expression and chromatin immunoprecipitation data. We observed a pattern of low promoter methylation coupled with high gene body methylation in highly expressed genes. Using the second method, MSCC, we gathered genome-scale data for 1.4 million HpaII sites and confirmed that gene body methylation in highly expressed genes is a consistent phenomenon over the entire genome. Our observations highlight the usefulness of techniques which are not inherently or intentionally biased in favor of only profiling particular subsets like CpG islands or promoter regions. PMID:19329998

  18. Alpha-particle radiotherapy: For large solid tumors diffusion trumps targeting.

    PubMed

    Zhu, Charles; Sempkowski, Michelle; Holleran, Timothy; Linz, Thomas; Bertalan, Thomas; Josefsson, Anders; Bruchertseifer, Frank; Morgenstern, Alfred; Sofou, Stavroula

    2017-06-01

    Diffusion limitations on the penetration of nanocarriers in solid tumors hamper their therapeutic use when labeled with α-particle emitters. This is mostly due to the α-particles' relatively short range (≤100 μm), resulting in partial tumor irradiation and limited killing. To utilize the high therapeutic potential of α-particles against solid tumors, we designed non-targeted, non-internalizing, nanometer-sized tunable carriers (pH-tunable liposomes) that are triggered to release, within the slightly acidic tumor interstitium, highly-diffusive forms of the encapsulated α-particle generator Actinium-225 (²²⁵Ac), resulting in more homogeneous distributions of the α-particle emitters, improving uniformity in tumor irradiation and increasing killing efficacies. On large multicellular spheroids (400 μm in diameter), used as surrogates of the avascular areas of solid tumors, interstitially-releasing liposomes resulted in the best growth control independent of HER2 expression, followed in performance by (a) the HER2-targeting radiolabeled antibody or (b) the non-responsive liposomes. In an orthotopic human HER2-negative mouse model, interstitially-releasing ²²⁵Ac-loaded liposomes resulted in the longest overall and median survival. This study demonstrates the therapeutic potential of a general strategy to bypass the diffusion-limited transport of radionuclide carriers in solid tumors, enabling interstitial release from non-internalizing nanocarriers of highly-diffusing, deeper tumor-penetrating molecular forms of α-particle emitters, independent of cell-targeting. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Development of large-scale functional brain networks in children.

    PubMed

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-07-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.
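
    The network properties named here (characteristic path length, clustering coefficient, "small-world" organization) can be computed with a few lines of networkx; the random connectivity matrix and the 10% edge threshold below are stand-ins for real fMRI-derived data.

      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(0)

      def largest_cc(H):
          """Path length is defined on a connected graph, so keep the
          largest connected component."""
          return H.subgraph(max(nx.connected_components(H), key=len)).copy()

      # Stand-in for a functional connectivity matrix over 90 regions; in
      # practice this would be correlations of regional fMRI time series.
      n = 90
      corr = np.abs(rng.standard_normal((n, n)))
      corr = (corr + corr.T) / 2
      np.fill_diagonal(corr, 0)

      # Keep the strongest ~10% of connections as binary edges.
      G = largest_cc(nx.from_numpy_array((corr > np.percentile(corr, 90)).astype(int)))

      L = nx.average_shortest_path_length(G)  # characteristic path length
      C = nx.average_clustering(G)            # mean clustering coefficient

      # Small-world networks have C well above a matched random graph's
      # while L stays comparable to it.
      R = largest_cc(nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0))
      sigma = (C / nx.average_clustering(R)) / (L / nx.average_shortest_path_length(R))
      print(f"L = {L:.2f}, C = {C:.2f}, small-world index = {sigma:.2f}")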

  20. Development of Large-Scale Functional Brain Networks in Children

    PubMed Central

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-01-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7–9 y) and 22 young-adults (ages 19–22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar “small-world” organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism. PMID:19621066

  1. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    PubMed

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We searched for clinical trials in cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using PubMed, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November 2004, 25 February 2007, and 25 July 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of the 152 trials, 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical

  2. Scale and the representation of human agency in the modeling of agroecosystems

    DOE PAGES

    Preston, Benjamin L.; King, Anthony W.; Ernst, Kathleen M.; ...

    2015-07-17

    Human agency is an essential determinant of the dynamics of agroecosystems. However, the manner in which agency is represented within different approaches to agroecosystem modeling is largely contingent on the scales of analysis and the conceptualization of the system of interest. While appropriate at times, narrow conceptualizations of agroecosystems can preclude consideration of how agency manifests at different scales, thereby marginalizing processes, feedbacks, and constraints that would otherwise affect model results. Modifications to the existing modeling toolkit may therefore enable more holistic representations of human agency. Model integration can assist with the development of multi-scale agroecosystem modeling frameworks that capture different aspects of agency. In addition, expanding the use of socioeconomic scenarios and stakeholder participation can assist in explicitly defining context-dependent elements of scale and agency. Finally, such approaches should be accompanied by greater recognition of the meta-agency of model users and the need for more critical evaluation of model selection and application.

  3. Statistical Measures of Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Vogeley, Michael; Geller, Margaret; Huchra, John; Park, Changbom; Gott, J. Richard

    1993-12-01

    To quantify clustering in the large-scale distribution of galaxies and to test theories for the formation of structure in the universe, we apply statistical measures to the CfA Redshift Survey. This survey is complete to m_B(0) = 15.5 over two contiguous regions which cover one-quarter of the sky and include ~11,000 galaxies. The salient features of these data are voids with diameter 30-50 h⁻¹ Mpc and coherent dense structures with a scale of ~100 h⁻¹ Mpc. Comparison with N-body simulations rules out the "standard" CDM model (Ω = 1, b = 1.5, σ₈ = 1) at the 99% confidence level because this model has insufficient power on scales λ > 30 h⁻¹ Mpc. An unbiased open-universe CDM model (Ωh = 0.2) and a biased CDM model with non-zero cosmological constant (Ωh = 0.24, λ₀ = 0.6) match the observed power spectrum. The amplitude of the power spectrum depends on the luminosity of galaxies in the sample; bright (L > L*) galaxies are more strongly clustered than faint galaxies. The paucity of bright galaxies in low-density regions may explain this dependence. To measure the topology of large-scale structure, we compute the genus of isodensity surfaces of the smoothed density field. On scales in the "non-linear" regime, ≤10 h⁻¹ Mpc, the high- and low-density regions are multiply-connected over a broad range of density threshold, as in a filamentary net. On smoothing scales >10 h⁻¹ Mpc, the topology is consistent with statistics of a Gaussian random field. Simulations of CDM models fail to produce the observed coherence of structure on non-linear scales (>95% confidence level). The underdensity probability (the frequency of regions with density contrast δρ/ρ̄ = -0.8) depends strongly on the luminosity of galaxies; underdense regions are significantly more common (>2σ) in bright (L > L*) galaxy samples than in samples which include fainter galaxies.
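
    For reference, the Gaussian random field benchmark used in such genus analyses has the closed form g(ν) = A (1 − ν²) exp(−ν²/2), where ν is the density threshold in units of σ and the amplitude A depends on the power spectrum and smoothing scale. A short sketch (amplitude arbitrary):

      import numpy as np

      def genus_gaussian(nu, amplitude=1.0):
          """Genus per unit volume of isodensity surfaces of a Gaussian
          random field at threshold nu (in units of sigma)."""
          nu = np.asarray(nu, dtype=float)
          return amplitude * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)

      # Positive genus near the median threshold (sponge-like topology),
      # negative at |nu| > 1 (isolated clusters and voids); observed maps
      # are tested against the symmetry of this curve.
      for nu in (-2.0, -1.0, 0.0, 1.0, 2.0):
          print(f"nu = {nu:+.1f}: genus ~ {genus_gaussian(nu):+.3f}")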

  4. The human footprint in the west: a large-scale analysis of human impacts

    USGS Publications Warehouse

    Leu, Matthias

    2003-01-01

    Background Humans have dramatically altered wildlands in the western United States over the past 100 years by using these lands and the resources they provide. Anthropogenic changes to the landscape, such as urban expansion and development of rural areas, influence the number and kinds of plants and wildlife that remain. In addition, western ecosystems are also affected by roads, powerlines, and other networks and land uses necessary to maintain human populations. The cumulative impacts of human presence and actions on a landscape are called the "human footprint." These impacts may affect plants and wildlife by increasing the number of synanthropic (species that benefit from human activities) bird and mammal predators and facilitating their movements through the landscape or by creating unsuitable habitats. These actions can impact plants and wildlife to such an extent that the persistence of populations or entire species is questionable. For example, greater sage-grouse (Centrocercus urophasianus) once were widespread throughout the Great Basin, but now are a focus of conservation concern because populations have declined for the past three decades across most of their range. At the USGS Forest and Rangeland Ecosystem Science Center, we are developing spatial models to better understand potential influences of the human footprint on shrubland ecosystems and associated wildlife in the western United States.

  5. Cosmic strings and the large-scale structure

    NASA Technical Reports Server (NTRS)

    Stebbins, Albert

    1988-01-01

    A possible problem for cosmic string models of galaxy formation is presented. If very large voids are common and if loop fragmentation is not much more efficient than presently believed, then it may be impossible for string scenarios to produce the observed large-scale structure with Omega sub 0 = 1 and without strong environmental biasing.

  6. Coronal hole evolution by sudden large scale changes

    NASA Technical Reports Server (NTRS)

    Nolte, J. T.; Gerassimenko, M.; Krieger, A. S.; Solodyna, C. V.

    1978-01-01

    Sudden shifts in coronal-hole boundaries observed by the S-054 X-ray telescope on Skylab between May and November, 1973, within 1 day of CMP of the holes, at latitudes not exceeding 40 deg, are compared with the long-term evolution of coronal-hole area. It is found that large-scale shifts in boundary locations can account for most if not all of the evolution of coronal holes. The temporal and spatial scales of these large-scale changes imply that they are the results of a physical process occurring in the corona. It is concluded that coronal holes evolve by magnetic-field lines' opening when the holes are growing, and by fields' closing as the holes shrink.

  7. New probes of Cosmic Microwave Background large-scale anomalies

    NASA Astrophysics Data System (ADS)

    Aiola, Simone

    Fifty years of Cosmic Microwave Background (CMB) data have played a crucial role in constraining the parameters of the ΛCDM model, where Dark Energy, Dark Matter, and Inflation are the three most important pillars not yet understood. Inflation prescribes an isotropic universe on large scales, and it generates spatially-correlated density fluctuations over the whole Hubble volume. CMB temperature fluctuations on scales bigger than a degree in the sky, affected by modes on super-horizon scale at the time of recombination, are a clean snapshot of the universe after inflation. In addition, the accelerated expansion of the universe, driven by Dark Energy, leaves a hardly detectable imprint in the large-scale temperature sky at late times. Such fundamental predictions have been tested with current CMB data and found to be in tension with what we expect from our simple ΛCDM model. Is this tension just a random fluke or a fundamental issue with the present model? In this thesis, we present a new framework to probe the lack of large-scale correlations in the temperature sky using CMB polarization data. Our analysis shows that if a suppression in the CMB polarization correlations is detected, it will provide compelling evidence for new physics on super-horizon scale. To further analyze the statistical properties of the CMB temperature sky, we constrain the degree of statistical anisotropy of the CMB in the context of the observed large-scale dipole power asymmetry. We find evidence for a scale-dependent dipolar modulation at 2.5σ. To isolate late-time signals from the primordial ones, we test the anomalously high Integrated Sachs-Wolfe effect signal generated by superstructures in the universe. We find that the detected signal is in tension with the expectations from ΛCDM at the 2.5σ level, which is somewhat smaller than what has been previously argued. To conclude, we describe the current status of CMB observations on small scales, highlighting the

  8. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    NASA Astrophysics Data System (ADS)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 × 10⁴, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Ro_t = -0.0909 to Ro_t = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
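
    The static Smagorinsky closure referred to above can be sketched in a few lines; the toy 2D shear layer and grid spacing are assumptions for illustration, and raising c_s beyond its standard value is what the abstract calls "over-damped" LES.

      import numpy as np

      def smagorinsky_nu_t(u, v, dx, cs=0.1):
          """Static Smagorinsky eddy viscosity on a uniform 2D grid:
          nu_t = (cs*dx)**2 * |S|, with |S| = sqrt(2 S_ij S_ij)."""
          dudy, dudx = np.gradient(u, dx)   # axis 0 is y, axis 1 is x
          dvdy, dvdx = np.gradient(v, dx)
          S11, S22 = dudx, dvdy
          S12 = 0.5 * (dudy + dvdx)
          S_mag = np.sqrt(2.0 * (S11**2 + S22**2 + 2.0 * S12**2))
          return (cs * dx) ** 2 * S_mag

      # Toy shear layer:
      dx = 0.01
      x = np.arange(0.0, 1.0, dx)
      X, Y = np.meshgrid(x, x)
      u = np.tanh((Y - 0.5) / 0.1)
      v = np.zeros_like(u)
      for cs in (0.1, 0.3):   # standard vs "over-damped"
          print(f"cs = {cs}: mean nu_t = {smagorinsky_nu_t(u, v, dx, cs).mean():.2e}")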

  9. Large-scale microwave anisotropy from gravitating seeds

    NASA Technical Reports Server (NTRS)

    Veeraraghavan, Shoba; Stebbins, Albert

    1992-01-01

    Topological defects could have seeded primordial inhomogeneities in cosmological matter. We examine the horizon-scale matter and geometry perturbations generated by such seeds in an expanding homogeneous and isotropic universe. Evolving particle horizons generally lead to perturbations around motionless seeds, even when there are compensating initial underdensities in the matter. We describe the pattern of the resulting large angular scale microwave anisotropy.

  10. Preventing Large-Scale Controlled Substance Diversion From Within the Pharmacy

    PubMed Central

    Martin, Emory S.; Dzierba, Steven H.; Jones, David M.

    2013-01-01

    Large-scale diversion of controlled substances (CS) from within a hospital or health system pharmacy is a rare but growing problem. It is the responsibility of pharmacy leadership to scrutinize control processes to expose weaknesses. This article reviews examples of large-scale diversion incidents and diversion techniques, and provides practical strategies to stimulate enhanced CS security within the pharmacy staff. Large-scale diversion from within a pharmacy department can be averted by a pharmacist-in-charge who is informed and proactive in taking effective countermeasures. PMID:24421497

  11. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  12. Large-scale multiplex absolute protein quantification of drug-metabolizing enzymes and transporters in human intestine, liver, and kidney microsomes by SWATH-MS: Comparison with MRM/SRM and HR-MRM/PRM.

    PubMed

    Nakamura, Kenji; Hirayama-Kurogi, Mio; Ito, Shingo; Kuno, Takuya; Yoneyama, Toshihiro; Obuchi, Wataru; Terasaki, Tetsuya; Ohtsuki, Sumio

    2016-08-01

    The purpose of the present study was to examine simultaneously the absolute protein amounts of 152 membrane and membrane-associated proteins, including 30 metabolizing enzymes and 107 transporters, in pooled microsomal fractions of human liver, kidney, and intestine by means of SWATH-MS with stable isotope-labeled internal standard peptides, and to compare the results with those obtained by MRM/SRM and high resolution (HR)-MRM/PRM. The protein expression levels of 27 metabolizing enzymes, 54 transporters, and six other membrane proteins were quantitated by SWATH-MS; other targets were below the lower limits of quantitation. Most of the values determined by SWATH-MS differed by less than 50% from those obtained by MRM/SRM or HR-MRM/PRM. Various metabolizing enzymes were expressed in liver microsomes more abundantly than in other microsomes. Ten, 13, and eight transporters listed as important for drugs by the International Transporter Consortium were quantified in liver, kidney, and intestinal microsomes, respectively. Our results indicate that SWATH-MS enables large-scale multiplex absolute protein quantification while retaining similar quantitative capability to MRM/SRM or HR-MRM/PRM. SWATH-MS is expected to be a useful methodology in the context of drug development for elucidating the molecular mechanisms of drug absorption, metabolism, and excretion in the human body based on protein profile information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Common fragile sites (CFS) and extremely large CFS genes are targets for human papillomavirus integrations and chromosome rearrangements in oropharyngeal squamous cell carcinoma.

    PubMed

    Gao, Ge; Johnson, Sarah H; Vasmatzis, George; Pauley, Christina E; Tombers, Nicole M; Kasperbauer, Jan L; Smith, David I

    2017-01-01

    Common fragile sites (CFS) are chromosome regions that are prone to form gaps or breaks in response to DNA replication stress. They are often found as hotspots for sister chromatid exchanges, deletions, and amplifications in different cancers. Many of the CFS regions are found to span genes whose genomic sequence is greater than 1 Mb, some of which have been demonstrated to function as important tumor suppressors. CFS regions are also hotspots for human papillomavirus (HPV) integrations in cervical cancer. We used mate-pair sequencing to examine HPV integration events and chromosomal structural variations in 34 oropharyngeal squamous cell carcinoma (OPSCC). We used endpoint PCR and Sanger sequencing to validate each HPV integration event and found HPV integrations preferentially occurred within CFS regions, similar to what is observed in cervical cancer. We also found that many of the chromosomal alterations detected occurred at or near the cytogenetic location of CFSs. Several large genes were also found to be recurrent targets of rearrangements, independent of HPV integrations, including CSMD1 (2.1 Mb), LRP1B (1.9 Mb), and LARGE1 (0.7 Mb). Sanger sequencing revealed that the nucleotide sequences near the identified junction sites contained repetitive and AT-rich sequences that were shown to have the potential to form stem-loop DNA secondary structures that might stall DNA replication fork progression during replication stress. This could then cause increased instability in these regions, which could lead to cancer development in human cells. Our findings suggest that CFSs and some specific large genes appear to play important roles in OPSCC. © 2016 Wiley Periodicals, Inc.

  14. Large Scale Underground Detectors in Europe

    NASA Astrophysics Data System (ADS)

    Katsanevas, S. K.

    2006-07-01

    The physics potential and the complementarity of the large scale underground European detectors: Water Cherenkov (MEMPHYS), Liquid Argon TPC (GLACIER) and Liquid Scintillator (LENA) is presented with emphasis on the major physics opportunities, namely proton decay, supernova detection and neutrino parameter determination using accelerator beams.

  15. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
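
    The general pipeline the review describes (feature representation, then indexing, then searching) can be miniaturized as below; random descriptors and a brute-force index stand in for learned features and the approximate nearest-neighbor structures used at real scale.

      import numpy as np

      rng = np.random.default_rng(1)

      # 1) Feature representation: each image reduced to a d-dim descriptor.
      n_images, d = 100_000, 128
      db = rng.standard_normal((n_images, d)).astype(np.float32)
      db /= np.linalg.norm(db, axis=1, keepdims=True)  # L2-normalize

      # 2) Indexing: a flat (brute-force) index for brevity; at scale one
      #    would build an ANN index (hashing, inverted files, graphs).
      # 3) Searching: cosine similarity is a dot product after normalizing.
      query = db[42] + 0.1 * rng.standard_normal(d).astype(np.float32)
      query /= np.linalg.norm(query)

      scores = db @ query
      top_k = np.argsort(scores)[::-1][:5]
      print("top-5 matches:", top_k, "scores:", np.round(scores[top_k], 3))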

  16. Large-scale magnetic fields at high Reynolds numbers in magnetohydrodynamic simulations.

    PubMed

    Hotta, H; Rempel, M; Yokoyama, T

    2016-03-25

    The 11-year solar magnetic cycle shows a high degree of coherence in spite of the turbulent nature of the solar convection zone. It has been found in recent high-resolution magnetohydrodynamics simulations that the maintenance of a large-scale coherent magnetic field is difficult with small viscosity and magnetic diffusivity (≲10¹² square centimeters per second). We reproduced previous findings that indicate a reduction of the energy in the large-scale magnetic field for lower diffusivities, and demonstrate the recovery of the global-scale magnetic field using unprecedentedly high resolution. We found an efficient small-scale dynamo that suppresses small-scale flows, which mimics the properties of large diffusivity. As a result, the global-scale magnetic field is maintained even in the regime of small diffusivities, that is, large Reynolds numbers. Copyright © 2016, American Association for the Advancement of Science.

  17. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
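
    The low-rank role of the prototypes can be illustrated with a Nyström-style approximation of the kernel matrix, a standard construction consistent with (though not necessarily identical to) the PVM's; the sizes and the RBF kernel below are assumptions.

      import numpy as np

      def rbf(A, B, gamma=0.5):
          """RBF kernel matrix between the rows of A and the rows of B."""
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      rng = np.random.default_rng(0)
      n, m, d = 2000, 50, 5                 # n points, m << n prototypes
      X = rng.standard_normal((n, d))
      prototypes = X[rng.choice(n, m, replace=False)]  # or k-means centers

      K_nm = rbf(X, prototypes)             # n x m, cheap
      K_mm = rbf(prototypes, prototypes)    # m x m, tiny
      K_mm_pinv = np.linalg.pinv(K_mm)

      # Nystrom: K ~= K_nm K_mm^+ K_nm^T, so applying K to a vector costs
      # O(n*m) and the full n x n kernel/graph is never materialized.
      v = rng.standard_normal(n)
      Kv_approx = K_nm @ (K_mm_pinv @ (K_nm.T @ v))
      print(Kv_approx[:5])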

  18. Large- to small-scale dynamo in domains of large aspect ratio: kinematic regime

    NASA Astrophysics Data System (ADS)

    Shumaylova, Valeria; Teed, Robert J.; Proctor, Michael R. E.

    2017-04-01

    The Sun's magnetic field exhibits coherence in space and time on much larger scales than the turbulent convection that ultimately powers the dynamo. In this work, we look for numerical evidence of a large-scale magnetic field as the magnetic Reynolds number, Rm, is increased. The investigation is based on the simulations of the induction equation in elongated periodic boxes. The imposed flows considered are the standard ABC flow (named after Arnold, Beltrami & Childress) with wavenumber ku = 1 (small-scale) and a modulated ABC flow with wavenumbers ku = m, 1, 1 ± m, where m is the wavenumber corresponding to the long-wavelength perturbation on the scale of the box. The critical magnetic Reynolds number R_m^{crit} decreases as the permitted scale separation in the system increases, such that R_m^{crit} ∝ [L_x/L_z]^{-1/2}. The results show that the α-effect derived from the mean-field theory ansatz is valid for a small range of Rm after which small scale dynamo instability occurs and the mean-field approximation is no longer valid. The transition from large- to small-scale dynamo is smooth and takes place in two stages: a fast transition into a predominantly small-scale magnetic energy state and a slower transition into even smaller scales. In the range of Rm considered, the most energetic Fourier component corresponding to the structure in the long x-direction has twice the length-scale of the forcing scale. The long-wavelength perturbation imposed on the ABC flow in the modulated case is not preserved in the eigenmodes of the magnetic field.
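
    As a quick worked example of the reported scaling R_m^crit ∝ (L_x/L_z)^(-1/2) (the prefactor C below is hypothetical):

      # Doubling the elongation lowers the critical magnetic Reynolds
      # number by a factor of sqrt(2).
      C = 10.0
      for aspect in (1, 2, 4, 8, 16):
          print(f"Lx/Lz = {aspect:2d}  ->  Rm_crit ~ {C * aspect ** -0.5:.2f}")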

  19. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Tanushree

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air quality on such small scales can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.

  20. Measuring large-scale vertical motion in the atmosphere with dropsondes

    NASA Astrophysics Data System (ADS)

    Bony, Sandrine; Stevens, Bjorn

    2017-04-01

    Large-scale vertical velocity modulates important processes in the atmosphere, including the formation of clouds, and constitutes a key component of the large-scale forcing of Single-Column Model simulations and Large-Eddy Simulations. Its measurement has also been a long-standing challenge for observationalists. We will show that it is possible to measure the vertical profile of large-scale wind divergence and vertical velocity from aircraft by using dropsondes. This methodology was tested in August 2016 during the NARVAL2 campaign in the lower Atlantic trades. Results will be shown for several research flights, the robustness and the uncertainty of the measurements will be assessed, and observational estimates will be compared with data from high-resolution numerical forecasts.
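
    The principle behind such estimates is mass continuity: a horizontal divergence profile D(z), obtained for example by applying Gauss's theorem to the sonde winds around a closed flight circle, integrates vertically to the large-scale vertical velocity. A sketch with a hypothetical divergence profile:

      import numpy as np

      # Hypothetical profile: low-level divergence turning into convergence
      # aloft, roughly as in the subsiding trades (units: 1/s).
      z = np.linspace(0.0, 4000.0, 81)           # height (m)
      D = 5e-6 * np.cos(np.pi * z / 4000.0)      # horizontal divergence

      # Incompressible mass continuity: dw/dz = -D, with w(0) = 0,
      # integrated upward with the trapezoidal rule.
      w = -np.concatenate(([0.0],
                           np.cumsum(0.5 * (D[1:] + D[:-1]) * np.diff(z))))

      i = np.searchsorted(z, 2000.0)
      print(f"large-scale w at 2 km: {w[i] * 1000:.1f} mm/s")  # subsidence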

  1. Molecular basis for the action of a dietary flavonoid revealed by the comprehensive identification of apigenin human targets

    PubMed Central

    Arango, Daniel; Morohashi, Kengo; Yilmaz, Alper; Kuramochi, Kouji; Parihar, Arti; Brahimaj, Bledi; Grotewold, Erich; Doseff, Andrea I.

    2013-01-01

    Flavonoids constitute the largest class of dietary phytochemicals, adding essential health value to our diet, and are emerging as key nutraceuticals. Cellular targets for dietary phytochemicals remain largely unknown, posing significant challenges for the regulation of dietary supplements and the understanding of how nutraceuticals provide health value. Here, we describe the identification of human cellular targets of apigenin, a flavonoid abundantly present in fruits and vegetables, using an innovative high-throughput approach that combines phage display with second generation sequencing. The 160 identified high-confidence candidate apigenin targets are significantly enriched in three main functional categories: GTPase activation, membrane transport, and mRNA metabolism/alternative splicing. This last category includes the heterogeneous nuclear ribonucleoprotein A2 (hnRNPA2), a factor involved in splicing regulation, mRNA stability, and mRNA transport. Apigenin binds to the C-terminal glycine-rich domain of hnRNPA2, preventing hnRNPA2 from forming homodimers, and therefore, it perturbs the alternative splicing of several human hnRNPA2 targets. Our results provide a framework to understand how dietary phytochemicals exert their actions by binding to many functionally diverse cellular targets. In turn, some of them may modulate the activity of a large number of downstream genes, which is exemplified here by the effects of apigenin on the alternative splicing activity of hnRNPA2. Hence, in contrast to small-molecule pharmaceuticals designed for defined target specificity, dietary phytochemicals affect a large number of cellular targets with varied affinities that, combined, result in their recognized health benefits. PMID:23697369

  2. Targeting xenobiotic receptors PXR and CAR in human diseases

    PubMed Central

    Banerjee, Monimoy; Robbins, Delira; Chen, Taosheng

    2014-01-01

    Nuclear receptors such as the pregnane X receptor (PXR) and constitutive androstane receptor (CAR) are xenobiotic receptors regulating not only drug metabolism and disposition but also various human diseases such as cancer, diabetes, inflammatory disease, metabolic disease and liver diseases, suggesting that PXR and CAR are promising targets for drug discovery. Consequently, there is an urgent need to discover and develop small molecules that target these PXR- and/or CAR-mediated human-disease-related pathways for relevant therapeutic applications. This review proposes approaches to target PXR and CAR, either individually or simultaneously, in the context of various human diseases, taking into consideration the structural differences between PXR and CAR. PMID:25463033

  3. Bayesian hierarchical model for large-scale covariance matrix estimation.

    PubMed

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
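
    A related, though different, regularized estimator, Ledoit-Wolf shrinkage, shows in a few lines why pulling the sample covariance toward a structured target reduces overfitting when variables outnumber samples; this is an analogy, not the authors' Bayesian hierarchical model.

      import numpy as np
      from sklearn.covariance import LedoitWolf, empirical_covariance

      rng = np.random.default_rng(0)

      # p variables, few samples: the regime where the sample covariance
      # overfits badly.
      p, n = 200, 50
      true_cov = np.diag(np.linspace(1.0, 3.0, p))
      X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

      emp = empirical_covariance(X)
      lw = LedoitWolf().fit(X)

      err_emp = np.linalg.norm(emp - true_cov)
      err_lw = np.linalg.norm(lw.covariance_ - true_cov)
      print(f"Frobenius error -- sample: {err_emp:.1f}, shrinkage: {err_lw:.1f}")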

  4. Hum-mPLoc: an ensemble classifier for large-scale human protein subcellular location prediction by incorporating samples with multiple sites.

    PubMed

    Shen, Hong-Bin; Chou, Kuo-Chen

    2007-04-20

    Proteins may simultaneously exist at, or move between, two or more different subcellular locations. Proteins with multiple locations or dynamic features of this kind are particularly interesting because they may have some very special biological functions intriguing to investigators in both basic research and drug discovery. For instance, among the 6408 human protein entries that have experimentally observed subcellular location annotations in the Swiss-Prot database (version 50.7, released 19-Sept-2006), 973 (approximately 15%) have multiple location sites. The number of total human protein entries (except those annotated with "fragment" or those with less than 50 amino acids) in the same database is 14,370, meaning a gap of (14,370-6408)=7962 entries for which no knowledge is available about their subcellular locations. Although one can use the computational approach to predict the desired information for the gap, so far all the existing methods for predicting human protein subcellular localization are limited to the case of a single location site only. To overcome such a barrier, a new ensemble classifier, named Hum-mPLoc, was developed that can be used to deal with the case of multiple location sites as well. Hum-mPLoc is freely accessible to the public as a web server at http://202.120.37.186/bioinf/hum-multi. Meanwhile, for the convenience of people working in the relevant areas, Hum-mPLoc has been used to identify all human protein entries in the Swiss-Prot database that do not have subcellular location annotations or are annotated as being uncertain. The large-scale results thus obtained have been deposited in a downloadable file prepared with Microsoft Excel and named "Tab_Hum-mPLoc.xls". This file is available at the same website and will be updated twice a year to include new entries of human proteins and reflect the continuous development of Hum-mPLoc.

  5. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  6. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  7. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, the middle and high latitudes of Mars’ atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and of generalized scalar/tracer quantities (e.g., atmospheric dust, water vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere occurs during southern spring and summer). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adopt Mars’ full topography, compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes, indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands and the Argyre impact basin.

  8. Systematic Phenotyping of a Large-Scale Candida glabrata Deletion Collection Reveals Novel Antifungal Tolerance Genes

    PubMed Central

    Hiller, Ekkehard; Istel, Fabian; Tscherner, Michael; Brunke, Sascha; Ames, Lauren; Firon, Arnaud; Green, Brian; Cabral, Vitor; Marcet-Houben, Marina; Jacobsen, Ilse D.; Quintin, Jessica; Seider, Katja; Frohner, Ingrid; Glaser, Walter; Jungwirth, Helmut; Bachellier-Bassi, Sophie; Chauvel, Murielle; Zeidler, Ute; Ferrandon, Dominique; Gabaldón, Toni; Hube, Bernhard; d'Enfert, Christophe; Rupp, Steffen; Cormack, Brendan; Haynes, Ken; Kuchler, Karl

    2014-01-01

    The opportunistic fungal pathogen Candida glabrata is a frequent cause of candidiasis, causing infections ranging from superficial to life-threatening disseminated disease. The inherent tolerance of C. glabrata to azole drugs makes this pathogen a serious clinical threat. To identify novel genes implicated in antifungal drug tolerance, we have constructed a large-scale C. glabrata deletion library consisting of 619 unique, individually bar-coded mutant strains, each lacking one specific gene, altogether representing almost 12% of the genome. Functional analysis of this library in a series of phenotypic and fitness assays identified numerous genes required for growth of C. glabrata under normal or specific stress conditions, as well as a number of novel genes involved in tolerance to clinically important antifungal drugs such as azoles and echinocandins. We identified 38 deletion strains displaying strongly increased susceptibility to caspofungin, 28 of which encode proteins that have not previously been linked to echinocandin tolerance. Our results demonstrate the potential of the C. glabrata mutant collection as a valuable resource in functional genomics studies of this important fungal pathogen of humans, and to facilitate the identification of putative novel antifungal drug target and virulence genes. PMID:24945925

  9. How Large-Scale Flows May Influence Solar Activity

    NASA Technical Reports Server (NTRS)

    Hathaway, D. H.

    2004-01-01

    Large scale flows within the solar convection zone are the primary drivers of the Sun's magnetic activity cycle and play important roles in shaping the Sun's magnetic field. Differential rotation amplifies the magnetic field through its shearing action and converts poloidal field into toroidal field. Poleward meridional flow near the surface carries magnetic flux that reverses the magnetic poles at about the time of solar maximum. The deeper, equatorward meridional flow can carry magnetic flux back toward the lower latitudes where it erupts through the surface to form tilted active regions that convert toroidal fields into oppositely directed poloidal fields. These axisymmetric flows are themselves driven by large scale convective motions. The effects of the Sun's rotation on convection produce velocity correlations that can maintain both the differential rotation and the meridional circulation. These convective motions can also influence solar activity directly by shaping the magnetic field pattern. While considerable theoretical advances have been made toward understanding these large scale flows, outstanding problems in matching theory to observations still remain.

  10. Genome-scale approaches to the epigenetics of common human disease

    PubMed Central

    2011-01-01

    Traditionally, the pathology of human disease has been focused on microscopic examination of affected tissues, chemical and biochemical analysis of biopsy samples, other available samples of convenience, such as blood, and noninvasive or invasive imaging of varying complexity, in order to classify disease and illuminate its mechanistic basis. The molecular age has complemented this armamentarium with gene expression arrays and selective analysis of individual genes. However, we are entering a new era of epigenomic profiling, i.e., genome-scale analysis of cell-heritable nonsequence genetic change, such as DNA methylation. The epigenome offers access to stable measurements of cellular state and to biobanked material for large-scale epidemiological studies. Some of these genome-scale technologies are beginning to be applied to create the new field of epigenetic epidemiology. PMID:19844740

  11. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and having less intensely farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does flow propagate to any point downstream? More generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins in the future? In this paper we show examples of NFM interventions in the UK that have had an impact at local-scale sites. We demonstrate the impact of interventions at the local, sub-catchment (meso-scale) and finally the large scale. The tools used include observations, process-based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we show how much flood protection is needed in large-scale basins. The research thus poses a number of key questions as to how floods may have to be managed in large-scale basins in the future. We seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water.

  12. Optical interconnect for large-scale systems

    NASA Astrophysics Data System (ADS)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.
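
    As a rough illustration of the scaling comparison mentioned above, the snippet below tabulates worst-case hop counts for a few common topologies as the node count grows. The formulas are textbook values (the torus assumes a perfect-square node count) and are included only to show why conventional fabrics become unattractive at very large scales; they are not taken from the paper.

    ```python
    import math

    def diameters(n):
        """Worst-case hop counts for common interconnect topologies
        with n nodes (the torus assumes n is a perfect square)."""
        side = math.isqrt(n)
        return {
            "ring": n // 2,                        # grows linearly with n
            "2d_torus": 2 * (side // 2),           # grows as sqrt(n)
            "hypercube": math.ceil(math.log2(n)),  # grows as log2(n)
        }

    for n in (64, 4096, 1_000_000):
        print(n, diameters(n))
    ```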

  13. Contribution of peculiar shear motions to large-scale structure

    NASA Technical Reports Server (NTRS)

    Mueler, Hans-Reinhard; Treumann, Rudolf A.

    1994-01-01

    Self-gravitating shear flow instability simulations in a cold dark matter-dominated expanding Einstein-de Sitter universe have been performed. When the shear flow speed exceeds a certain threshold, self-gravitating Kelvin-Helmholtz instability occurs, forming density voids and excesses along the shear flow layer which serve as seeds for large-scale structure formation. A possible mechanism for generating shear peculiar motions is velocity fluctuations induced by the density perturbations of the post-inflation era. In this scenario, short scales grow earlier than large scales. A model of this kind may contribute to the cellular structure of the luminous mass distribution in the universe.

  14. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359

  15. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale WGS data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service, including the import and export of the data using the Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log running metrics during data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
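
    Improvement (2) above, splitting large sequence files for downstream load balance, can be illustrated in a few lines of Python. This is a generic sketch, not Rainbow's actual splitter; the chunk size and output naming are assumptions.

    ```python
    import gzip
    from itertools import islice

    def split_fastq(path, reads_per_chunk=1_000_000):
        """Split a gzipped FASTQ file into fixed-size chunks so that
        downstream alignment jobs are balanced across workers."""
        with gzip.open(path, "rt") as fh:
            chunk = 0
            while True:
                # One FASTQ record is exactly 4 lines; take whole records.
                lines = list(islice(fh, 4 * reads_per_chunk))
                if not lines:
                    break
                with open(f"{path}.part{chunk:04d}.fastq", "w") as out:
                    out.writelines(lines)
                chunk += 1

    # split_fastq("sample_R1.fastq.gz")  # writes ...part0000.fastq, ...part0001.fastq, ...
    ```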

  16. A first large-scale flood inundation forecasting model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schumann, Guy J-P; Neal, Jeffrey C.; Voisin, Nathalie

    2013-11-04

    At present, continental- to global-scale flood forecasting focuses on predicting discharge at a point, with little attention to the detail and accuracy of local-scale inundation predictions. Yet inundation is actually the variable of interest, and all flood impacts are inherently local in nature. This paper proposes a first large-scale flood inundation ensemble forecasting model that uses the best available data and modeling approaches in data-scarce areas and at continental scales. The model was built for the Lower Zambezi River in southeast Africa to demonstrate current flood inundation forecasting capabilities in large data-scarce regions. The inundation model domain has a surface area of approximately 170,000 km2. ECMWF meteorological data were used to force the VIC (Variable Infiltration Capacity) macro-scale hydrological model, which simulated and routed daily flows to the input boundary locations of the 2-D hydrodynamic model. Efficient hydrodynamic modeling over large areas still requires model grid resolutions that are typically larger than the width of many river channels that play a key role in flood wave propagation. We therefore employed a novel sub-grid channel scheme to describe the river network in detail whilst at the same time representing the floodplain at an appropriate and efficient scale. The modeling system was first calibrated using water levels on the main channel from the ICESat (Ice, Cloud, and land Elevation Satellite) laser altimeter and then applied to predict the February 2007 Mozambique floods. Model evaluation showed that simulated flood edge cells were within a distance of about 1 km (one model resolution) of the observed flood edge of the event. Our study highlights that physically plausible parameter values and satisfactory performance can be achieved at spatial scales ranging from tens to several hundreds of thousands of km2 and at model grid resolutions up to several km2. However, initial model test runs in forecast

  17. Reconstructing Information in Large-Scale Structure via Logarithmic Mapping

    NASA Astrophysics Data System (ADS)

    Szapudi, Istvan

    We propose to develop a new method to extract information from large-scale structure data combining two-point statistics and non-linear transformations; previously, this information was available only with substantially more complex higher-order statistical methods. Initially, most of the cosmological information in large-scale structure lies in two-point statistics. With non-linear evolution, some of that useful information leaks into higher-order statistics. The PI and group have shown in a series of theoretical investigations how that leakage occurs, and explained the Fisher information plateau at smaller scales. This plateau means that even as more modes are added to the measurement of the power spectrum, the total cumulative information (loosely speaking, the inverse error bar) does not increase. Recently we have shown in Neyrinck et al. (2009, 2010) that a logarithmic (and a related Gaussianization or Box-Cox) transformation on the non-linear dark matter or galaxy field reconstructs a surprisingly large fraction of this missing Fisher information of the initial conditions. This was predicted by the earlier wave-mechanical formulation of gravitational dynamics by Szapudi & Kaiser (2003). The present proposal is focused on working out the theoretical underpinning of the method to a point where it can be used in practice to analyze data. In particular, one needs to deal with the usual real-life issues of galaxy surveys, such as complex geometry, discrete sampling (Poisson or sub-Poisson noise), bias (linear or non-linear, deterministic or stochastic), redshift distortions, projection effects for 2D samples, and the effects of photometric redshift errors. We will develop methods for weak lensing and Sunyaev-Zeldovich power spectra as well, the latter specifically targeting Planck. In addition, we plan to investigate the question of residual higher-order information after the non-linear mapping, and possible applications for cosmology. Our aim will be to work out
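
    The core of the proposal, the logarithmic mapping itself, is a one-line transformation. The toy example below, using a lognormal stand-in for the non-linear density field, shows how a strongly skewed overdensity field becomes nearly Gaussian after the mapping; it is illustrative only and carries none of the survey systematics the proposal aims to handle.

    ```python
    import numpy as np

    def log_density(delta):
        """Logarithmic mapping A = ln(1 + delta) of an overdensity field
        (delta > -1), with the mean removed, as in Neyrinck et al. (2009)."""
        A = np.log1p(delta)
        return A - A.mean()

    # Toy lognormal field: strongly non-Gaussian delta, nearly Gaussian A.
    rng = np.random.default_rng(1)
    g = rng.normal(0.0, 1.0, size=64**3)   # stand-in for the linear field
    delta = np.expm1(g - g.var() / 2)      # lognormal overdensity, mean ~ 0
    A = log_density(delta)
    print(f"skewness before: {((delta - delta.mean())**3).mean() / delta.std()**3:.2f}")
    print(f"skewness after:  {(A**3).mean() / A.std()**3:.2f}")
    ```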

  18. [A large-scale accident in Alpine terrain].

    PubMed

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For the evacuation of casualties, the use of a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  19. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    PubMed

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions by displaying the aggregated information according to users' needs, while also exploiting the associated human intelligence.

  20. Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights

    PubMed Central

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions by displaying the aggregated information according to users' needs, while also exploiting the associated human intelligence. PMID:25268270

  1. Large-scale anisotropy in stably stratified rotating flows

    DOE PAGES

    Marino, R.; Mininni, P. D.; Rosenberg, D. L.; ...

    2014-08-28

    We present results from direct numerical simulations of the Boussinesq equations in the presence of rotation and/or stratification, both in the vertical direction. The runs are forced isotropically and randomly at small scales and have spatial resolutions of up to $1024^3$ grid points and Reynolds numbers of $\approx 1000$. We first show that solutions with negative energy flux and inverse cascades develop in rotating turbulence, whether or not stratification is present. However, the purely stratified case is characterized instead by an early-time, highly anisotropic transfer to large scales with almost zero net isotropic energy flux. This is consistent with previous studies that observed the development of vertically sheared horizontal winds, although only at substantially later times. However, and unlike previous works, when sufficient scale separation is allowed between the forcing scale and the domain size, the total energy displays a perpendicular (horizontal) spectrum with power law behavior compatible with $\sim k_\perp^{-5/3}$, including in the absence of rotation. In this latter purely stratified case, such a spectrum is the result of a direct cascade of the energy contained in the large-scale horizontal wind, as is evidenced by a strong positive flux of energy in the parallel direction at all scales including the largest resolved scales.

  2. Neutrino footprint in large scale structure

    NASA Astrophysics Data System (ADS)

    Garay, Carlos Peña; Verde, Licia; Jimenez, Raul

    2017-03-01

    Recent constraints on the sum of neutrino masses inferred by analyzing cosmological data show that detecting a non-zero neutrino mass is within reach of forthcoming cosmological surveys. Such a measurement will imply a direct determination of the absolute neutrino mass scale. Physically, the measurement relies on constraining the shape of the matter power spectrum below the neutrino free-streaming scale: massive neutrinos erase power at these scales. However, a detected lack of small-scale power in cosmological data could also be due to a host of other effects. It is therefore of paramount importance to validate neutrinos as the source of power suppression at small scales. We show that, independently of the hierarchy, neutrinos always leave a footprint on large, linear scales; the exact location and properties are fully specified by the measured power suppression (an astrophysical measurement) and the atmospheric neutrino mass splitting (a neutrino oscillation experiment measurement). This feature cannot be easily mimicked by systematic uncertainties in the cosmological data analysis or by modifications of the cosmological model. Therefore the measurement of such a feature, up to a 1% relative change in the power spectrum for extreme differences in the mass ratios of the eigenstates, is a smoking gun for confirming the determination of the absolute neutrino mass scale from cosmological observations. It also demonstrates the synergy between astrophysics and particle physics experiments.

  3. Sensitivity analysis for large-scale problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  4. Corridors Increase Plant Species Richness at Large Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damschen, Ellen I.; Haddad, Nick M.; Orrock, John L.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  5. Corridors increase plant species richness at large scales.

    PubMed

    Damschen, Ellen I; Haddad, Nick M; Orrock, John L; Tewksbury, Joshua J; Levey, Douglas J

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  6. The Cosmology Large Angular Scale Surveyor

    NASA Technical Reports Server (NTRS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  7. The Cosmology Large Angular Scale Surveyor

    NASA Astrophysics Data System (ADS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; Dahal, Sumit; Denis, Kevin; Dünner, Rolando; Eimer, Joseph; Essinger-Hileman, Thomas; Fluxa, Pedro; Halpern, Mark; Hilton, Gene; Hinshaw, Gary F.; Hubmayr, Johannes; Iuliano, Jeffrey; Karakla, John; McMahon, Jeff; Miller, Nathan T.; Moseley, Samuel H.; Palma, Gonzalo; Parker, Lucas; Petroff, Matthew; Pradenas, Bastián.; Rostem, Karwan; Sagliocca, Marco; Valle, Deniz; Watts, Duncan; Wollack, Edward; Xu, Zhilei; Zeng, Lingzhen

    2016-07-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  8. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    PubMed

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale is composed of 3 NIHSS items: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to a median NIHSS score of 6 (interquartile range=5) for PASS <2. The PASS scale showed equal performance, although simpler, when compared with other scales predicting ELVO. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
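
    Because the abstract fully specifies the scale, the scoring rule can be transcribed directly: one point per abnormal item, with a total of ≥2 suggesting ELVO. The function below is a literal sketch of that rule; the boolean inputs stand for "item scored abnormal on the NIHSS".

    ```python
    def pass_score(loc_month_age, gaze_palsy, arm_weakness):
        """Prehospital Acute Stroke Severity (PASS) scale: one point for
        each abnormal NIHSS-derived item -- level of consciousness
        (month/age), gaze palsy/deviation, and arm weakness. A total
        of >= 2 is the cut point for suspected ELVO reported above."""
        score = sum(map(bool, (loc_month_age, gaze_palsy, arm_weakness)))
        return score, score >= 2

    print(pass_score(loc_month_age=True, gaze_palsy=True, arm_weakness=False))
    # (2, True) -> meets the >= 2 cut point for suspected ELVO
    ```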

  9. Facilitating large-scale clinical trials: in Asia.

    PubMed

    Choi, Han Yong; Ko, Jae-Wook

    2010-01-01

    The number of clinical trials conducted in Asian countries has started to increase as a result of the expansion of the pharmaceutical market in this area. There is a growing opportunity for large-scale clinical trials because of the large number of patients, significant market potential, good quality of data, and cost-effective, qualified medical infrastructure. However, for carrying out large-scale clinical trials in Asia, several major challenges need to be considered, including quality control of data, budget control, laboratory validation, monitoring capacity, authorship, staff training, and nonstandard treatment. There are also several difficulties in collaborating on international trials in Asia because Asia is an extremely diverse continent. The major challenges are language differences, diversity in patterns of disease and current treatments, a large gap in experience with performing multinational trials, and regulatory differences among the Asian countries. In addition, there are also differences in the understanding of global clinical trials, medical facilities, indemnity assurance, and culture, including food and religion. Standardizing these differences so that regional and local data can provide evidence of efficacy requires sustained effort. At this time, there are no large clinical trials led by urologists in Asia, but it is anticipated that the role of urologists in clinical trials will continue to increase. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  11. Multi-dimension and Comprehensive Assessment on the Utilizing and Sharing of Regional Large-Scale Scientific Equipment

    PubMed Central

    Li, Chen; Yongbo, Lv; Chi, Chen

    2015-01-01

    Based on data from 30 provincial regions in China, an assessment and empirical analysis were carried out on the utilization and sharing of large-scale scientific equipment, with a comprehensive assessment model established on three dimensions, namely, equipment, utilization and sharing. The assessment results were interpreted in light of relevant policies. The results showed that, on the whole, the overall development level in the provincial regions of eastern and central China is higher than that in western China. This is mostly because of the large gap among the different provincial regions with respect to equipment levels. In terms of utilization and sharing, however, some of the western provincial regions, such as Ningxia, perform well, which is worthy of attention. Policy adjustments targeting this differentiation, elevation of the capacity of equipment management personnel, perfection of the sharing and cooperation platform, and promotion of the establishment of open sharing funds are all important measures to promote the utilization and sharing of large-scale scientific equipment and to narrow the gap among regions. PMID:25937850

  12. Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits

    DTIC Science & Technology

    2017-03-20

    Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits (LIVEC), Contract No. W911NF-14-C-0093, reporting period 1-Jul-2014 to 30-Jun-2015 (report dated 20-03-2017; distribution unlimited). The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits. COR: Dr. James Harvey, U.S. ARO, RTP, NC 27709-2211.

  13. Latest COBE results, large-scale data, and predictions of inflation

    NASA Technical Reports Server (NTRS)

    Kashlinsky, A.

    1992-01-01

    One of the predictions of the inflationary scenario of cosmology is that the initial spectrum of primordial density fluctuations (PDFs) must have the Harrison-Zeldovich (HZ) form. Here, in order to test the inflationary scenario, predictions of the microwave background radiation (MBR) anisotropies measured by COBE are computed based on large-scale data for the universe, assuming Omega = 1 and the HZ spectrum on large scales. The minimal scale at which the spectrum can first enter the HZ regime is found, constraining the power spectrum of the mass distribution to within the bias factor b. This factor is determined and used to predict parameters of the MBR anisotropy field. For the spectrum of PDFs that reaches the HZ regime immediately after the scale accessible to the APM catalog, the numbers on MBR anisotropies are consistent with the COBE detections, and thus standard inflation can indeed be considered a viable theory for the origin of the large-scale structure in the universe.

  14. Managing Risk and Uncertainty in Large-Scale University Research Projects

    ERIC Educational Resources Information Center

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  15. Parallel Clustering Algorithm for Large-Scale Biological Data Sets

    PubMed Central

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Background: The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Methods: Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction procedure and the affinity propagation algorithm. The shared-memory architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. Results: A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies. PMID:24705246
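
    The shared-memory stage, building the similarity matrix in parallel, can be sketched as follows. Affinity propagation conventionally uses negative squared Euclidean distance as the similarity; the block size, worker count, and partitioning here are illustrative and differ from the paper's actual implementation.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def _row_block(args):
        """Negative squared Euclidean similarities for one block of rows."""
        X, start, stop = args
        diff = X[start:stop, None, :] - X[None, :, :]
        return start, -np.einsum("ijk,ijk->ij", diff, diff)

    def similarity_matrix(X, n_workers=4, block=256):
        """Fill the n x n similarity matrix from independent row blocks,
        the embarrassingly parallel step that dominates the runtime."""
        n = X.shape[0]
        S = np.empty((n, n))
        tasks = [(X, i, min(i + block, n)) for i in range(0, n, block)]
        with Pool(n_workers) as pool:
            for start, rows in pool.imap_unordered(_row_block, tasks):
                S[start:start + rows.shape[0]] = rows
        return S
    ```

    On platforms that spawn rather than fork worker processes, the call to similarity_matrix should sit under an `if __name__ == "__main__":` guard.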

  16. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  17. Above scaling short-pulse ion acceleration from flat foil and ``Pizza-top Cone'' targets at the Trident laser facility

    NASA Astrophysics Data System (ADS)

    Flippo, Kirk; Hegelich, B. Manuel; Cort Gautier, D.; Johnson, J. Randy; Kline, John L.; Shimada, Tsutomu; Fernández, Juan C.; Gaillard, Sandrine; Rassuchine, Jennifer; Le Galloudec, Nathalie; Cowan, Thomas E.; Malekos, Steve; Korgan, Grant

    2006-10-01

    Ion-driven Fast Ignition (IFI) has certain advantages over electron-driven FI due to a possible large reduction in the amount of energy required. Recent experiments at the Los Alamos National Laboratory's Trident facility have yielded ion energies and efficiencies many times in excess of recently published scaling laws, leading to even more potential advantages for IFI. Proton energies in excess of 35 MeV have been observed from targets produced by the University of Nevada, Reno - dubbed "Pizza-top Cone" targets - at intensities of only 1x10^19 W/cm^2 with 20 joules in 600 fs. Energies in excess of 24 MeV were observed from simple flat foil targets as well. The observed energies, above any published scaling laws, are attributed to target production, preparation, and shot-to-shot monitoring of many laser parameters, especially the laser ASE prepulse level and laser pulse duration. The laser parameters are monitored in real time to keep the laser in optimal condition throughout the run, providing high-quality, reproducible shots.

  18. Large-scale protein-protein interactions detection by integrating big biosensing data with computational model.

    PubMed

    You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen

    2014-01-01

    Protein-protein interactions (PPIs) are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, the high-throughput experimental methods for identifying PPIs are both time-consuming and expensive, and high-throughput PPI data are often associated with high false-positive and high false-negative rates. To address these problems, we propose a method for PPI detection that integrates biosensor-based PPI data with a novel computational model. The method was developed based on the extreme learning machine algorithm combined with a novel protein sequence descriptor. When applied to the large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at a specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine. The results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.
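
    The extreme learning machine at the core of the method admits a compact sketch: random, untrained hidden-layer weights, with only the output weights solved in closed form. The protein-sequence descriptor construction is not reproduced here, and the toy data below are purely illustrative.

    ```python
    import numpy as np

    def elm_train(X, y, n_hidden=500, seed=0):
        """Extreme learning machine: random input weights and biases,
        least-squares output weights (no iterative training)."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)        # random non-linear hidden layer
        beta = np.linalg.pinv(H) @ y  # solve H @ beta ~= y in closed form
        return W, b, beta

    def elm_predict(X, model):
        W, b, beta = model
        return np.tanh(X @ W + b) @ beta  # scores; threshold at 0.5

    # Toy usage: rows of X are pair descriptors, y = 1 for interacting pairs.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 40))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)
    model = elm_train(X, y, n_hidden=100)
    print(f"training accuracy: {((elm_predict(X, model) > 0.5) == y).mean():.2f}")
    ```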

  19. Scaling relations for large Martian valleys

    NASA Astrophysics Data System (ADS)

    Som, Sanjoy M.; Montgomery, David R.; Greenberg, Harvey M.

    2009-02-01

    The dendritic morphology of Martian valley networks, particularly in the Noachian highlands, has long been argued to imply a warmer, wetter early Martian climate, but the character and extent of this period remain controversial. We analyzed scaling relations for 10 large valley systems incised in terrain of various ages, resolvable using the Mars Orbiter Laser Altimeter (MOLA) and the Thermal Emission Imaging System (THEMIS). Four of the valleys originate in point sources with negligible contributions from tributaries, three are very poorly dissected with a few large tributaries separated by long uninterrupted trunks, and three exhibit the dendritic, branching morphology typical of terrestrial channel networks. We generated width-area and slope-area relationships for each because these relations are identified as either theoretically predicted or robust terrestrial empiricisms for graded, precipitation-fed, perennial channels. We also generated distance-area relationships (Hack's law) because they similarly represent robust characteristics of terrestrial channels (whether perennial or ephemeral). We find that the studied Martian valleys, even the dendritic ones, do not satisfy those empiricisms. On Mars, the width-area scaling exponent b ranges from -0.7 to 4.7, in contrast with values of 0.3-0.6 typical of terrestrial channels; the slope-area scaling exponent $\theta$ ranges from -25.6 to 5.5, whereas values of 0.3-0.5 are typical on Earth; and the length-area (Hack's) exponent n ranges from 0.47 to 19.2, while values of 0.5-0.6 are found on Earth. None of the valleys analyzed satisfy all three relations typical of terrestrial perennial channels. As such, our analysis supports the hypotheses that ephemeral and/or immature channel morphologies provide the closest terrestrial analogs to the dendritic networks on Mars, and point source discharges provide terrestrial analogs best suited to describe the other large Martian valleys.
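
    Each of the three empiricisms tested above is a power law, so the exponents come from straight-line fits in log-log space. A minimal version of that fit, shown for Hack's law L = c * A^n with synthetic terrestrial-like data, is below; the same one-liner applies to the width-area and slope-area relations.

    ```python
    import numpy as np

    def scaling_exponent(area, length):
        """Fit L = c * A**n by least squares in log-log space; returns (n, c)."""
        n, logc = np.polyfit(np.log(area), np.log(length), 1)
        return n, np.exp(logc)

    # Synthetic terrestrial-like network: exponent near the 0.5-0.6 range.
    rng = np.random.default_rng(2)
    A = np.logspace(6, 10, 25)                       # drainage areas
    L = 1.4 * A**0.55 * np.exp(rng.normal(0, 0.05, A.size))
    n, c = scaling_exponent(A, L)
    print(f"Hack exponent n = {n:.2f}")              # ~0.55 by construction
    ```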

  20. Large-scale particle acceleration by magnetic reconnection during solar flares

    NASA Astrophysics Data System (ADS)

    Li, X.; Guo, F.; Li, H.; Li, G.; Li, S.

    2017-12-01

    Magnetic reconnection that triggers explosive magnetic energy release has been widely invoked to explain large-scale particle acceleration during solar flares. While great effort has been spent studying the acceleration mechanism in small-scale kinetic simulations, few studies have made predictions for acceleration on scales comparable to the flare reconnection region. Here we present a new approach to this problem. We solve the large-scale energetic-particle transport equation in the fluid velocity and magnetic fields from high-Lundquist-number MHD simulations of reconnection layers. This approach is based on examining the dominant acceleration mechanism and pitch-angle scattering in kinetic simulations. Due to the fluid compression in reconnection outflows and merging magnetic islands, particles are accelerated to high energies and develop power-law energy distributions. We find that the acceleration efficiency and power-law index depend critically on the upstream plasma beta and the magnitude of the guide field (the magnetic field component perpendicular to the reconnecting component), as they influence the compressibility of the reconnection layer. We also find that the accelerated high-energy particles are mostly concentrated in large magnetic islands, making the islands a source of energetic particles and high-energy emissions. These findings may provide explanations for the acceleration process in large-scale magnetic reconnection during solar flares and the temporal and spatial emission properties observed in different flare events.
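
    The abstract does not write out the transport equation, but the ingredients it names (advection by the fluid velocity, pitch-angle scattering strong enough to keep the distribution nearly isotropic, and acceleration by compression) are those of the Parker transport equation; a form consistent with that reading is

    ```latex
    \frac{\partial f}{\partial t}
      + \mathbf{u}\cdot\nabla f
      - \nabla\cdot\!\left(\boldsymbol{\kappa}\,\nabla f\right)
      - \frac{\nabla\cdot\mathbf{u}}{3}\,\frac{\partial f}{\partial \ln p}
      = Q,
    ```

    where f(x, p, t) is the particle distribution function, u the MHD flow velocity, κ the spatial diffusion tensor, p the particle momentum, and Q a source term. The compression term (∇·u)/3 ∂f/∂ln p is what turns converging reconnection outflows and merging islands into accelerators; the authors' exact equation may include additional terms.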

  1. Large-Scale Advanced Prop-Fan (LAP)

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    In recent years, considerable attention has been directed toward improving aircraft fuel efficiency. Analytical studies and research with wind tunnel models have demonstrated that the high inherent efficiency of low speed turboprop propulsion systems may now be extended to the Mach .8 flight regime of today's commercial airliners. This can be accomplished with a propeller employing a large number of thin, highly swept blades. The term Prop-Fan has been coined to describe such a propulsion system. In 1983 the NASA-Lewis Research Center contracted with Hamilton Standard to design, build and test a near full scale Prop-Fan, designated the Large Scale Advanced Prop-Fan (LAP). This report provides a detailed description of the LAP program. The assumptions and analytical procedures used in the design of Prop-Fan system components are discussed in detail. The manufacturing techniques used in the fabrication of the Prop-Fan are presented. Each of the tests run during the course of the program is also discussed, and the major conclusions derived from them are stated.

  2. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires executing many runs with various patterns of scenarios or parameters. Such repeated execution introduces substantial redundancy, because the change from one scenario to the next is very minor in most cases, for example, blocking only one road or changing the speed limit of several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which makes it possible to simulate only the changed scenarios in later executions while keeping exactly the same results as a whole simulation. The paper consists of two main efforts: (i) a key idea and algorithm of the exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of the exact-differential simulation. In experiments with a Tokyo traffic simulation, the exact-differential simulation shows a 7.26-fold elapsed-time improvement on average, and a 2.26-fold improvement even in the worst case, compared with whole simulation.

  3. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.

  4. Comprehensive School Teachers' Professional Agency in Large-Scale Educational Change

    ERIC Educational Resources Information Center

    Pyhältö, Kirsi; Pietarinen, Janne; Soini, Tiina

    2014-01-01

    This article explores how comprehensive school teachers' sense of professional agency changes in the context of large-scale national educational change in Finland. We analysed the premises on which teachers (n = 100) view themselves and their work in terms of developing their own school, catalysed by the large-scale national change. The study…

  5. Intact mass detection, interpretation, and visualization to automate Top-Down proteomics on a large scale

    PubMed Central

    Durbin, Kenneth R.; Tran, John C.; Zamdborg, Leonid; Sweet, Steve M. M.; Catherman, Adam D.; Lee, Ji Eun; Li, Mingxi; Kellie, John F.; Kelleher, Neil L.

    2011-01-01

    Applying high-throughput Top-Down MS to an entire proteome requires a yet-to-be-established model for data processing. Since Top-Down is becoming possible on a large scale, we report our latest software pipeline dedicated to capturing the full value of intact protein data in automated fashion. For intact mass detection, we combine algorithms for processing MS1 data from both isotopically resolved (FT) and charge-state resolved (ion trap) LC-MS data, which are then linked to their fragment ions for database searching using ProSight. Automated determination of human keratin and tubulin isoforms is one result. Optimized for the intricacies of whole proteins, new software modules visualize proteome-scale data based on the LC retention time and intensity of intact masses and enable selective detection of PTMs to automatically screen for acetylation, phosphorylation, and methylation. Software functionality was demonstrated using comparative LC-MS data from yeast strains in addition to human cells undergoing chemical stress. We present these advances as a key aspect of realizing Top-Down MS on a proteomic scale. PMID:20848673

  6. Large-Scale Brain Network Coupling Predicts Total Sleep Deprivation Effects on Cognitive Capacity

    PubMed Central

    Wang, Lubin; Zhai, Tianye; Zou, Feng; Ye, Enmao; Jin, Xiao; Li, Wuju; Qi, Jianlin; Yang, Zheng

    2015-01-01

    Interactions between large-scale brain networks have received the most attention in the study of cognitive dysfunction of the human brain. In this paper, we aimed to test the hypothesis that the coupling strength of large-scale brain networks reflects the pressure for sleep and predicts cognitive performance, referred to as the sleep pressure index (SPI). Fourteen healthy subjects underwent this within-subject functional magnetic resonance imaging (fMRI) study during rested wakefulness (RW) and after 36 h of total sleep deprivation (TSD). Self-reported scores of sleepiness were higher for TSD than for RW. A subsequent working memory (WM) task showed that WM performance was lower after 36 h of TSD. Moreover, the SPI was developed based on the coupling strength of the salience network (SN) and the default mode network (DMN). A significant increase of the SPI was observed after 36 h of TSD, suggesting stronger pressure for sleep. In addition, the SPI was significantly correlated with both the visual analogue scale score of sleepiness and WM performance. These results show that alterations in SN-DMN coupling may be critical in the cognitive alterations that underlie the lapses after TSD. Further studies may validate the SPI as a potential clinical biomarker to assess the impact of sleep deprivation. PMID:26218521
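
    The abstract defines the SPI through the coupling strength of the SN and DMN without giving the computation; a common reading is the correlation between the two networks' mean time series, sketched below. The ROI assignments and preprocessing are hypothetical, not the paper's pipeline.

    ```python
    import numpy as np

    def network_coupling(roi_ts, sn_idx, dmn_idx):
        """Coupling strength between two large-scale networks, taken as
        the Pearson correlation of their mean ROI time series.

        roi_ts : array of shape (n_timepoints, n_rois) from fMRI
        """
        sn = roi_ts[:, sn_idx].mean(axis=1)    # salience network signal
        dmn = roi_ts[:, dmn_idx].mean(axis=1)  # default mode network signal
        return np.corrcoef(sn, dmn)[0, 1]

    # Toy usage with hypothetical ROI assignments for each network.
    ts = np.random.default_rng(3).normal(size=(180, 90))
    print(network_coupling(ts, sn_idx=list(range(10)), dmn_idx=list(range(10, 25))))
    ```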

  7. A small-scale, rolled-membrane microfluidic artificial lung designed towards future large area manufacturing.

    PubMed

    Thompson, A J; Marks, L H; Goudie, M J; Rojas-Pena, A; Handa, H; Potkay, J A

    2017-03-01

    Artificial lungs have been used in the clinic for multiple decades to supplement patient pulmonary function. Recently, small-scale microfluidic artificial lungs (μAL) have been demonstrated with large surface area to blood volume ratios, biomimetic blood flow paths, and pressure drops compatible with pumpless operation. Initial small-scale microfluidic devices with blood flow rates in the μl/min to ml/min range have exhibited excellent gas transfer efficiencies; however, current manufacturing techniques may not be suitable for scaling up to human applications. Here, we present a new manufacturing technology for a microfluidic artificial lung in which the structure is assembled via a continuous "rolling" and bonding procedure from a single, patterned layer of polydimethylsiloxane (PDMS). This method is demonstrated in a small-scale four-layer device, but is expected to easily scale to larger area devices. The presented devices have a biomimetic branching blood flow network, 10 μm tall artificial capillaries, and a 66 μm thick gas transfer membrane. Gas transfer efficiency in blood was evaluated over a range of blood flow rates (0.1-1.25 ml/min) for two different sweep gases (pure O2, atmospheric air). The achieved gas transfer data closely follow predicted theoretical values for oxygenation and CO2 removal, while pressure drop is marginally higher than predicted. This work is the first step in developing a scalable method for creating large area microfluidic artificial lungs. Although designed for microfluidic artificial lungs, the presented technique is expected to result in the first manufacturing method capable of simply and easily creating large area microfluidic devices from PDMS.

  8. The influence of large-scale wind power on global climate.

    PubMed

    Keith, David W; Decarolis, Joseph F; Denkenberger, David C; Lenschow, Donald H; Malyshev, Sergey L; Pacala, Stephen; Rasch, Philip J

    2004-11-16

    Large-scale use of wind power can alter local and global climate by extracting kinetic energy and altering turbulent transport in the atmospheric boundary layer. We report climate-model simulations that address the possible climatic impacts of wind power at regional to global scales by using two general circulation models and several parameterizations of the interaction of wind turbines with the boundary layer. We find that very large amounts of wind power can produce nonnegligible climatic change at continental scales. Although large-scale effects are observed, wind power has a negligible effect on global-mean surface temperature, and it would deliver enormous global benefits by reducing emissions of CO2 and air pollutants. Our results may enable a comparison between the climate impacts due to wind power and the reduction in climatic impacts achieved by the substitution of wind for fossil fuels.

  9. Large-Angular-Scale Clustering as a Clue to the Source of UHECRs

    NASA Astrophysics Data System (ADS)

    Berlind, Andreas A.; Farrar, Glennys R.

    We explore what can be learned about the sources of UHECRs from their large-angular-scale clustering (referred to as their "bias" by the cosmology community). Exploiting the clustering on large scales has the advantage over small-scale correlations of being insensitive to uncertainties in source direction from magnetic smearing or measurement error. In a Cold Dark Matter cosmology, the amplitude of large-scale clustering depends on the mass of the system, with more massive systems such as galaxy clusters clustering more strongly than less massive systems such as ordinary galaxies or AGN. Therefore, studying the large-scale clustering of UHECRs can help determine a mass scale for their sources, given the assumption that their redshift depth is as expected from the GZK cutoff. We investigate the constraining power of a given UHECR sample as a function of its cutoff energy and number of events. We show that current and future samples should be able to distinguish between the cases of their sources being galaxy clusters, ordinary galaxies, or sources that are uncorrelated with the large-scale structure of the universe.

  10. Integrating an agent-based model into a large-scale hydrological model for evaluating drought management in California

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; He, X.; Wada, Y.; Burek, P.; Kahil, M.; Wood, E. F.; Oppenheimer, M.

    2017-12-01

    California has endured record-breaking drought since winter 2011 and will likely experience more severe and persistent drought in the coming decades under a changing climate. At the same time, human water management practices can also affect drought frequency and intensity, which underscores the importance of human behaviour in effective drought adaptation and mitigation. Currently, although a few large-scale hydrological and water resources models (e.g., PCR-GLOBWB) consider human water use and management practices (e.g., irrigation, reservoir operation, groundwater pumping), none of them includes the dynamic feedback between local human behaviors/decisions and the natural hydrological system. It is, therefore, vital to integrate social and behavioral dimensions into current hydrological modeling frameworks. This study applies the agent-based modeling (ABM) approach and couples it with a large-scale hydrological model (i.e., the Community Water Model, CWatM) in order to have a balanced representation of social, environmental and economic factors and a more realistic representation of the bi-directional interactions and feedbacks in coupled human and natural systems. In this study, we focus on drought management in California and consider two types of agents, (groups of) farmers and state management authorities, whose objectives are assumed to be maximizing the net crop profit and maintaining a sufficient water supply, respectively. Farmers' behaviors are linked with local agricultural practices such as cropping patterns and deficit irrigation. More precisely, farmers' decisions are incorporated into CWatM across different time scales, in terms of the daily irrigation amount, seasonal/annual decisions on crop types and irrigated area, and long-term investment in irrigation infrastructure. This simulation-based optimization framework is further applied by performing different sets of scenarios to investigate and evaluate the effectiveness
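
    A minimal sketch of the coupling logic, under loud assumptions: the Authority and Farmer classes, their attributes, and the units are invented for illustration and do not reproduce CWatM's actual interface. A management-authority agent curtails allocations as storage falls, farmer agents respond with deficit irrigation, and withdrawals feed back into the water balance.

    ```python
    # Toy ABM-hydrology coupling; all names and numbers are hypothetical.
    class Authority:
        def __init__(self, reservoir_storage, target_storage):
            self.storage, self.target = reservoir_storage, target_storage

        def allocation_factor(self):
            # Curtail deliveries when storage drops below the management target.
            return min(1.0, self.storage / self.target)

    class Farmer:
        def __init__(self, demand_per_day, deficit_tolerance=0.6):
            self.demand, self.tolerance = demand_per_day, deficit_tolerance

        def irrigate(self, factor):
            # Deficit irrigation: never drop below the tolerated fraction of demand.
            return self.demand * max(factor, self.tolerance)

    def step_day(authority, farmers, inflow):
        deliveries = [f.irrigate(authority.allocation_factor()) for f in farmers]
        authority.storage += inflow - sum(deliveries)  # feedback to hydrology state
        return deliveries

    auth = Authority(reservoir_storage=800.0, target_storage=1000.0)
    farms = [Farmer(5.0), Farmer(8.0)]
    print(step_day(auth, farms, inflow=6.0))  # daily water delivered per farm
    ```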

  11. Affective video retrieval: violence detection in Hollywood movies by large-scale segmental feature extraction.

    PubMed

    Eyben, Florian; Weninger, Felix; Lehment, Nicolas; Schuller, Björn; Rigoll, Gerhard

    2013-01-01

    Without doubt, general video and sound, as found in large multimedia archives, carry emotional information. Thus, audio and video retrieval by certain emotional categories or dimensions could play a central role for tomorrow's intelligent systems, enabling search for movies with a particular mood, computer-aided scene and sound design in order to elicit certain emotions in the audience, etc. Yet, the lion's share of research in affective computing focuses exclusively on signals conveyed by humans, such as affective speech. Uniting the fields of multimedia retrieval and affective computing is believed to lead to a multiplicity of interesting retrieval applications, and at the same time to benefit affective computing research by moving its methodology "out of the lab" to real-world, diverse data. In this contribution, we address the problem of finding "disturbing" scenes in movies, a scenario that is highly relevant for computer-aided parental guidance. We apply large-scale segmental feature extraction combined with audio-visual classification to the particular task of detecting violence. Our system performs fully data-driven analysis, including automatic segmentation. We evaluate the system in terms of mean average precision (MAP) on the official data set of the MediaEval 2012 evaluation campaign's Affect Task, which consists of 18 original Hollywood movies, achieving up to 0.398 MAP on unseen test data in full realism. An in-depth analysis of the worth of individual features with respect to the target class and of the system errors reveals the importance of peak-related audio feature extraction and low-level histogram-based video analysis.
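
    Mean average precision, the metric used above, can be computed in a few lines; a minimal sketch over hypothetical ranked segment labels (1 = violent, 0 = not):

    ```python
    def average_precision(labels_by_rank):
        """AP for one query: labels_by_rank holds 0/1 relevance in ranked order."""
        hits, total = 0, 0.0
        for rank, rel in enumerate(labels_by_rank, start=1):
            if rel:
                hits += 1
                total += hits / rank  # precision at each relevant position
        return total / hits if hits else 0.0

    def mean_average_precision(queries):
        return sum(average_precision(q) for q in queries) / len(queries)

    # Two "movies": ranked segments labeled violent (1) or not (0).
    print(mean_average_precision([[1, 0, 1, 0], [0, 1, 1]]))  # -> ~0.708
    ```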

  12. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by asking whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and require fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching and burial of seed from a dispersal process into a fire-adaptive trait of the plant, and provides the context for stimulating subsequent life-history evolution in the plant host. PMID:26151560

  13. Out of the net: An agent-based model to study human movements influence on local-scale malaria transmission.

    PubMed

    Pizzitutti, Francesco; Pan, William; Feingold, Beth; Zaitchik, Ben; Álvarez, Carlos A; Mena, Carlos F

    2018-01-01

    Though malaria control initiatives have markedly reduced malaria prevalence in recent decades, global eradication is far from actuality. Recent studies show that environmental and social heterogeneities in low-transmission settings have an increased weight in shaping malaria micro-epidemiology. New integrated and more localized control strategies should therefore be developed and tested. Here we present a set of agent-based models designed to study the influence of local-scale human movements on local-scale malaria transmission in a typical Amazon environment, where malaria transmission is low and strongly connected with seasonal riverine flooding. The agent-based simulations show that overall malaria incidence is essentially not influenced by local-scale human movements. In contrast, the locations of malaria high-risk spatial hotspots depend heavily on human movements, because simulated malaria hotspots are mainly centered on farms, where laborers work during the day. The agent-based models are then used to test the effectiveness of two different malaria control strategies, both designed to reduce local-scale malaria incidence by targeting hotspots. The first scenario consists of treating against mosquito bites all people who, during the simulation, enter at least once a hotspot identified from the actual sites where individuals were infected. The second scenario involves treating people entering hotspots computed under the assumption that each infected individual's infection site is the household where that individual lives. Simulations show that both scenarios perform better in controlling malaria than a randomized treatment, although targeting household hotspots shows slightly better performance.
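
    The two hotspot-targeting rules can be contrasted with a toy sketch: rank locations by infection count, once using actual infection sites and once attributing every case to the patient's household. The location names and counts below are invented.

    ```python
    from collections import Counter

    def hotspots(infection_locations, top_k=2):
        """Rank locations by infection count; the top-k are 'hotspots'."""
        return {loc for loc, _ in Counter(infection_locations).most_common(top_k)}

    # Scenario 1: hotspots from actual infection sites (day-time farm exposure).
    actual_sites = ["farm_A", "farm_A", "farm_B", "farm_B", "village", "farm_A"]
    # Scenario 2: every infection attributed to the case's home household.
    households = ["house_1", "house_2", "house_1", "house_3", "house_1", "house_2"]

    print("site-based hotspots:     ", hotspots(actual_sites))
    print("household-based hotspots:", hotspots(households))
    ```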

  14. TARGET Publication Guidelines | Office of Cancer Genomics

    Cancer.gov

    Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are

  15. Cedar-a large scale multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.; Kuck, D.; Lawrie, D.

    1983-01-01

    This paper presents an overview of Cedar, a large-scale multiprocessor being designed at the University of Illinois. This machine is designed to accommodate several thousand high-performance processors that can work together on a single job, or be partitioned into groups of one or more processors, each working on a separate job. Various aspects of the machine are described, including the control methodology, communication network, optimizing compiler, and plans for construction. 13 references.

  16. Eyjafjallajökull and 9/11: The Impact of Large-Scale Disasters on Worldwide Mobility

    PubMed Central

    Woolley-Meza, Olivia; Grady, Daniel; Thiemann, Christian; Bagrow, James P.; Brockmann, Dirk

    2013-01-01

    Large-scale disasters that interfere with globalized socio-technical infrastructure, such as mobility and transportation networks, trigger high socio-economic costs. Although the origin of such events is often geographically confined, their impact reverberates through entire networks in ways that are poorly understood, difficult to assess, and even more difficult to predict. We investigate how the eruption of volcano Eyjafjallajökull, the September 11th terrorist attacks, and geographical disruptions in general interfere with worldwide mobility. To do this we track changes in effective distance in the worldwide air transportation network from the perspective of individual airports. We find that universal features exist across these events: airport susceptibilities to regional disruptions follow similar, strongly heterogeneous distributions that lack a scale. On the other hand, airports are more uniformly susceptible to attacks that target the most important hubs in the network, exhibiting a well-defined scale. The statistical behavior of susceptibility can be characterized by a single scaling exponent. Using scaling arguments that capture the interplay between individual airport characteristics and the structural properties of routes we can recover the exponent for all types of disruption. We find that the same mechanisms responsible for efficient passenger flow may also keep the system in a vulnerable state. Our approach can be applied to understand the impact of large, correlated disruptions in financial systems, ecosystems and other systems with a complex interaction structure between heterogeneous components. PMID:23950904
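
    Effective distance of the kind used in this line of work is commonly defined from transition probabilities P(m→n), the fraction of traffic leaving m that goes to n, as d = 1 − ln P, with airport-to-airport distances given by shortest effective paths. A minimal sketch with networkx and invented flows (not the paper's data or exact estimator):

    ```python
    import math
    import networkx as nx

    # Passenger flows between airports (illustrative numbers, not real data).
    flows = {("FRA", "JFK"): 500, ("FRA", "KEF"): 50, ("KEF", "JFK"): 30}

    G = nx.DiGraph()
    for (src, dst), f in flows.items():
        G.add_edge(src, dst, flow=f)

    for node in G.nodes:
        out = sum(G[node][nbr]["flow"] for nbr in G.successors(node))
        for nbr in G.successors(node):
            p = G[node][nbr]["flow"] / out           # transition probability
            G[node][nbr]["eff"] = 1.0 - math.log(p)  # effective distance

    # Effective distance from FRA to every reachable airport.
    print(nx.single_source_dijkstra_path_length(G, "FRA", weight="eff"))
    ```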

  17. Understanding large-scale, long-term larval connectivity patterns: The case of the Northern Line Islands in the Central Pacific Ocean

    PubMed Central

    Mari, Lorenzo; Bonaventura, Luca; Storto, Andrea; Melià, Paco; Gatto, Marino; Masina, Simona

    2017-01-01

    Protecting key hotspots of marine biodiversity is essential to maintain ecosystem services at large spatial scales. Protected areas serve not only as sources of propagules colonizing other habitats, but also as receptors, thus acting as protected nurseries. To quantify the geographical extent and the temporal persistence of ecological benefits resulting from protection, we investigate larval connectivity within a remote archipelago, characterized by a strong spatial gradient of human impact from pristine to heavily exploited: the Northern Line Islands (NLIs), including part of the Pacific Remote Islands Marine National Monument (PRI-MNM). Larvae are described as passive Lagrangian particles transported by oceanic currents obtained from an oceanographic reanalysis. We compare different simulation schemes and compute connectivity measures (larval exchange probabilities and minimum/average larval dispersal distances from target islands). To explore the role of PRI-MNM in protecting marine organisms with pelagic larval stages, we run millions of individual-based simulations for various Pelagic Larval Durations (PLDs), in all release seasons, and over a two-decade time horizon (1991–2010). We find that connectivity in the NLIs is spatially asymmetric and displays significant intra- and inter-annual variations. The islands belonging to PRI-MNM act more as sinks than sources of larvae, and connectivity is higher during the winter-spring period. In multi-annual analyses, yearly averaged southward connectivity significantly and negatively correlates with climatological anomalies (El Niño). This points out a possible system fragility and susceptibility to global warming. Quantitative assessments of large-scale, long-term marine connectivity patterns help understand region-specific, ecologically relevant interactions between islands. This is fundamental for devising scientifically-based protection strategies, which must be space- and time-varying to cope with the

  18. Understanding large-scale, long-term larval connectivity patterns: The case of the Northern Line Islands in the Central Pacific Ocean.

    PubMed

    Mari, Lorenzo; Bonaventura, Luca; Storto, Andrea; Melià, Paco; Gatto, Marino; Masina, Simona; Casagrandi, Renato

    2017-01-01

    Protecting key hotspots of marine biodiversity is essential to maintain ecosystem services at large spatial scales. Protected areas serve not only as sources of propagules colonizing other habitats, but also as receptors, thus acting as protected nurseries. To quantify the geographical extent and the temporal persistence of ecological benefits resulting from protection, we investigate larval connectivity within a remote archipelago, characterized by a strong spatial gradient of human impact from pristine to heavily exploited: the Northern Line Islands (NLIs), including part of the Pacific Remote Islands Marine National Monument (PRI-MNM). Larvae are described as passive Lagrangian particles transported by oceanic currents obtained from an oceanographic reanalysis. We compare different simulation schemes and compute connectivity measures (larval exchange probabilities and minimum/average larval dispersal distances from target islands). To explore the role of PRI-MNM in protecting marine organisms with pelagic larval stages, we run millions of individual-based simulations for various Pelagic Larval Durations (PLDs), in all release seasons, and over a two-decade time horizon (1991-2010). We find that connectivity in the NLIs is spatially asymmetric and displays significant intra- and inter-annual variations. The islands belonging to PRI-MNM act more as sinks than sources of larvae, and connectivity is higher during the winter-spring period. In multi-annual analyses, yearly averaged southward connectivity significantly and negatively correlates with climatological anomalies (El Niño). This points out a possible system fragility and susceptibility to global warming. Quantitative assessments of large-scale, long-term marine connectivity patterns help understand region-specific, ecologically relevant interactions between islands. This is fundamental for devising scientifically-based protection strategies, which must be space- and time-varying to cope with the
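
    The Lagrangian transport step can be sketched as forward-Euler advection of passive particles through a current field, with settlement scored by proximity to an island. Everything below (the synthetic current, island coordinates, settlement radius, and the step count standing in for PLD) is an invented stand-in for the reanalysis-driven simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def advect(positions, velocity, dt=1.0, steps=30):
        """Forward-Euler advection of passive Lagrangian particles."""
        for _ in range(steps):
            positions = positions + velocity(positions) * dt
        return positions

    # Synthetic, steady southward current with small eddy noise.
    def velocity(p):
        return np.array([-0.005, -0.05]) + 0.01 * rng.standard_normal(p.shape)

    islands = {"Kingman": np.array([0.0, 6.0]),
               "Palmyra": np.array([0.0, 4.5]),
               "Kiritimati": np.array([-1.0, 0.0])}

    # Release 5000 larvae around Kingman; steps plays the role of PLD.
    release = islands["Kingman"] + 0.05 * rng.standard_normal((5000, 2))
    final = advect(release, velocity, dt=1.0, steps=30)

    for name, pos in islands.items():
        settled = np.linalg.norm(final - pos, axis=1) < 0.5  # settlement radius
        print(f"P(Kingman -> {name}) = {settled.mean():.3f}")
    ```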

  19. Economically sustainable scaling of photovoltaics to meet climate targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Needleman, David Berney; Poindexter, Jeremy R.; Kurchin, Rachel C.

    To meet climate targets, power generation capacity from photovoltaics (PV) in 2030 will have to be much greater than is predicted from either steady state growth using today's manufacturing capacity or industry roadmaps. Analysis of whether current technology can scale, in an economically sustainable way, to sufficient levels to meet these targets has not yet been undertaken, nor have tools to perform this analysis been presented. Here, we use bottom-up cost modeling to predict cumulative capacity as a function of technological and economic variables. We find that today's technology falls short in two ways: profits are too small relative to upfront factory costs to grow manufacturing capacity rapidly enough to meet climate targets, and costs are too high to generate enough demand to meet climate targets. We show that decreasing the capital intensity (capex) of PV manufacturing to increase manufacturing capacity and effectively reducing cost (e.g., through higher efficiency) to increase demand are the most effective and least risky ways to address these barriers to scale. We also assess the effects of variations in demand due to hard-to-predict factors, like public policy, on the necessary reductions in cost. Lastly, we review examples of redundant technology pathways for crystalline silicon PV to achieve the necessary innovations in capex, performance, and price.

  20. Economically sustainable scaling of photovoltaics to meet climate targets

    DOE PAGES

    Needleman, David Berney; Poindexter, Jeremy R.; Kurchin, Rachel C.; ...

    2016-04-21

    To meet climate targets, power generation capacity from photovoltaics (PV) in 2030 will have to be much greater than is predicted from either steady state growth using today's manufacturing capacity or industry roadmaps. Analysis of whether current technology can scale, in an economically sustainable way, to sufficient levels to meet these targets has not yet been undertaken, nor have tools to perform this analysis been presented. Here, we use bottom-up cost modeling to predict cumulative capacity as a function of technological and economic variables. We find that today's technology falls short in two ways: profits are too small relative to upfront factory costs to grow manufacturing capacity rapidly enough to meet climate targets, and costs are too high to generate enough demand to meet climate targets. We show that decreasing the capital intensity (capex) of PV manufacturing to increase manufacturing capacity and effectively reducing cost (e.g., through higher efficiency) to increase demand are the most effective and least risky ways to address these barriers to scale. We also assess the effects of variations in demand due to hard-to-predict factors, like public policy, on the necessary reductions in cost. Lastly, we review examples of redundant technology pathways for crystalline silicon PV to achieve the necessary innovations in capex, performance, and price.
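
    The growth constraint can be captured in a toy compounding model: capacity grows each year only as fast as retained profit can fund new factory capex, so the sustainable growth rate scales roughly as margin x price / capex. The numbers below are illustrative assumptions, not the paper's inputs.

    ```python
    def project_capacity(cap_gw, years, price, margin, capex_per_w, reinvest=1.0):
        """Grow PV manufacturing capacity using retained profit to fund new lines.

        price and capex_per_w in $/W; margin is the profit fraction of revenue.
        """
        for _ in range(years):
            profit_per_w = price * margin * reinvest
            growth_rate = profit_per_w / capex_per_w  # new W fundable per W made
            cap_gw *= 1.0 + growth_rate
        return cap_gw

    # Illustrative: today's-capex vs reduced-capex scenarios over 14 years.
    print(f"high capex: {project_capacity(60, 14, 0.5, 0.1, 1.0):6.0f} GW/yr")
    print(f"low capex:  {project_capacity(60, 14, 0.5, 0.1, 0.3):6.0f} GW/yr")
    ```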

  1. Understanding Protein Synthesis: A Role-Play Approach in Large Undergraduate Human Anatomy and Physiology Classes

    ERIC Educational Resources Information Center

    Sturges, Diana; Maurer, Trent W.; Cole, Oladipo

    2009-01-01

    This study investigated the effectiveness of role play in a large undergraduate science class. The targeted population consisted of 298 students enrolled in 2 sections of an undergraduate Human Anatomy and Physiology course taught by the same instructor. The section engaged in the role-play activity served as the study group, whereas the section…

  2. An interactive display system for large-scale 3D models

    NASA Astrophysics Data System (ADS)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult for common 3D display software, such as MeshLab, to achieve real-time display of and interaction with large-scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
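
    View-dependent LOD selection of the kind described can be sketched as choosing, per frame, the coarsest detail level whose projected screen-space error stays under a pixel budget. The error values and camera parameters below are invented, and real systems apply this per node of the LOD hierarchy rather than per model.

    ```python
    import math

    def select_lod(geometric_errors, distance, fov_y_rad, viewport_h_px,
                   max_px_error=1.0):
        """Pick the coarsest LOD whose projected error fits the pixel budget.

        geometric_errors: object-space error per level, finest (0) to coarsest.
        """
        px_per_unit = viewport_h_px / (2.0 * distance * math.tan(fov_y_rad / 2.0))
        best = 0
        for level, err in enumerate(geometric_errors):
            if err * px_per_unit <= max_px_error:
                best = level  # coarser level is still within budget
        return best

    errors = [0.001, 0.004, 0.016, 0.064]  # metres of error per level
    for d in (2.0, 10.0, 50.0, 250.0):
        print(f"distance {d:6.1f} m -> LOD {select_lod(errors, d, math.radians(60), 1080)}")
    ```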

  3. Decoupling local mechanics from large-scale structure in modular metamaterials.

    PubMed

    Yang, Nan; Silverberg, Jesse L

    2017-04-04

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such "inverse design" is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module's design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  4. Decoupling local mechanics from large-scale structure in modular metamaterials

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  5. Large- and small-scale constraints on power spectra in Omega = 1 universes

    NASA Technical Reports Server (NTRS)

    Gelb, James M.; Gradwohl, Ben-Ami; Frieman, Joshua A.

    1993-01-01

    The CDM model of structure formation, normalized on large scales, leads to excessive pairwise velocity dispersions on small scales. In an attempt to circumvent this problem, we study three scenarios (all with Omega = 1) with more large-scale and less small-scale power than the standard CDM model: (1) cold dark matter with significantly reduced small-scale power (inspired by models with an admixture of cold and hot dark matter); (2) cold dark matter with a non-scale-invariant power spectrum; and (3) cold dark matter with coupling of dark matter to a long-range vector field. When normalized to COBE on large scales, such models do lead to reduced velocities on small scales and they produce fewer halos compared with CDM. However, models with sufficiently low small-scale velocities apparently fail to produce an adequate number of halos.

  6. Linking crop yield anomalies to large-scale atmospheric circulation in Europe.

    PubMed

    Ceglar, Andrej; Turco, Marco; Toreti, Andrea; Doblas-Reyes, Francisco J

    2017-06-15

    Understanding the effects of climate variability and extremes on crop growth and development represents a necessary step to assess the resilience of agricultural systems to changing climate conditions. This study investigates the links between the large-scale atmospheric circulation and crop yields in Europe, providing the basis to develop seasonal crop yield forecasting and thus enabling a more effective and dynamic adaptation to climate variability and change. Four dominant modes of large-scale atmospheric variability have been used: North Atlantic Oscillation, Eastern Atlantic, Scandinavian and Eastern Atlantic-Western Russia patterns. Large-scale atmospheric circulation explains on average 43% of inter-annual winter wheat yield variability, ranging between 20% and 70% across countries. As for grain maize, the average explained variability is 38%, ranging between 20% and 58%. Spatially, the skill of the developed statistical models strongly depends on the large-scale atmospheric variability impact on weather at the regional level, especially during the most sensitive growth stages of flowering and grain filling. Our results also suggest that preceding atmospheric conditions might provide an important source of predictability especially for maize yields in south-eastern Europe. Since the seasonal predictability of large-scale atmospheric patterns is generally higher than the one of surface weather variables (e.g. precipitation) in Europe, seasonal crop yield prediction could benefit from the integration of derived statistical models exploiting the dynamical seasonal forecast of large-scale atmospheric circulation.
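
    The statistical link amounts to regressing detrended yield anomalies on seasonal indices of the four modes and reading off the explained variance. A minimal sketch on synthetic data (the indices and sensitivities are random stand-ins, not actual NAO/EA/SCAND/EA-WR values):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_years = 30

    # Synthetic seasonal indices for the four circulation modes.
    X = rng.standard_normal((n_years, 4))        # NAO, EA, SCAND, EA-WR stand-ins
    true_beta = np.array([0.5, -0.3, 0.4, 0.1])  # imposed yield sensitivities
    yield_anom = X @ true_beta + 0.5 * rng.standard_normal(n_years)

    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(n_years), X])
    beta, *_ = np.linalg.lstsq(A, yield_anom, rcond=None)

    resid = yield_anom - A @ beta
    r2 = 1.0 - resid.var() / yield_anom.var()
    print(f"explained variance R^2 = {r2:.2f}")  # lands in the paper's 20-70% range
    ```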

  7. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    PubMed

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.

  8. Effects of large-scale wind driven turbulence on sound propagation

    NASA Technical Reports Server (NTRS)

    Noble, John M.; Bass, Henry E.; Raspet, Richard

    1990-01-01

    Acoustic measurements made in the atmosphere have shown significant fluctuations in amplitude and phase resulting from the interaction with time-varying meteorological conditions. The observed signal exhibits both short-term and long-term (1 to 5 minute) variations, at least in the phase of the acoustic signal. One possible way to account for the long-term variation is the use of a large-scale wind-driven turbulence model. From a Fourier analysis of the phase variations, the outer scale of the large-scale turbulence is 200 meters or greater, which corresponds to turbulence in the energy-containing subrange. The large-scale turbulence is assumed to consist of elongated longitudinal vortex pairs roughly aligned with the mean wind. Due to the size of the vortex pair compared to the scale of the present experiment, the effect of the vortex pair on the acoustic field can be modeled as the sound speed of the atmosphere varying with time. The model provides results with the same trends and variations in phase observed experimentally.

  9. The social brain: scale-invariant layering of Erdős-Rényi networks in small-scale human societies.

    PubMed

    Harré, Michael S; Prokopenko, Mikhail

    2016-05-01

    The cognitive ability to form social links that can bind individuals together into large cooperative groups for safety and resource sharing was a key development in human evolutionary and social history. The 'social brain hypothesis' argues that the size of these social groups is based on a neurologically constrained capacity for maintaining long-term stable relationships. No model to date has been able to combine a specific socio-cognitive mechanism with the discrete scale invariance observed in ethnographic studies. We show that these properties result in nested layers of self-organizing Erdős-Rényi networks formed by each individual's ability to maintain only a small number of social links. Each set of links plays a specific role in the formation of different social groups. The scale invariance in our model is distinct from previous 'scale-free networks' studied using much larger social groups; here, the scale invariance is in the relationship between group sizes, rather than in the link degree distribution. We also compare our model with a dominance-based hierarchy and conclude that humans were probably egalitarian in hunter-gatherer-like societies, maintaining an average maximum of four or five social links connecting all members in a maximal social network of around 132 people. © 2016 The Author(s).
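
    The discrete scale invariance in group sizes can be reproduced arithmetically: start from an innermost support group set by the ~4-5 maintained links and multiply by a constant inter-layer ratio. The specific numbers below are chosen to land near the reported ~132, not fitted to data.

    ```python
    # Nested group sizes under discrete scale invariance between layers.
    links_per_person = 4.4   # average maximum number of maintained social links
    ratio = 2.9              # assumed constant scaling ratio between layers

    size = links_per_person + 1  # innermost group: ego plus direct links
    for layer in range(4):
        print(f"layer {layer}: group size ~ {size:5.1f}")  # ends near ~132
        size *= ratio
    ```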

  10. The Design and Evaluation of a Large-Scale Real-Walking Locomotion Interface

    PubMed Central

    Peck, Tabitha C.; Fuchs, Henry; Whitton, Mary C.

    2014-01-01

    Redirected Free Exploration with Distractors (RFED) is a large-scale real-walking locomotion interface developed to enable people to walk freely in virtual environments that are larger than the tracked space in their facility. This paper describes the RFED system in detail and reports on a user study that evaluated RFED by comparing it to walking-in-place and joystick interfaces. The RFED system is composed of two major components, redirection and distractors. This paper discusses design challenges, implementation details, and lessons learned during the development of two working RFED systems. The evaluation study examined the effect of the locomotion interface on users' cognitive performance on navigation and wayfinding measures. The results suggest that participants using RFED were significantly better at navigating and wayfinding through virtual mazes than participants using walking-in-place and joystick interfaces. Participants traveled shorter distances, made fewer wrong turns, pointed to hidden targets more accurately and more quickly, placed and labeled targets on maps more accurately, and estimated the size of the virtual environment more accurately. PMID:22184262

  11. The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation

    NASA Astrophysics Data System (ADS)

    Noh, Yookyung

    The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large deep surveys in multi-wavelength bands are becoming possible. The observational analysis of large-scale structure, guided by large-volume numerical simulations, is beginning to offer us complementary information and crosschecks of cosmological parameters estimated from the anisotropies in Cosmic Microwave Background (CMB) radiation. Understanding structure formation and evolution, and even galaxy formation history, is also being aided by observations of different redshift snapshots of the Universe using various tracers of large-scale structure. This dissertation covers aspects of large-scale structure from the baryon acoustic oscillation scale to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological-volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high-resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.

  12. Large-Scale Atmospheric Teleconnection Patterns Associated with the Interannual Variability of Heatwaves in East Asia and Its Decadal Changes

    NASA Astrophysics Data System (ADS)

    Choi, N.; Lee, M. I.; Lim, Y. K.; Kim, K. M.

    2017-12-01

    A heatwave is an extreme hot-weather event that can cause fatal damage to human health. Heatwaves have a strong relationship with large-scale atmospheric teleconnection patterns. In this study, we examine the spatial pattern of heatwaves in East Asia using EOF analysis and the relationship between heatwave frequency and large-scale atmospheric teleconnection patterns. We also separate heatwave frequency into a component with a time scale longer than a decade and an interannual component. The long-term variation of heatwave frequency in East Asia shows a linkage with sea surface temperature (SST) variability over the North Atlantic on a decadal time scale (a.k.a. the Atlantic Multidecadal Oscillation; AMO). On the other hand, the interannual variation of heatwave frequency is linked with two dominant spatial patterns associated with large-scale teleconnection patterns mimicking the Scandinavian teleconnection (SCAND-like) pattern and the circumglobal teleconnection (CGT-like) pattern, respectively. It is highlighted that the interannual variation of heatwave frequency in East Asia shows a remarkable change after the mid-1990s. While heatwave frequency was mainly associated with the CGT-like pattern before the mid-1990s, the SCAND-like pattern becomes the most dominant one after the mid-1990s, making the CGT-like pattern the second. This study implies that large-scale atmospheric teleconnection patterns play a key role in developing heatwave events in East Asia. This study further discusses possible mechanisms for the decadal change in the linkage between heatwave frequency and the large-scale teleconnection patterns in East Asia, such as early melting of snow cover and/or weakening of the East Asian jet stream due to global warming.
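
    The EOF analysis referred to above is a principal component analysis of the space-time anomaly field, conveniently done by SVD. A minimal sketch on a synthetic field (the grid, years, and imposed pattern are invented; note the sign of an EOF is arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_years, n_grid = 40, 500  # summers x grid points (synthetic East Asia grid)

    # Impose one coherent large-scale pattern plus noise.
    pattern = np.sin(np.linspace(0, np.pi, n_grid))
    pc_true = rng.standard_normal(n_years)
    anom = np.outer(pc_true, pattern) + 0.5 * rng.standard_normal((n_years, n_grid))

    anom -= anom.mean(axis=0)  # remove climatology per grid point
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    explained = s**2 / np.sum(s**2)

    print(f"EOF1 explains {explained[0]:.0%} of variance")
    # Sign of an EOF/PC pair is arbitrary, hence the signed correlation.
    print(f"EOF1 pattern correlation with truth: "
          f"{np.corrcoef(Vt[0], pattern)[0, 1]:+.2f}")
    ```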

  13. Targeted tandem duplication of a large chromosomal segment in Aspergillus oryzae.

    PubMed

    Takahashi, Tadashi; Sato, Atsushi; Ogawa, Masahiro; Hanya, Yoshiki; Oguma, Tetsuya

    2014-08-01

    We describe here the first successful construction of a targeted tandem duplication of a large chromosomal segment in Aspergillus oryzae. The targeted tandem chromosomal duplication was achieved by using strains that had a 5'-deleted pyrG upstream of the region targeted for tandem chromosomal duplication and a 3'-deleted pyrG downstream of the target region. Consequently, strains bearing a 210-kb targeted tandem chromosomal duplication near the centromeric region of chromosome 8 and strains bearing a targeted tandem chromosomal duplication of a 700-kb region of chromosome 2 were successfully constructed. The strains bearing the tandem chromosomal duplication were efficiently obtained from the regenerated protoplasts of the parental strains. However, the generation of the chromosomal duplication did not depend on the introduction of double-stranded breaks (DSBs) by I-SceI. The chromosomal duplications of these strains were stably maintained after five generations of culture under nonselective conditions. The strains bearing the tandem chromosomal duplication in the 700-kb region of chromosome 2 showed highly increased protease activity in solid-state culture, indicating that the duplication of large chromosomal segments could be a useful new breeding technology and gene analysis method.

  14. Large-scale Density Structures in Magneto-rotational Disk Turbulence

    NASA Astrophysics Data System (ADS)

    Youdin, Andrew; Johansen, A.; Klahr, H.

    2009-01-01

    Turbulence generated by the magneto-rotational instability (MRI) is a strong candidate to drive accretion flows in disks, including sufficiently ionized regions of protoplanetary disks. The MRI is often studied in local shearing boxes, which model a small section of the disk at high resolution. I will present simulations of large, stratified shearing boxes which extend up to 10 gas scale-heights across. These simulations are a useful bridge to fully global disk simulations. We find that MRI turbulence produces large-scale, axisymmetric density perturbations. These structures are part of a zonal flow --- analogous to the banded flow in Jupiter's atmosphere --- which survives in near geostrophic balance for tens of orbits. The launching mechanism is large-scale magnetic tension generated by an inverse cascade. We demonstrate the robustness of these results by careful study of various box sizes, grid resolutions, and microscopic diffusion parameterizations. These gas structures can trap solid material (in the form of large dust or ice particles) with important implications for planet formation. Resolved disk images at mm-wavelengths (e.g. from ALMA) will verify or constrain the existence of these structures.

  15. The role of large scale motions on passive scalar transport

    NASA Astrophysics Data System (ADS)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large-scale features of turbulence and the temperature field.
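
    The two-point correlation analysis described can be sketched with synthetic snapshot data in a few lines; the shared large-scale mode coupling the velocity and temperature fields below is an invented stand-in for DNS output (POD would additionally take an SVD of the same snapshot matrix).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_t, n_y = 5000, 128                # snapshots x wall-normal points (synthetic)
    y = np.linspace(0.0, 2.0, n_y)
    mode = np.sin(np.pi * y / 2.0)      # stand-in for a large-scale motion

    a = rng.standard_normal(n_t)        # shared large-scale amplitude
    u = np.outer(a, mode) + 0.3 * rng.standard_normal((n_t, n_y))
    theta = np.outer(0.8 * a, mode) + 0.3 * rng.standard_normal((n_t, n_y))

    def two_point_corr(f, g, i_ref):
        """Normalized two-point correlation <f'(y_ref) g'(y)> over snapshots."""
        f0 = f[:, i_ref] - f[:, i_ref].mean()
        g0 = g - g.mean(axis=0)
        return (f0[:, None] * g0).mean(axis=0) / (f0.std() * g0.std(axis=0))

    r = two_point_corr(u, theta, i_ref=n_y // 2)
    print(f"peak u-theta correlation: {r.max():.2f} at y = {y[r.argmax()]:.2f}")
    ```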

  16. Random access in large-scale DNA data storage.

    PubMed

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
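
    Primer-based random access can be mimicked in software: each file gets a primer pair, and selective PCR-plus-sequencing corresponds to filtering the pooled oligos by primer prefix and suffix. The primer sequences, index field, and chunking below are toy assumptions that omit the real system's error-correcting codes.

    ```python
    # Toy model of primer-based random access in a DNA oligo pool.
    PRIMERS = {"file_a": ("ACGTAC", "TGCATG"), "file_b": ("GGATCC", "CCTAGG")}

    def to_oligos(name, payload, chunk=8):
        fwd, rev = PRIMERS[name]
        chunks = [payload[i:i + chunk] for i in range(0, len(payload), chunk)]
        # A zero-padded index field lets us reassemble chunks after retrieval.
        return [f"{fwd}{i:03d}{c}{rev}" for i, c in enumerate(chunks)]

    def random_access(pool, name):
        fwd, rev = PRIMERS[name]
        hits = sorted(o for o in pool if o.startswith(fwd) and o.endswith(rev))
        return "".join(o[len(fwd) + 3:-len(rev)] for o in hits)

    pool = to_oligos("file_a", "ATTACAGGATTACAGA") + to_oligos("file_b", "CCCCGGGGTTTT")
    print(random_access(pool, "file_a"))  # -> ATTACAGGATTACAGA
    ```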

  17. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency

  18. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    PubMed

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Lessons Learned from Large-Scale Randomized Experiments

    ERIC Educational Resources Information Center

    Slavin, Robert E.; Cheung, Alan C. K.

    2017-01-01

    Large-scale randomized studies provide the best means of evaluating practical, replicable approaches to improving educational outcomes. This article discusses the advantages, problems, and pitfalls of these evaluations, focusing on alternative methods of randomization, recruitment, ensuring high-quality implementation, dealing with attrition, and…

  20. Nonlinear modulation of the HI power spectrum on ultra-large scales. I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umeh, Obinna; Maartens, Roy; Santos, Mario, E-mail: umeobinna@gmail.com, E-mail: roy.maartens@gmail.com, E-mail: mgrsantos@uwc.ac.za

    2016-03-01

    Intensity mapping of the neutral hydrogen brightness temperature promises to provide a three-dimensional view of the universe on very large scales. Nonlinear effects are typically thought to alter only the small-scale power, but we show how they may bias the extraction of cosmological information contained in the power spectrum on ultra-large scales. For linear perturbations to remain valid on large scales, we need to renormalize perturbations at higher order. In the case of intensity mapping, the second-order contribution to clustering from weak lensing dominates the nonlinear contribution at high redshift. Renormalization modifies the mean brightness temperature and therefore the evolution bias. It also introduces a term that mimics white noise. These effects may influence forecasting analysis on ultra-large scales.

  1. Expectation propagation for large scale Bayesian inference of non-linear molecular networks from perturbation data.

    PubMed

    Narimani, Zahra; Beigy, Hamid; Ahmad, Ashar; Masoudi-Nejad, Ali; Fröhlich, Holger

    2017-01-01

    Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by their computational complexity and thus make large scale inference infeasible. This is specifically true if a Bayesian framework is applied in order to deal with the unavoidable uncertainty about the correct model. We devise a novel Bayesian network reverse engineering approach using ordinary differential equations with the ability to include non-linearity. Besides modeling arbitrary, possibly combinatorial and time dependent perturbations with unknown targets, one of our main contributions is the use of Expectation Propagation, an algorithm for approximate Bayesian inference over large scale network structures in short computation time. We further explore the possibility of integrating prior knowledge into network inference. We evaluate the proposed model on DREAM4 and DREAM8 data and find it competitive against several state-of-the-art existing network inference methods.

  2. Impact of large-scale tides on cosmological distortions via redshift-space power spectrum

    NASA Astrophysics Data System (ADS)

    Akitsu, Kazuyuki; Takada, Masahiro

    2018-03-01

    Although large-scale perturbations beyond a finite-volume survey region are not direct observables, these affect measurements of clustering statistics of small-scale (subsurvey) perturbations in large-scale structure, compared with the ensemble average, via the mode-coupling effect. In this paper we show that a large-scale tide induced by scalar perturbations causes apparent anisotropic distortions in the redshift-space power spectrum of galaxies in a way depending on an alignment between the tide, wave vector of small-scale modes and line-of-sight direction. Using the perturbation theory of structure formation, we derive a response function of the redshift-space power spectrum to large-scale tide. We then investigate the impact of large-scale tide on estimation of cosmological distances and the redshift-space distortion parameter via the measured redshift-space power spectrum for a hypothetical large-volume survey, based on the Fisher matrix formalism. To do this, we treat the large-scale tide as a signal, rather than an additional source of the statistical errors, and show that a degradation in the parameter is restored if we can employ the prior on the rms amplitude expected for the standard cold dark matter (CDM) model. We also discuss whether the large-scale tide can be constrained at an accuracy better than the CDM prediction, if the effects up to a larger wave number in the nonlinear regime can be included.

  3. A topology visualization early warning distribution algorithm for large-scale network security incidents.

    PubMed

    He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such systems can improve a network's emergency response capabilities, alleviate the damage done by cyber attacks, and strengthen the system's ability to counterattack. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
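
    The divide-and-conquer idea can be sketched with networkx: partition the graph, lay out each subnetwork independently, then place the parts on a coarse grid. Louvain communities and a grid placement stand in here for the MLkP/CR partitioner and the force-analysis distribution algorithm of the paper.

    ```python
    import networkx as nx
    from networkx.algorithms.community import louvain_communities

    # Divide-and-conquer layout sketch: partition, lay out parts, offset them.
    G = nx.random_geometric_graph(300, 0.12, seed=5)
    parts = louvain_communities(G, seed=5)

    pos = {}
    for k, nodes in enumerate(parts):
        sub = G.subgraph(nodes)
        local = nx.spring_layout(sub, seed=5)  # per-subnetwork force layout
        row, col = divmod(k, 4)                # coarse grid placement of parts
        for n, (x, y) in local.items():
            pos[n] = (x + 3.0 * col, y + 3.0 * row)

    print(f"{len(parts)} subnetworks laid out, {len(pos)} nodes positioned")
    ```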

  4. Honeycomb: Visual Analysis of Large Scale Social Networks

    NASA Astrophysics Data System (ADS)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

    The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not well suited to the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger-scale data (with millions of connections), which we illustrate by using a large corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns, and discuss lessons learned during design and implementation.

  5. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.

    PubMed

    Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt

    2009-05-30

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18 bits of A/D resolution, are capable of resolving extracellular voltages spanning single-neuron action potentials, high-frequency oscillations, and high-amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclic redundancy checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
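
    A quick back-of-envelope check of the quoted figure of 3 terabytes per day, assuming all 320 channels are sampled continuously at 32 kHz with 4 bytes per sample:

      # Data-rate arithmetic for the recording setup described above.
      channels, rate_hz, bytes_per_sample = 320, 32_000, 4
      bytes_per_day = channels * rate_hz * bytes_per_sample * 86_400
      print(bytes_per_day / 1e12)   # ~3.54 TB/day (decimal terabytes)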

  6. Large-scale Electrophysiology: Acquisition, Compression, Encryption, and Storage of Big Data

    PubMed Central

    Brinkmann, Benjamin H.; Bower, Mark R.; Stengel, Keith A.; Worrell, Gregory A.; Stead, Matt

    2009-01-01

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18 bits of A/D resolution, are capable of resolving extracellular voltages spanning single-neuron action potentials, high-frequency oscillations, and high-amplitude ultraslow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclic redundancy checksum to ensure data integrity, and 128-bit encryption for protection of patient information. PMID:19427545

  7. The Human Service Scale: A New Measure for Evaluation

    ERIC Educational Resources Information Center

    Reagles, Kenneth W.; Butler, Alfred S.

    1976-01-01

    The Human Service Scale is an assessment instrument for measuring the progress of the rehabilitation client and the effectiveness of rehabilitation programs. The theory behind the scale is based on Maslow's hierarchy of human needs. The development and some potential uses of the scale are discussed. (EC)

  8. Social scaling of extrapersonal space: target objects are judged as closer when the reference frame is a human agent with available movement potentialities.

    PubMed

    Fini, C; Brass, M; Committeri, G

    2015-01-01

    To rule out that the effect was simply due to a line-of-sight mechanism (visual perspective taking), we compared the human agent free to move with the same agent tied to a pole with a rope, thus reducing movement potentialities while maintaining equal visual accessibility. The "Near space extension" disappeared when this manipulation was introduced, showing that movement potentialities are the relevant factor for such an effect. Our results demonstrate for the first time that during allocentric distance judgments within extrapersonal space, we implicitly process the movement potentialities of the reference frame (RF). A target object is perceived as being closer when the allocentric RF is a human with available movement potentialities, suggesting a mechanism of social scaling of extrapersonal space processing. Copyright © 2014. Published by Elsevier B.V.

  9. Advances in Parallelization for Large Scale Oct-Tree Mesh Generation

    NASA Technical Reports Server (NTRS)

    O'Connell, Matthew; Karman, Steve L.

    2015-01-01

    Despite great advancements in the parallelization of numerical simulation codes over the last 20 years, it is still common to perform grid generation in serial. Generating large scale grids in serial often requires using special "grid generation" compute machines that can have more than ten times the memory of average machines. While some parallel mesh generation techniques have been proposed, generating very large meshes for LES or aeroacoustic simulations is still a challenging problem. An automated method for the parallel generation of very large scale off-body hierarchical meshes is presented here. This work enables large scale parallel generation of off-body meshes by using a novel combination of parallel grid generation techniques and a hybrid "top down" and "bottom up" oct-tree method. Meshes are generated using hardware commonly found in parallel compute clusters. The capability to generate very large meshes is demonstrated by the generation of off-body meshes surrounding complex aerospace geometries. Results are shown including a one billion cell mesh generated around a Predator Unmanned Aerial Vehicle geometry, which was generated on 64 processors in under 45 minutes.
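
    A minimal serial sketch of the top-down refinement half of such an oct-tree method; the geometry-intersection predicate is a toy stand-in, and the parallel distribution and bottom-up pass described in the record are omitted:

      # Recursively split any cell that intersects the geometry until the
      # target cell size is reached; keep coarse cells in the off-body region.
      def refine(origin, size, intersects, min_size, cells):
          if not intersects(origin, size) or size <= min_size:
              cells.append((origin, size))
              return
          h = size / 2.0
          for dx in (0, h):
              for dy in (0, h):
                  for dz in (0, h):
                      child = (origin[0] + dx, origin[1] + dy, origin[2] + dz)
                      refine(child, h, intersects, min_size, cells)

      cells = []
      # conservative test against a sphere of radius 0.3 centred at (0.5, 0.5, 0.5)
      sphere = lambda o, s: sum((c + s / 2 - 0.5) ** 2 for c in o) ** 0.5 < 0.3 + s
      refine((0.0, 0.0, 0.0), 1.0, sphere, min_size=0.125, cells=cells)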

  10. Large-scale comparative metagenomics of Blastocystis, a common member of the human gut microbiome

    PubMed Central

    Beghini, Francesco; Pasolli, Edoardo; Truong, Tin Duy; Putignani, Lorenza; Cacciò, Simone M; Segata, Nicola

    2017-01-01

    The influence of unicellular eukaryotic microorganisms on human gut health and disease is still largely unexplored. Blastocystis spp. commonly colonize the gut, but their clinical significance and ecological role are currently unsettled. We have developed a high-sensitivity bioinformatic pipeline to detect Blastocystis subtypes (STs) from shotgun metagenomics, and applied it to 12 large data sets, comprising 1689 subjects of different geographic origin, disease status and lifestyle. We confirmed and extended previous observations on the high prevalence of the microorganism in the population (14.9%), its non-random and ST-specific distribution, and its ability to cause persistent (asymptomatic) colonization. These findings, along with the higher prevalence observed in non-westernized individuals, the lack of positive association with any of the diseases considered, and decreased presence in individuals with dysbiosis associated with colorectal cancer and Crohn's disease, strongly suggest that Blastocystis is a component of the healthy gut microbiome. Further, we found an inverse association between body mass index and Blastocystis, and strong co-occurrence with archaeal organisms (Methanobrevibacter smithii) and several bacterial species. The association of specific microbial community structures with Blastocystis was confirmed by the high predictability (up to 0.91 area under the curve) of colonization by the microorganism based on the species-level composition of the microbiome. Finally, we reconstructed and functionally profiled 43 new draft Blastocystis genomes and discovered a higher intra-subtype variability of ST1 and ST2 compared with ST3 and ST4. Altogether, we provide an in-depth epidemiologic, ecological, and genomic analysis of Blastocystis, and show how metagenomics can be crucial to advance population genomics of human parasites. PMID:28837129

  11. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole-genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by Amazon Web Services. This time includes the import and export of the data using the Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics of data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box.

  12. Investigating a link between large and small-scale chaos features on Europa

    NASA Astrophysics Data System (ADS)

    Tognetti, L.; Rhoden, A.; Nelson, D. M.

    2017-12-01

    Chaos is one of the most recognizable, and most studied, features of Europa's surface. Most models of chaos formation invoke liquid water at shallow depths within the ice shell; the liquid destabilizes the overlying ice layer, breaking it into mobile rafts and destroying pre-existing terrain. This class of model has been applied to both large-scale chaos like Conamara and small-scale features (i.e. microchaos), which are typically <10 km in diameter. Currently unknown, however, is whether large-scale and small-scale features are produced together, e.g. through a network of smaller sills linked to a larger liquid water pocket. If microchaos features do form as satellites of large-scale chaos features, we would expect a drop-off in the number density of microchaos with increasing distance from the large chaos feature; this trend should not be observed in regions without large-scale chaos features. Here, we test the hypothesis that large chaos features create "satellite" systems of smaller chaos features. Either outcome will help us better understand the relationship between large-scale chaos and microchaos. We focus first on the regions surrounding the large chaos features Conamara and Murias (e.g. the Mitten). We map all chaos features within 90,000 sq km of the main chaos feature and assign each one a ranking (High Confidence, Probable, or Low Confidence) based on its observed characteristics. In particular, we look for a distinct boundary, loss of pre-existing terrain, the existence of rafts or blocks, and the overall smoothness of the feature. We also note features that are chaos-like but lack sufficient characteristics to be classified as chaos. We then apply the same criteria to map microchaos features in regions of similar area (~90,000 sq km) that lack large chaos features. By plotting the distribution of microchaos with distance from the center point of the large chaos feature or of the mapping region (for the cases without a large feature), we can test for the predicted drop-off in microchaos number density.

  13. Soft X-ray Emission from Large-Scale Galactic Outflows in Seyfert Galaxies

    NASA Astrophysics Data System (ADS)

    Colbert, E. J. M.; Baum, S.; O'Dea, C.; Veilleux, S.

    1998-01-01

    Kiloparsec-scale soft X-ray nebulae extend along the galaxy minor axes in several Seyfert galaxies, including NGC 2992, NGC 4388 and NGC 5506. In these three galaxies, the extended X-ray emission observed in ROSAT HRI images has 0.2-2.4 keV X-ray luminosities of 0.4-3.5 × 10^40 erg s^-1. The X-ray nebulae are roughly co-spatial with the large-scale radio emission, suggesting that both are produced by large-scale galactic outflows. Assuming pressure balance between the radio and X-ray plasmas, the X-ray filling factor is ≳10^4 times as large as the radio plasma filling factor, suggesting that large-scale outflows in Seyfert galaxies are predominantly winds of thermal X-ray emitting gas. We favor an interpretation in which large-scale outflows originate as AGN-driven jets that entrain and heat gas on kpc scales as they make their way out of the galaxy. AGN- and starburst-driven winds are also possible explanations if the winds are oriented along the rotation axis of the galaxy disk. Since large-scale outflows are present in at least 50 percent of Seyfert galaxies, the soft X-ray emission from the outflowing gas may, in many cases, explain the "soft excess" X-ray feature observed below 2 keV in X-ray spectra of many Seyfert 2 galaxies.

  14. CRISPR-Cas systems target a diverse collection of invasive mobile genetic elements in human microbiomes

    PubMed Central

    2013-01-01

    Background Bacteria and archaea develop immunity against invading genomes by incorporating pieces of the invaders' sequences, called spacers, into a clustered regularly interspaced short palindromic repeats (CRISPR) locus between repeats, forming arrays of repeat-spacer units. When spacers are expressed, they direct CRISPR-associated (Cas) proteins to silence complementary invading DNA. In order to characterize the invaders of human microbiomes, we use spacers from CRISPR arrays that we had previously assembled from shotgun metagenomic datasets, and identify contigs that contain these spacers' targets. Results We discover 95,000 contigs that are putative invasive mobile genetic elements, some targeted by hundreds of CRISPR spacers. We find that oral sites in healthy human populations have a much greater variety of mobile genetic elements than stool samples. Mobile genetic elements carry genes encoding diverse functions: only 7% of the mobile genetic elements are similar to known phages or plasmids, although a much greater proportion contain phage- or plasmid-related genes. A small number of contigs share similarity with known integrative and conjugative elements, providing the first examples of CRISPR defenses against this class of element. We provide detailed analyses of a few large mobile genetic elements of various types, and a relative abundance analysis of mobile genetic elements and putative hosts, exploring the dynamic activities of mobile genetic elements in human microbiomes. A joint analysis of mobile genetic elements and CRISPRs shows that protospacer-adjacent motifs drive their interaction network; however, some CRISPR-Cas systems target mobile genetic elements lacking motifs. Conclusions We identify a large collection of invasive mobile genetic elements in human microbiomes, an important resource for further study of the interaction between the CRISPR-Cas immune system and invaders. PMID:23628424

  15. High-Throughput Microbore UPLC-MS Metabolic Phenotyping of Urine for Large-Scale Epidemiology Studies.

    PubMed

    Gray, Nicola; Lewis, Matthew R; Plumb, Robert S; Wilson, Ian D; Nicholson, Jeremy K

    2015-06-05

    A new generation of metabolic phenotyping centers is being created to meet the increasing demands of personalized healthcare, and this has resulted in a major requirement for economical, high-throughput metabonomic analysis by liquid chromatography-mass spectrometry (LC-MS). Meeting these new demands represents an emerging bioanalytical problem that must be solved if metabolic phenotyping is to be successfully applied to large clinical and epidemiological sample sets. Ultraperformance (UP)LC-MS-based metabolic phenotyping, based on 2.1 mm i.d. LC columns, enables comprehensive metabolic phenotyping but, when employed for the analysis of thousands of samples, results in high solvent usage. Using UPLC-MS with 1 mm i.d. columns rather than the conventional 2.1 mm i.d. methodology shows that the resulting optimized microbore method provides equivalent or superior performance in terms of peak capacity, sensitivity, and robustness. On average, we also observed, when using the microbore-scale separation, an increase in response of 2- to 3-fold over that obtained with the standard 2.1 mm scale method. When applied to the analysis of human urine, the 1 mm scale method showed no decline in performance over the course of 1000 analyses, illustrating that microbore UPLC-MS represents a viable alternative to conventional 2.1 mm i.d. formats for routine large-scale metabolic profiling studies, while also resulting in a 75% reduction in solvent usage. The modest increase in sensitivity provided by this methodology also offers the potential to either reduce sample consumption or increase the number of metabolite features detected with confidence due to the increased signal-to-noise ratios obtained. Implementation of this miniaturized UPLC-MS method of metabolic phenotyping results in clear analytical, economic, and environmental benefits for large-scale metabolic profiling studies with similar or improved analytical performance compared to

  16. A general path for large-scale solubilization of cellular proteins: From membrane receptors to multiprotein complexes

    PubMed Central

    Pullara, Filippo; Guerrero-Santoro, Jennifer; Calero, Monica; Zhang, Qiangmin; Peng, Ye; Spåhr, Henrik; Kornberg, Guy L.; Cusimano, Antonella; Stevenson, Hilary P.; Santamaria-Suarez, Hugo; Reynolds, Shelley L.; Brown, Ian S.; Monga, Satdarshan P.S.; Van Houten, Bennett; Rapić-Otrin, Vesna; Calero, Guillermo; Levine, Arthur S.

    2014-01-01

    Expression of recombinant proteins in bacterial or eukaryotic systems often results in aggregation, rendering them unavailable for biochemical or structural studies. Protein aggregation is a costly problem for biomedical research: it forces research laboratories and the biomedical industry to search for alternative, more soluble, non-human proteins, and it limits the number of potential "druggable" targets. In this study we present a highly reproducible protocol that introduces the systematic use of an extensive number of detergents to solubilize aggregated proteins expressed in bacterial and eukaryotic systems. We validate the usefulness of this protocol by solubilizing traditionally difficult human protein targets to milligram quantities and confirming their biological activity. We use this method to solubilize monomeric or multimeric components of multi-protein complexes and demonstrate its efficacy to reconstitute large cellular machines. This protocol works equally well on cytosolic, nuclear and membrane proteins and can be easily adapted to a high-throughput format. PMID:23137940

  17. Extreme reaction times determine fluctuation scaling in human color vision

    NASA Astrophysics Data System (ADS)

    Medina, José M.; Díaz, José A.

    2016-11-01

    In modern mental chronometry, human reaction time is defined as the time elapsed from stimulus presentation until a response occurs, and it represents a reference paradigm for investigating stochastic latency mechanisms in color vision. Here we examine the statistical properties of extreme reaction times and whether they support fluctuation scaling in the skewness-kurtosis plane. Reaction times were measured for visual stimuli across the cardinal directions of color space. For all subjects, the results show that very large reaction times deviate from the right tail of the reaction time distributions, suggesting the existence of dragon-king events. The results also indicate that extreme reaction times are correlated and shape fluctuation scaling over a wide range of stimulus conditions. The scaling exponent was higher for achromatic than for isoluminant stimuli, suggesting distinct generative mechanisms. Our findings open a new perspective for studying failure modes in sensory-motor communications and in complex networks.

  18. Global Sensitivity Analysis for Large-scale Socio-hydrological Models using the Cloud

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Garcia-Cabrejo, O.; Cai, X.; Valocchi, A. J.; Dupont, B.

    2014-12-01

    In the context of coupled human and natural systems (CHNS), incorporating human factors into water resource management provides us with the opportunity to understand the interactions between human and environmental systems. A multi-agent system (MAS) model is designed to couple with the physically-based Republican River Compact Administration (RRCA) groundwater model, in an attempt to understand the declining water table and base flow in the heavily irrigated Republican River basin. For the MAS modelling, we defined five behavioral parameters (κ_pr, ν_pr, κ_prep, ν_prep and λ) to characterize the agents' pumping behavior given the uncertainties of future crop prices and precipitation. κ and ν describe the agents' beliefs in their prior knowledge of the mean and variance of crop prices (κ_pr, ν_pr) and precipitation (κ_prep, ν_prep), and λ describes the agents' attitude towards the fluctuation of crop profits. Note that these human behavioral parameters, as inputs to the MAS model, are highly uncertain and not even directly measurable. We therefore estimate the influence of these behavioral parameters on the coupled models using Global Sensitivity Analysis (GSA). In this paper, we address the two main challenges arising from GSA with such a large-scale socio-hydrological model by using Hadoop-based cloud computing techniques and a Polynomial Chaos Expansion (PCE) based variance decomposition approach. As a result, 1,000 scenarios of the coupled models are completed within two hours with the Hadoop framework, rather than the roughly 28 days it would take to run those scenarios sequentially. Based on the model results, GSA using PCE is able to measure the impacts of the spatial and temporal variations of these behavioral parameters on crop profits and the water table, and thus identifies two influential parameters, κ_pr and λ. The major contribution of this work is a methodological framework for the application of GSA in large-scale socio-hydrological models. This framework attempts to
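
    For illustration, first-order Sobol indices can also be estimated by plain Monte Carlo pick-freeze sampling; the sketch below stands in for the PCE-based variance decomposition used in the study, and the toy model and uniform unit-cube inputs are assumptions:

      # Saltelli-style pick-freeze estimator of first-order Sobol indices.
      import numpy as np

      def sobol_first_order(model, dim, n=10_000, seed=1):
          rng = np.random.default_rng(seed)
          A, B = rng.random((n, dim)), rng.random((n, dim))
          fA, fB = model(A), model(B)
          var = np.var(np.concatenate([fA, fB]))
          S = np.empty(dim)
          for i in range(dim):
              ABi = A.copy()
              ABi[:, i] = B[:, i]          # freeze all coordinates except i
              S[i] = np.mean(fB * (model(ABi) - fA)) / var
          return S

      model = lambda X: X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]   # toy model
      print(sobol_first_order(model, dim=3))   # first input dominates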

  19. Analysis of discriminants for experimental 3D SAR imagery of human targets

    NASA Astrophysics Data System (ADS)

    Chan, Brigitte; Sévigny, Pascale; DiFilippo, David D. J.

    2014-10-01

    Development of a prototype 3-D through-wall synthetic aperture radar (SAR) system is currently underway at Defence Research and Development Canada. The intent is to map out building wall layouts and to detect and locate targets of interest behind walls, such as humans, arms caches, and furniture. This situational awareness capability can be invaluable to the military working in an urban environment. Tools and algorithms are being developed to exploit the resulting 3-D imagery. Current work involves analyzing signatures of targets behind a wall and understanding the clutter and multipath signals in a room of interest. In this paper, a comprehensive study of 3-D human target signature metrics in free space is presented. The aim is to identify features for discriminating the human target from other targets. Targets used in this investigation include a human standing, a human standing with arms stretched out, a chair, a table, and a metallic plate. Several features were investigated as potential discriminants, and five that were identified as good candidates are presented in this paper. Based on this study, no single feature could be used to fully discriminate the human targets from all others; a combination of at least two different features is required.

  20. Some aspects of control of a large-scale dynamic system

    NASA Technical Reports Server (NTRS)

    Aoki, M.

    1975-01-01

    Techniques of predicting and/or controlling the dynamic behavior of large scale systems are discussed in terms of decentralized decision making. Topics discussed include: (1) control of large scale systems by dynamic team with delayed information sharing; (2) dynamic resource allocation problems by a team (hierarchical structure with a coordinator); and (3) some problems related to the construction of a model of reduced dimension.

  1. Large-scale magnetic topologies of early M dwarfs

    NASA Astrophysics Data System (ADS)

    Donati, J.-F.; Morin, J.; Petit, P.; Delfosse, X.; Forveille, T.; Aurière, M.; Cabanac, R.; Dintrans, B.; Fares, R.; Gastine, T.; Jardine, M. M.; Lignières, F.; Paletou, F.; Ramirez Velez, J. C.; Théado, S.

    2008-10-01

    We present here additional results of a spectropolarimetric survey of a small sample of stars ranging from spectral type M0 to M8, aimed at investigating observationally how dynamo processes operate in stars on both sides of the full convection threshold (spectral type M4). The present paper focuses on early M stars (M0-M3), that is, above the full convection threshold. Applying tomographic imaging techniques to time series of rotationally modulated circularly polarized profiles collected with the NARVAL spectropolarimeter, we determine the rotation period and reconstruct the large-scale magnetic topologies of six early M dwarfs. We find that early-M stars preferentially host large-scale fields with dominantly toroidal and non-axisymmetric poloidal configurations, along with significant differential rotation (and long-term variability); only the lowest-mass star of our subsample is found to host an almost fully poloidal, mainly axisymmetric large-scale field resembling those found in mid-M dwarfs. This abrupt change in the large-scale magnetic topologies of M dwarfs (occurring at spectral type M3) has no related signature in X-ray luminosities (measuring the total amount of magnetic flux); it thus suggests that the underlying dynamo processes become more efficient at producing large-scale fields (despite producing the same flux) at spectral types later than M3. We suspect that this change relates to the rapid decrease in the radiative cores of low-mass stars and to the simultaneous sharp increase of the convective turnover times (with decreasing stellar mass) that models predict to occur at M3; it may also be (at least partly) responsible for the reduced magnetic braking reported for fully convective stars. Based on observations obtained at the Télescope Bernard Lyot (TBL), operated by the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France.

  2. Robust large-scale parallel nonlinear solvers for simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using models other than Newton's method: a lower-order model, Broyden's method, and a higher-order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian, or that have an inaccurate Jacobian, to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step from a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple
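
    A minimal dense-matrix sketch of Broyden's "good" rank-1 secant update, the idea behind the limited-memory variant developed in this work (the limited-memory, large-scale machinery itself is not shown; the test system is a toy assumption):

      import numpy as np

      def fd_jacobian(f, x, eps=1e-7):
          """One-time finite-difference Jacobian used to start the iteration."""
          fx = f(x)
          J = np.empty((len(fx), len(x)))
          for j in range(len(x)):
              xp = x.copy()
              xp[j] += eps
              J[:, j] = (f(xp) - fx) / eps
          return J

      def broyden(f, x0, tol=1e-10, max_iter=100):
          x = np.asarray(x0, dtype=float)
          fx = f(x)
          J = fd_jacobian(f, x)
          for _ in range(max_iter):
              if np.linalg.norm(fx) < tol:
                  break
              dx = np.linalg.solve(J, -fx)       # quasi-Newton step
              x = x + dx
              f_new = f(x)
              J += np.outer(f_new - fx - J @ dx, dx) / (dx @ dx)  # secant update
              fx = f_new
          return x

      # toy system: x0^2 + x1 = 1 and x0 = x1, root near (0.618, 0.618)
      root = broyden(lambda x: np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1]]),
                     [1.0, 1.0])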

  3. Rotation invariant fast features for large-scale recognition

    NASA Astrophysics Data System (ADS)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF [1] while producing large-scale retrieval results that are comparable to SIFT [2]. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  4. Scale-dependent climatic drivers of human epidemics in ancient China.

    PubMed

    Tian, Huidong; Yan, Chuan; Xu, Lei; Büntgen, Ulf; Stenseth, Nils C; Zhang, Zhibin

    2017-12-05

    A wide range of climate change-induced effects have been implicated in the prevalence of infectious diseases. Disentangling causes and consequences, however, remains particularly challenging at historical time scales, for which the quality and quantity of most of the available natural proxy archives and written documentary sources often decline. Here, we reconstruct the spatiotemporal occurrence patterns of human epidemics for large parts of China and most of the last two millennia. Cold and dry climate conditions indirectly increased the prevalence of epidemics through the influences of locusts and famines. Our results further reveal that low-frequency, long-term temperature trends mainly contributed to negative associations with epidemics, while positive associations of epidemics with droughts, floods, locusts, and famines mainly coincided with both higher and lower frequency temperature variations. Nevertheless, unstable relationships between human epidemics and temperature changes were observed on relatively smaller time scales. Our study suggests that an intertwined, direct, and indirect array of biological, ecological, and societal responses to different aspects of past climatic changes strongly depended on the frequency domain and study period chosen.

  5. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    NASA Astrophysics Data System (ADS)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  6. Large-scale fabrication of single crystalline tin nanowire arrays

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Yang, Dachi; Liang, Minghui; Zhi, Linjie

    2010-09-01

    Large-scale single crystalline tin nanowire arrays with preferred lattice orientation along the [100] direction were fabricated in porous anodic aluminium oxide (AAO) membranes by electrodeposition, using copper nanorods as a second electrode.

  7. Large-scale Activities Associated with the 2005 Sep. 7th Event

    NASA Astrophysics Data System (ADS)

    Zong, Weiguo

    We present a multi-wavelength study of large-scale activities associated with a significant solar event. On 2005 September 7, a flare classified as larger than X17 was observed. Combining Hα 6562.8 Å, He I 10830 Å, and soft X-ray observations, three large-scale activities were found to propagate over a long distance on the solar surface. 1) The first large-scale activity emanated from the flare site, propagated westward around the solar equator, and appeared as sequential brightenings. With the MDI longitudinal magnetic field map, the activity was found to propagate along the magnetic network. 2) The second large-scale activity could be well identified in both He I 10830 Å images and soft X-ray images, and appeared as a diffuse emission enhancement propagating away. The activity started later than the first one and was not centered on the flare site. Moreover, a rotation was found along with the bright front propagating away. 3) The third activity was ahead of the second one and was identified as a "winking" filament. The three activities have different origins, and they were seldom observed within one event; this study is therefore useful for understanding the mechanism of large-scale activities on the solar surface.

  8. Mantis: A Fast, Small, and Exact Large-Scale Sequence-Search Index.

    PubMed

    Pandey, Prashant; Almodaresi, Fatemeh; Bender, Michael A; Ferdman, Michael; Johnson, Rob; Patro, Rob

    2018-06-18

    Sequence-level searches on large collections of RNA sequencing experiments, such as the NCBI Sequence Read Archive (SRA), would enable one to ask many questions about the expression or variation of a given transcript in a population. Existing approaches, such as the sequence Bloom tree, suffer from fundamental limitations of the Bloom filter, resulting in slow build and query times, less-than-optimal space usage, and potentially large numbers of false-positives. This paper introduces Mantis, a space-efficient system that uses new data structures to index thousands of raw-read experiments and facilitates large-scale sequence searches. In our evaluation, index construction with Mantis is 6× faster and yields a 20% smaller index than the state-of-the-art split sequence Bloom tree (SSBT). For queries, Mantis is 6-108× faster than SSBT and has no false-positives or -negatives. For example, Mantis was able to search for all 200,400 known human transcripts in an index of 2,652 RNA sequencing experiments in 82 min; SSBT took close to 4 days. Copyright © 2018 Elsevier Inc. All rights reserved.
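
    A conceptual toy of the k-mer-membership query model that systems like Mantis support; Mantis itself is built on a counting quotient filter rather than the hash map used here, and the coverage threshold theta is an assumption:

      # Map each k-mer to the experiments containing it, then report the
      # experiments that cover enough of a query transcript's k-mers.
      from collections import defaultdict

      K = 5

      def kmers(seq):
          return {seq[i:i + K] for i in range(len(seq) - K + 1)}

      def build_index(experiments):
          index = defaultdict(set)
          for name, reads in experiments.items():
              for read in reads:
                  for km in kmers(read):
                      index[km].add(name)
          return index

      def query(index, transcript, theta=0.8):
          qk = kmers(transcript)
          hits = defaultdict(int)
          for km in qk:
              for name in index.get(km, ()):
                  hits[name] += 1
          return [n for n, c in hits.items() if c >= theta * len(qk)]

      index = build_index({"exp1": ["ACGTACGTAC"], "exp2": ["TTTTGGGGCC"]})
      print(query(index, "ACGTACGT"))   # -> ['exp1']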

  9. ProteinInferencer: Confident protein identification and multiple experiment comparison for large scale proteomics projects.

    PubMed

    Zhang, Yaoyang; Xu, Tao; Shan, Bing; Hart, Jonathan; Aslanian, Aaron; Han, Xuemei; Zong, Nobel; Li, Haomin; Choi, Howard; Wang, Dong; Acharya, Lipi; Du, Lisa; Vogt, Peter K; Ping, Peipei; Yates, John R

    2015-11-03

    Shotgun proteomics generates valuable information from large-scale and targeted protein characterizations, including protein expression, protein quantification, protein post-translational modifications (PTMs), protein localization, and protein-protein interactions. Typically, peptides derived from proteolytic digestion, rather than intact proteins, are analyzed by mass spectrometers because peptides are more readily separated, ionized and fragmented. The amino acid sequences of peptides can be interpreted by matching the observed tandem mass spectra to theoretical spectra derived from a protein sequence database. Identified peptides serve as surrogates for their proteins and are often used to establish what proteins were present in the original mixture and to quantify protein abundance. Two major issues exist in assigning peptides to their originating protein. The first is maintaining a desired false discovery rate (FDR) when comparing or combining multiple large datasets generated by shotgun analysis, and the second is properly assigning peptides to proteins when homologous proteins are present in the database. Herein we demonstrate a new computational tool, ProteinInferencer, which can be used for protein inference with both small- and large-scale data sets to produce a well-controlled protein FDR. In addition, ProteinInferencer introduces confidence scoring for individual proteins, which makes protein identifications evaluable. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015. Published by Elsevier B.V.

  10. Potential large animal models for gene therapy of human genetic diseases of immune and blood cell systems.

    PubMed

    Bauer, Thomas R; Adler, Rima L; Hickstein, Dennis D

    2009-01-01

    Genetic mutations involving the cellular components of the hematopoietic system--red blood cells, white blood cells, and platelets--manifest clinically as anemia, infection, and bleeding. Although gene targeting has recapitulated many of these diseases in mice, these murine homologues are limited as translational models by their small size and brief life span as well as the fact that mutations induced by gene targeting do not always faithfully reflect the clinical manifestations of such mutations in humans. Many of these limitations can be overcome by identifying large animals with genetic diseases of the hematopoietic system corresponding to their human disease counterparts. In this article, we describe human diseases of the cellular components of the hematopoietic system that have counterparts in large animal species, in most cases carrying mutations in the same gene (CD18 in leukocyte adhesion deficiency) or genes in interacting proteins (DNA cross-link repair 1C protein and protein kinase, DNA-activated catalytic polypeptide in radiation-sensitive severe combined immunodeficiency). Furthermore, we describe the potential of these animal models to serve as disease-specific preclinical models for testing the efficacy and safety of clinical interventions such as hematopoietic stem cell transplantation or gene therapy before their use in humans with the corresponding disease.

  11. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

    Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on this data, leaving the ultimate value of these projects far below its potential. A prominent reason why published proteomics data are seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several obvious patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies used in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as latent semantic analysis holds great promise and is currently underexploited.
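
    The core of latent semantic analysis is a truncated SVD of a count matrix; the sketch below uses a random protein-by-experiment matrix as a stand-in for the HUPO PPP identifications, with the rank k chosen arbitrarily:

      import numpy as np

      counts = np.random.default_rng(2).poisson(1.0, size=(200, 12)).astype(float)
      U, s, Vt = np.linalg.svd(counts, full_matrices=False)
      k = 3                                     # latent dimensions to keep
      doc_space = (np.diag(s[:k]) @ Vt[:k]).T   # experiments in latent space

      def cosine(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

      sim = cosine(doc_space[0], doc_space[1])  # similarity of experiments 0 and 1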

  12. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products against one another, along with ground observations, provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  13. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    ERIC Educational Resources Information Center

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  14. Large scale structure in universes dominated by cold dark matter

    NASA Technical Reports Server (NTRS)

    Bond, J. Richard

    1986-01-01

    The theory of Gaussian random density field peaks is applied to a numerical study of the large-scale structure developing from adiabatic fluctuations in models of biased galaxy formation in universes with Omega = 1, h = 0.5 dominated by cold dark matter (CDM). The angular anisotropy of the cross-correlation function demonstrates that the far-field regions of cluster-scale peaks are asymmetric, as recent observations indicate. These regions will generate pancakes or filaments upon collapse. One-dimensional singularities in the large-scale bulk flow should arise in these CDM models, appearing as pancakes in position space. They are too rare to explain the CfA bubble walls, but pancakes that are just turning around now are sufficiently abundant and would appear to be thin walls normal to the line of sight in redshift space. Large scale streaming velocities are significantly smaller than recent observations indicate. To explain the reported 700 km/s coherent motions, mass must be significantly more clustered than galaxies with a biasing factor of less than 0.4 and a nonlinear redshift at cluster scales greater than one for both massive neutrino and cold models.

  15. Large-scale derived flood frequency analysis based on continuous simulation

    NASA Astrophysics Data System (ADS)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input to catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observations at 528 stations covering not only the whole of Germany but also parts of France, Switzerland, the Czech Republic and Austria, for an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Danube and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
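
    A toy version of the derived flood frequency step: fit an extreme-value (Gumbel) distribution to a long annual-maximum discharge series and read off the T-year quantile; the synthetic series below stands in for the weather-generator/SWIM output:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      annual_max = stats.gumbel_r.rvs(loc=500, scale=120, size=10_000,
                                      random_state=rng)   # synthetic annual maxima

      loc, scale = stats.gumbel_r.fit(annual_max)
      q100 = stats.gumbel_r.ppf(1 - 1 / 100, loc, scale)  # 100-year flood estimate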

  16. Identification of human-to-human transmissibility factors in PB2 proteins of influenza A by large-scale mutual information analysis

    PubMed Central

    Miotto, Olivo; Heiny, AT; Tan, Tin Wee; August, J Thomas; Brusic, Vladimir

    2008-01-01

    Background The identification of mutations that confer unique properties to a pathogen, such as host range, is of fundamental importance in the fight against disease. This paper describes a novel method for identifying amino acid sites that distinguish specific sets of protein sequences, by comparative analysis of matched alignments. The use of mutual information to identify distinctive residues responsible for functional variants makes this approach highly suitable for analyzing large sets of sequences. To support mutual information analysis, we developed the AVANA software, which utilizes sequence annotations to select sets for comparison, according to user-specified criteria. The method presented was applied to an analysis of influenza A PB2 protein sequences, with the objective of identifying the components of adaptation to human-to-human transmission, and reconstructing the mutation history of these components. Results We compared over 3,000 PB2 protein sequences of human-transmissible and avian isolates, to produce a catalogue of sites involved in adaptation to human-to-human transmission. This analysis identified 17 characteristic sites, five of which have been present in human-transmissible strains since the 1918 Spanish flu pandemic. Sixteen of these sites are located in functional domains, suggesting they may play functional roles in host-range specificity. The catalogue of characteristic sites was used to derive sequence signatures from historical isolates. These signatures, arranged in chronological order, reveal an evolutionary timeline for the adaptation of the PB2 protein to human hosts. Conclusion By providing the most complete elucidation to date of the functional components participating in PB2 protein adaptation to humans, this study demonstrates that mutual information is a powerful tool for comparative characterization of sequence sets. In addition to confirming previously reported findings, several novel characteristic sites within PB2 are
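
    The core quantity behind such a catalogue of characteristic sites is the mutual information between an alignment column and a class label, sketched here with toy inputs rather than the AVANA implementation:

      import math
      from collections import Counter

      def column_mi(column, labels):
          """Mutual information (bits) between residues and class labels."""
          n = len(column)
          p_aa, p_lab = Counter(column), Counter(labels)
          mi = 0.0
          for (aa, lab), c in Counter(zip(column, labels)).items():
              pxy = c / n
              mi += pxy * math.log2(pxy / ((p_aa[aa] / n) * (p_lab[lab] / n)))
          return mi

      col = list("KKKKEEEE")                    # residues at one toy position
      hosts = ["human"] * 4 + ["avian"] * 4
      print(column_mi(col, hosts))              # -> 1.0 bit, fully discriminative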

  17. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities

    PubMed Central

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlights a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among brain networks

  18. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities.

    PubMed

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlights a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among brain networks

  19. Links between large-scale circulation patterns and streamflow in Central Europe: A review

    NASA Astrophysics Data System (ADS)

    Steirou, Eva; Gerlitz, Lars; Apel, Heiko; Merz, Bruno

    2017-06-01

    We disentangle the relationships between streamflow and large-scale atmospheric circulation in Central Europe (CE), an area affected by climatic influences of different origins (Atlantic, Mediterranean, and Continental) and characterized by diverse topography and flow regimes. Our literature review examines in detail the links between mean, high, and low flows in CE and large-scale circulation patterns, with a focus on two closely related phenomena, the North Atlantic Oscillation (NAO) and the Western-zonal circulation (WC). For both patterns, significant relationships, consistent across different studies, are found for large parts of CE. The strongest links are found for the winter season, forming a dipole-like pattern with positive relationships between both indices and streamflow north of the Alps and the Carpathians, and negative relationships for the NAO in the south. An influence of the winter NAO is also detected in the amplitude and timing of snowmelt flows later in the year. Discharge in CE has further been linked to other large-scale climatic modes, such as the Scandinavia pattern (SCA), the East Atlantic/West Russian pattern (EA/WR), and the El Niño-Southern Oscillation (ENSO), and to synoptic weather patterns such as the Vb weather regime. Different mechanisms suggested in the literature to modulate the links between streamflow and the NAO are combined with the topographical characteristics of the target area in order to explain the divergent NAO/WC influence on streamflow in different parts of CE. In particular, a precipitation mechanism seems to regulate winter flows in North-Western Germany, an area with a short duration of snow cover and with rainfall-generated floods. The precipitation mechanism is also likely at work in Southern CE, where correlations between the NAO and temperature are low. Finally, in the rest of the study area (Northern CE, Alpine region), a joint precipitation-snow mechanism influences floods not only in winter but also in the spring/snowmelt period, providing
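
    To reproduce the flavor of the index-streamflow correlations this review surveys, the sketch below correlates a winter (DJF) NAO index with mean winter streamflow at a single gauge. Both series are synthetic stand-ins with an assumed positive coupling; no data from the reviewed studies are used.

```python
# Illustrative sketch: correlating a winter (DJF) NAO index with mean winter
# streamflow at one gauge. Both arrays are synthetic stand-ins.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(42)
years = np.arange(1961, 2011)
nao_djf = rng.standard_normal(years.size)  # synthetic winter NAO index
# a hypothetical gauge north of the Alps: flow positively coupled to the NAO
flow_djf = 50.0 + 12.0 * nao_djf + rng.normal(0.0, 8.0, years.size)

r, p_r = pearsonr(nao_djf, flow_djf)
rho, p_rho = spearmanr(nao_djf, flow_djf)
print(f"Pearson r = {r:.2f} (p = {p_r:.1e})")
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.1e})")
```

    Flipping the sign of the coupling coefficient would mimic the negative NAO relationships reported for the southern, Mediterranean-influenced catchments.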

  20. Large-scale Cortical Network Properties Predict Future Sound-to-Word Learning Success

    PubMed Central

    Sheppard, John Patrick; Wang, Ji-Ping; Wong, Patrick C. M.

    2013-01-01

    The human brain possesses a remarkable capacity to interpret and recall novel sounds as spoken language. These linguistic abilities arise from complex processing spanning a widely distributed cortical network and are characterized by marked individual variation. Recently, graph theoretical analysis has facilitated the exploration of how such aspects of large-scale brain functional organization may underlie cognitive performance. Brain functional networks are known to possess small-world topologies characterized by efficient global and local information transfer, but whether these properties relate to language learning abilities remains unknown. Here we applied graph theory to construct large-scale cortical functional networks from cerebral hemodynamic (fMRI) responses acquired during an auditory pitch discrimination task and found that such network properties were associated with participants’ future success in learning words of an artificial spoken language. Successful learners possessed networks with reduced local efficiency but increased global efficiency relative to less successful learners and had a more cost-efficient network organization. Regionally, successful and less successful learners exhibited differences in these network properties spanning bilateral prefrontal, parietal, and right temporal cortex, overlapping a core network of auditory language areas. These results suggest that efficient cortical network organization is associated with sound-to-word learning abilities among healthy, younger adults. PMID:22360625
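
    The global and local efficiency measures named in this abstract are straightforward to compute with standard graph libraries. The sketch below builds a thresholded synthetic "functional network" and evaluates both measures with networkx; the connectivity matrix, threshold, and network size are arbitrary assumptions rather than values from the study.

```python
# Minimal sketch of the global/local efficiency measures named above, computed
# on a thresholded synthetic connectivity matrix (all parameters assumed).
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
n_regions = 30
# symmetric correlation-like matrix standing in for fMRI functional connectivity
corr = rng.uniform(-1.0, 1.0, (n_regions, n_regions))
corr = (corr + corr.T) / 2.0
np.fill_diagonal(corr, 0.0)

threshold = 0.5  # keep only strong positive couplings (assumed cutoff)
adj = (corr > threshold).astype(int)
G = nx.from_numpy_array(adj)

print("global efficiency:", nx.global_efficiency(G))
print("local efficiency:", nx.local_efficiency(G))
```

    In an actual study one would typically sweep the threshold (or, equivalently, network cost) and compare groups across the whole range rather than fixing a single cutoff, which is how cost-efficiency analyses of the kind described above are usually run.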