Cui, Chenchen; Song, Yujie; Liu, Jun; Ge, Hengtao; Li, Qian; Huang, Hui; Hu, Linyong; Zhu, Hongmei; Jin, Yaping; Zhang, Yong
2015-01-01
β-Lactoglobulin (BLG) is a major goat’s milk allergen that is absent in human milk. Engineered endonucleases, including transcription activator-like effector nucleases (TALENs) and zinc-finger nucleases, enable targeted genetic modification in livestock. In this study, TALEN-mediated gene knockout followed by gene knock-in was used to generate BLG knockout goats as mammary gland bioreactors for large-scale production of human lactoferrin (hLF). We introduced precise genetic modifications in the goat genome at frequencies of approximately 13.6% and 6.09% for the first and second sequential targeting, respectively, by using targeting vectors that underwent TALEN-induced homologous recombination (HR). Analysis of milk from the cloned goats revealed large-scale hLF expression and/or decreased BLG levels in milk from heterozygous goats, as well as the absence of BLG in milk from homozygous goats. Furthermore, the TALEN-mediated targeting events in somatic cells can be transmitted through the germline after somatic cell nuclear transfer (SCNT). Our results suggest that gene targeting via TALEN-induced HR may expedite the production of genetically engineered livestock for agriculture and biomedicine. PMID:25994151
Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua
2016-02-25
The development of modern omics technology has not significantly improved the efficiency of drug development; precise and targeted drug discovery remains an unsolved problem. Here, a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner in which HA-cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.
Problem of Forming in a Man-Operator a Habit of Tracking a Moving Target
Cybernetics stimulated the large-scale use of the method of functional analogy, which makes it possible to compare technical and human activity systems... interesting and highly efficient human activity because of the psychological control factor involved in its operation. The human tracking system is
Lunga, Dalton D.; Yang, Hsiuhan Lexie; Reith, Andrew E.; ...
2018-02-06
Satellite imagery often exhibits large spatial-extent areas that encompass object classes with considerable variability, which limits large-scale model generalization with machine learning algorithms. Notably, acquisition conditions, including dates, sensor position, lighting condition, and sensor types, often translate into class distribution shifts, introducing complex nonlinear factors that hamper the potential impact of machine learning classifiers. This article investigates the challenge of exploiting satellite images using convolutional neural networks (CNNs) for settlement classification where the class distribution shifts are significant. We present a large-scale human settlement mapping workflow based on multiple modules to adapt a pretrained CNN to address the negative impact of distribution shift on classification performance. To extend a locally trained classifier onto large spatial extents, we introduce several submodules: first, a human-in-the-loop element for relabeling of misclassified target-domain samples to generate representative examples for model adaptation; second, an efficient hashing module to minimize redundancy and noisy samples from the mass-selected examples; and third, a novel relevance ranking module to minimize the dominance of source examples on the target domain. The workflow presents a novel and practical approach to achieving large-scale domain adaptation with binary classifiers based on CNN features. Experimental evaluations are conducted on areas of interest that encompass various image characteristics, including multisensor, multitemporal, and multiangular conditions. Domain adaptation is assessed on source–target pairs through the transfer loss and transfer ratio metrics to illustrate the utility of the workflow.
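The hashing submodule's job, dropping redundant samples from the mass-selected examples, can be sketched as bucketing coarsely quantized feature vectors and keeping one sample per bucket. The quantization scheme below is an illustrative assumption, not the method used in the paper:

```python
def dedup_by_hash(feature_vectors, ndigits=1):
    """Drop near-duplicate samples by hashing coarsely quantized feature
    vectors: vectors that fall in the same quantization bucket are kept
    only once. The rounding-based quantizer is an illustrative stand-in
    for the paper's hashing module."""
    seen = set()
    kept = []
    for vec in feature_vectors:
        key = tuple(round(v, ndigits) for v in vec)  # hashable bucket id
        if key not in seen:
            seen.add(key)
            kept.append(vec)
    return kept
```

Keeping the first representative per bucket makes the pass linear in the number of samples, which matters when the selected example pool is large.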
Large-Scale Analysis of Network Bistability for Human Cancers
Shiraishi, Tetsuya; Matsuyama, Shinako; Kitano, Hiroaki
2010-01-01
Protein–protein interaction and gene regulatory networks are likely to be locked in a state corresponding to a disease by the behavior of one or more bistable circuits exhibiting switch-like behavior. Sets of genes could be over-expressed or repressed when anomalies due to disease appear, and the circuits responsible for this over- or under-expression might persist for as long as the disease state continues. This paper shows how a large-scale analysis of network bistability for various human cancers can identify genes that can potentially serve as drug targets or diagnosis biomarkers. PMID:20628618
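The switch-like behavior the authors analyze can be illustrated with a toy self-activating gene circuit: production follows a Hill function, degradation is linear, and for suitable parameters the rate equation has three fixed points (an off state, an unstable threshold, and an on state). This is a minimal sketch with illustrative parameter values, not the network analysis performed in the paper:

```python
def hill_feedback(x, a=2.0, K=1.0, n=4, d=1.0):
    """Rate of change for a self-activating gene: Hill-function
    production plus first-order decay. Parameters are illustrative."""
    return a * x**n / (K**n + x**n) - d * x

def find_fixed_points(f, lo=0.0, hi=3.0, steps=3000):
    """Locate roots of f on [lo, hi] by scanning a grid for sign
    changes, refining each bracket by bisection."""
    roots = []
    xs = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    for x0, x1 in zip(xs, xs[1:]):
        f0, f1 = f(x0), f(x1)
        if f0 == 0.0:
            roots.append(x0)
        elif f0 * f1 < 0:
            for _ in range(60):  # bisection refinement
                mid = 0.5 * (x0 + x1)
                if f(x0) * f(mid) <= 0:
                    x1 = mid
                else:
                    x0 = mid
            roots.append(0.5 * (x0 + x1))
    return roots

# Three fixed points: the outer two are stable, the middle one is the
# unstable threshold separating the "off" and "on" basins.
fixed_points = find_fixed_points(hill_feedback)
```

A circuit locked in the high fixed point stays there until a large enough perturbation pushes it across the threshold, which is the persistence property the paper exploits to flag disease-associated genes.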
Viral Organization of Human Proteins
Wuchty, Stefan; Siwo, Geoffrey; Ferdig, Michael T.
2010-01-01
Although maps of intracellular interactions are increasingly well characterized, little is known about large-scale maps of host-pathogen protein interactions. The investigation of host-pathogen interactions can reveal features of pathogenesis and provide a foundation for the development of drugs and disease prevention strategies. A compilation of experimentally verified interactions between HIV-1 and human proteins and a set of HIV-dependency factors (HDF) allowed insights into the topology and intricate interplay between viral and host proteins on a large scale. We found that targeted and HDF proteins appear predominantly in rich-clubs, groups of human proteins that are strongly intertwined among each other. These assemblies of proteins may serve as an infection gateway, allowing the virus to take control of the human host by reaching protein pathways and diversified cellular functions in a pronounced and focused way. Particular transcription factors and protein kinases facilitate indirect interactions between HDFs and viral proteins. Discerning the entanglement of directly targeted and indirectly interacting proteins may uncover molecular and functional sites that can provide novel perspectives on the progression of HIV infection and highlight new avenues to fight this virus. PMID:20827298
Latzman, Robert D; Sauvigné, Katheryn C; Hopkins, William D
2016-06-01
There is a growing interest in the study of personality in chimpanzees with repeated findings of a similar structure of personality in apes to that found in humans. To date, however, the direct translational value of instruments used to assess chimpanzee personality to humans has yet to be explicitly tested. As such, in the current study we sought to determine the transportability of factor analytically-derived chimpanzee personality scales to humans in a large human sample (N = 301). Human informants reporting on target individuals they knew well completed chimpanzee-derived and human-derived measures of personality from the two most widely studied models of human personality: Big Five and Big Three. The correspondence between informant-reported chimpanzee- and human-derived personality scales was then investigated. Results indicated high convergence for corresponding scales across most chimpanzee- and human-derived personality scales. Findings from the current study provide evidence that chimpanzee-derived scales translate well to humans and operate quite similarly to the established human-derived personality scales in a human sample. This evidence of transportability lends support to the translational nature of chimpanzee personality research suggesting clear relevance of this growing literature to humans. Am. J. Primatol. 78:601-609, 2016. © 2015 Wiley Periodicals, Inc.
Academic-industrial partnerships in drug discovery in the age of genomics.
Harris, Tim; Papadopoulos, Stelios; Goldstein, David B
2015-06-01
Many US FDA-approved drugs have been developed through productive interactions between the biotechnology industry and academia. Technological breakthroughs in genomics, in particular large-scale sequencing of human genomes, are creating new opportunities to understand the biology of disease and to identify high-value targets relevant to a broad range of disorders. However, the scale of the work required to appropriately analyze large genomic and clinical data sets is challenging industry to develop a broader view of what areas of work constitute precompetitive research. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Mapping of Drug Space from the Viewpoint of Small Molecule Metabolism
Basuino, Li; Chambers, Henry F.; Lee, Deok-Sun; Wiest, Olaf G.; Babbitt, Patricia C.
2009-01-01
Small molecule drugs target many core metabolic enzymes in humans and pathogens, often mimicking endogenous ligands. The effects may be therapeutic or toxic, but are frequently unexpected. A large-scale mapping of the intersection between drugs and metabolism is needed to better guide drug discovery. To map the intersection between drugs and metabolism, we have grouped drugs and metabolites by their associated targets and enzymes using ligand-based set signatures created to quantify their degree of similarity in chemical space. The results reveal the chemical space that has been explored for metabolic targets, where successful drugs have been found, and what novel territory remains. To aid other researchers in their drug discovery efforts, we have created an online resource of interactive maps linking drugs to metabolism. These maps predict the “effect space” comprising likely target enzymes for each of the 246 MDDR drug classes in humans. The online resource also provides species-specific interactive drug-metabolism maps for each of the 385 model organisms and pathogens in the BioCyc database collection. Chemical similarity links between drugs and metabolites predict potential toxicity, suggest routes of metabolism, and reveal drug polypharmacology. The metabolic maps enable interactive navigation of the vast biological data on potential metabolic drug targets and the drug chemistry currently available to prosecute those targets. Thus, this work provides a large-scale approach to ligand-based prediction of drug action in small molecule metabolism. PMID:19701464
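The ligand-based set signatures in this work quantify similarity in chemical space; at their core is a Tanimoto (Jaccard) comparison between fingerprint feature sets. A minimal sketch, assuming fingerprints are represented as sets of feature identifiers (the names and the threshold below are illustrative, not taken from the paper):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two fingerprint sets:
    |intersection| / |union|."""
    if not fp_a and not fp_b:
        return 0.0
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

def link_drugs_to_metabolites(drugs, metabolites, threshold=0.5):
    """Connect each drug to the metabolites whose fingerprints are
    similar enough, mirroring the chemical-similarity links that the
    mapping draws between drug space and metabolism."""
    links = {}
    for d_name, d_fp in drugs.items():
        links[d_name] = [m for m, m_fp in metabolites.items()
                         if tanimoto(d_fp, m_fp) >= threshold]
    return links
```

Such similarity links are what let the maps suggest likely metabolic targets and potential routes of metabolism for a drug class.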
Towards large scale multi-target tracking
NASA Astrophysics Data System (ADS)
Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus
2014-06-01
Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.
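A rough illustration of why Bernoulli-component filters scale: each hypothesized target carries an existence probability that is predicted, updated, and pruned independently, so the cost of these steps grows linearly with the number of components. This is a heavily simplified sketch, not the labeled multi-Bernoulli filter of the paper; the kinematic state update and data association are omitted, and the constants are illustrative:

```python
P_SURVIVE = 0.99      # illustrative survival probability
PRUNE_THRESH = 0.01   # illustrative pruning threshold

def predict(components):
    """Each component is (existence_prob, state). Prediction discounts
    existence by the survival probability."""
    return [(r * P_SURVIVE, x) for r, x in components]

def update(components, detected_idx, p_detect=0.9):
    """Missed detections lower the existence probability via the
    standard Bernoulli missed-detection update; a detection is
    (simplistically) taken to confirm existence."""
    out = []
    for i, (r, x) in enumerate(components):
        if i in detected_idx:
            out.append((1.0, x))
        else:
            r_miss = r * (1 - p_detect) / (1 - r * p_detect)
            out.append((r_miss, x))
    return out

def prune(components):
    """Discard components unlikely to correspond to a real target."""
    return [(r, x) for r, x in components if r > PRUNE_THRESH]
```

Because pruning keeps the component count bounded, a filter of this family can sustain very large target counts, which is the scaling property the article demonstrates.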
Functional annotation of HOT regions in the human genome: implications for human disease and cancer
Li, Hao; Chen, Hebing; Liu, Feng; Ren, Chao; Wang, Shengqi; Bo, Xiaochen; Shu, Wenjie
2015-01-01
Advances in genome-wide association studies (GWAS) and large-scale sequencing studies have resulted in an impressive and growing list of disease- and trait-associated genetic variants. Most studies have emphasised the discovery of genetic variation in coding sequences; however, the noncoding regulatory effects responsible for human disease and cancer biology have been substantially understudied. To better characterise the cis-regulatory effects of noncoding variation, we performed a comprehensive analysis of the genetic variants in HOT (high-occupancy target) regions, which are considered to be one of the most intriguing findings of recent large-scale sequencing studies. We observed that GWAS variants that map to HOT regions undergo a substantial net decrease and exhibit development-specific localisation during haematopoiesis. Additionally, genetic risk variants are disproportionately enriched in HOT regions compared with LOT (low-occupancy target) regions in both disease-relevant and cancer cells. Importantly, this enrichment is biased toward disease- or cancer-specific cell types. Furthermore, we observed that cancer cells generally acquire cancer-specific HOT regions at oncogenes through diverse mechanisms of cancer pathogenesis. Collectively, our findings demonstrate the key roles of HOT regions in human disease and cancer and represent a critical step toward further understanding disease biology, diagnosis, and therapy. PMID:26113264
Keates, Tracy; Cooper, Christopher D O; Savitsky, Pavel; Allerston, Charles K; Phillips, Claire; Hammarström, Martin; Daga, Neha; Berridge, Georgina; Mahajan, Pravin; Burgess-Brown, Nicola A; Müller, Susanne; Gräslund, Susanne; Gileadi, Opher
2012-06-15
The generation of affinity reagents to large numbers of human proteins depends on the ability to express the target proteins as high-quality antigens. The Structural Genomics Consortium (SGC) focuses on the production and structure determination of human proteins. In a 7-year period, the SGC has deposited crystal structures of >800 human protein domains, and has additionally expressed and purified a similar number of protein domains that have not yet been crystallised. The targets include a diversity of protein domains, with an attempt to provide high coverage of protein families. The family approach provides an excellent basis for characterising the selectivity of affinity reagents. We present a summary of the approaches used to generate purified human proteins or protein domains, a test case demonstrating the ability to rapidly generate new proteins, and an optimisation study on the modification of >70 proteins by biotinylation in vivo. These results provide a unique synergy between large-scale structural projects and the recent efforts to produce a wide coverage of affinity reagents to the human proteome. Copyright © 2011 Elsevier B.V. All rights reserved.
Remote Imaging Applied to Schistosomiasis Control: The Anning River Project
NASA Technical Reports Server (NTRS)
Seto, Edmund Y. W.; Maszle, Don R.; Spear, Robert C.; Gong, Peng
1997-01-01
The use of satellite imaging to remotely detect areas of high risk for transmission of infectious disease is an appealing prospect for large-scale monitoring of these diseases. The detection of large-scale environmental determinants of disease risk, often called landscape epidemiology, has been motivated by several authors (Pavlovsky 1966; Meade et al. 1988). The basic notion is that large-scale factors such as population density, air temperature, hydrological conditions, soil type, and vegetation can determine in a coarse fashion the local conditions contributing to disease vector abundance and human contact with disease agents. These large-scale factors can often be remotely detected by sensors or cameras mounted on satellite or aircraft platforms and can thus be used in a predictive model to mark high-risk areas of transmission and to target control or monitoring efforts. A review of satellite technologies for this purpose was recently presented by Washino and Wood (1994), Hay (1997), and Hay et al. (1997).
Experimental design and quantitative analysis of microbial community multiomics.
Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis
2017-11-30
Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.
Multiplex amplification of large sets of human exons.
Porreca, Gregory J; Zhang, Kun; Li, Jin Billy; Xie, Bin; Austin, Derek; Vassallo, Sara L; LeProust, Emily M; Peck, Bill J; Emig, Christopher J; Dahl, Fredrik; Gao, Yuan; Church, George M; Shendure, Jay
2007-11-01
A new generation of technologies is poised to reduce DNA sequencing costs by several orders of magnitude. But our ability to fully leverage the power of these technologies is crippled by the absence of suitable 'front-end' methods for isolating complex subsets of a mammalian genome at a scale that matches the throughput at which these platforms will routinely operate. We show that targeting oligonucleotides released from programmable microarrays can be used to capture and amplify approximately 10,000 human exons in a single multiplex reaction. Additionally, we show integration of this protocol with ultra-high-throughput sequencing for targeted variation discovery. Although the multiplex capture reaction is highly specific, we found that nonuniform capture is a key issue that will need to be resolved by additional optimization. We anticipate that highly multiplexed methods for targeted amplification will enable the comprehensive resequencing of human exons at a fraction of the cost of whole-genome resequencing.
Multiscale factors affecting human attitudes toward snow leopards and wolves.
Suryawanshi, Kulbhushansingh R; Bhatia, Saloni; Bhatnagar, Yash Veer; Redpath, Stephen; Mishra, Charudutt
2014-12-01
The threat posed by large carnivores to livestock and humans makes peaceful coexistence between them difficult. Effective implementation of conservation laws and policies depends on the attitudes of local residents toward the target species. There are many known correlates of human attitudes toward carnivores, but they have only been assessed at the scale of the individual. Because human societies are organized hierarchically, attitudes are presumably influenced by different factors at different scales of social organization, but this scale dependence has not been examined. We used structured interview surveys to quantitatively assess the attitudes of a Buddhist pastoral community toward snow leopards (Panthera uncia) and wolves (Canis lupus). We interviewed 381 individuals from 24 villages within 6 study sites across the high-elevation Spiti Valley in the Indian Trans-Himalaya. We gathered information on key explanatory variables that together captured variation in individual and village-level socioeconomic factors. We used hierarchical linear models to examine how the effect of these factors on human attitudes changed with the scale of analysis from the individual to the community. Factors significant at the individual level were gender, education, and age of the respondent (for wolves and snow leopards), number of income sources in the family (wolves), agricultural production, and large-bodied livestock holdings (snow leopards). At the community level, the significant factors included the number of smaller-bodied herded livestock killed by wolves and mean agricultural production (wolves) and village size and large livestock holdings (snow leopards). Our results show that scaling up from the individual to higher levels of social organization can highlight important factors that influence attitudes of people toward wildlife and toward formal conservation efforts in general. 
Such scale-specific information can help managers apply conservation measures at appropriate scales. Our results reiterate the need for conflict management programs to be multipronged. © 2014 Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Chan, YinThai
2016-03-01
Colloidal semiconductor nanocrystals are ideal fluorophores for clinical diagnostics, therapeutics, and highly sensitive biochip applications due to their high photostability, size-tunable color of emission and flexible surface chemistry. The relatively recent development of core-seeded semiconductor nanorods showed that the presence of a rod-like shell can confer even more advantageous physicochemical properties than their spherical counterparts, such as large multi-photon absorption cross-sections and facet-specific chemistry that can be exploited to deposit secondary nanoparticles. It may be envisaged that these highly fluorescent nanorods can be integrated with large-scale integrated (LSI) microfluidic systems that allow miniaturization and integration of multiple biochemical processes in a single device at the nanoliter scale, resulting in a highly sensitive and automated detection platform. In this talk, I will describe an LSI microfluidic device that integrates RNA extraction, reverse transcription to cDNA, amplification and target pull-down to detect histidine decarboxylase (HDC) gene directly from human white blood cells samples. When anisotropic colloidal semiconductor nanorods (NRs) were used as the fluorescent readout, the detection limit was found to be 0.4 ng of total RNA, which was much lower than that obtained using spherical quantum dots (QDs) or organic dyes. This was attributed to the large action cross-section of NRs and their high probability of target capture in a pull-down detection scheme. The combination of large-scale integrated microfluidics with highly fluorescent semiconductor NRs may find widespread utility in point-of-care devices and multi-target diagnostics.
Ai, Haixin; Wu, Xuewei; Qi, Mengyuan; Zhang, Li; Hu, Huan; Zhao, Qi; Zhao, Jian; Liu, Hongsheng
2018-06-01
In recent years, new strains of influenza virus, such as H7N9, H10N8, H5N6, and H5N8, have continued to emerge, creating an urgent need both for new anti-influenza drugs and for accurate, efficient large-scale inhibitor screening methods. In this study, we focused on six influenza virus proteins that could serve as anti-influenza drug targets: neuraminidase (NA), hemagglutinin (HA), matrix protein 1 (M1), the M2 proton channel (M2), nucleoprotein (NP), and non-structural protein 1 (NS1). Structure-based molecular docking was utilized to identify potential inhibitors for these drug targets from 13,144 compounds in the Traditional Chinese Medicine Systems Pharmacology Database and Analysis Platform. The results showed that 56 compounds could inhibit more than two drug targets simultaneously. Further, we utilized reverse docking to study the interaction of these compounds with host targets, finding that 22 of the compound inhibitors could bind stably to host targets with high binding free energies. The results showed that the Chinese herbal medicines had a multi-target effect: they could inhibit influenza virus directly through the targeted viral proteins and indirectly through the human target proteins. This method is of great value for large-scale virtual screening of new anti-influenza compounds.
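The selection step described above, keeping compounds whose docking scores pass a cutoff for two or more viral targets, can be sketched as a simple filter over a score table. The compound names, scores, and cutoff below are illustrative assumptions, not data from the study:

```python
def multi_target_hits(scores, threshold=-7.0, min_targets=2):
    """Select compounds whose docking score beats the threshold (more
    negative = stronger predicted binding) for at least `min_targets`
    viral protein targets. Threshold and score units are illustrative."""
    hits = {}
    for compound, per_target in scores.items():
        good = [t for t, s in per_target.items() if s <= threshold]
        if len(good) >= min_targets:
            hits[compound] = sorted(good)
    return hits

# Hypothetical docking scores (kcal/mol-style, illustrative only).
docking = {
    "quercetin":  {"NA": -8.1, "HA": -6.2, "M2": -7.4},
    "compound_b": {"NA": -5.0, "HA": -5.5, "M2": -6.1},
}
```

Running the same filter over both the viral-target and host-target score tables would reproduce the direct/indirect two-level selection the abstract describes.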
de Groot, Reinoud; Lüthi, Joel; Lindsay, Helen; Holtackers, René; Pelkmans, Lucas
2018-01-23
High-content imaging using automated microscopy and computer vision allows multivariate profiling of single-cell phenotypes. Here, we present methods for the application of the CRISPR-Cas9 system in large-scale, image-based, gene perturbation experiments. We show that CRISPR-Cas9-mediated gene perturbation can be achieved in human tissue culture cells in a timeframe that is compatible with image-based phenotyping. We developed a pipeline to construct a large-scale arrayed library of 2,281 sequence-verified CRISPR-Cas9 targeting plasmids and profiled this library for genes affecting cellular morphology and the subcellular localization of components of the nuclear pore complex (NPC). We conceived a machine-learning method that harnesses genetic heterogeneity to score gene perturbations and identify phenotypically perturbed cells for in-depth characterization of gene perturbation effects. This approach enables genome-scale image-based multivariate gene perturbation profiling using CRISPR-Cas9. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.
Mobile element biology – new possibilities with high-throughput sequencing
Xing, Jinchuan; Witherspoon, David J.; Jorde, Lynn B.
2014-01-01
Mobile elements compose more than half of the human genome, but until recently their large-scale detection was time-consuming and challenging. With the development of new high-throughput sequencing technologies, the complete spectrum of mobile element variation in humans can now be identified and analyzed. Thousands of new mobile element insertions have been discovered, yielding new insights into mobile element biology, evolution, and genomic variation. We review several high-throughput methods, with an emphasis on techniques that specifically target mobile element insertions in humans, and we highlight recent applications of these methods in evolutionary studies and in the analysis of somatic alterations in human cancers. PMID:23312846
Orthographic and Phonological Neighborhood Databases across Multiple Languages.
Marian, Viorica
2017-01-01
The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently-available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
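The neighborhood measure at the heart of databases like the one described above can be illustrated in a few lines. This is a minimal sketch assuming the conventional one-substitution definition of an orthographic neighbor (Coltheart's N); the function names and the toy lexicon are illustrative, not CLEARPOND's actual data or API.

```python
# Sketch: orthographic neighborhood computation. A neighbor is conventionally
# a word of the same length differing in exactly one letter position.

def is_neighbor(a: str, b: str) -> bool:
    """True if the words have equal length and differ in exactly one position."""
    if len(a) != len(b) or a == b:
        return False
    return sum(x != y for x, y in zip(a, b)) == 1

def neighborhood(word: str, lexicon: list[str]) -> list[str]:
    """All orthographic neighbors of `word` within `lexicon`."""
    return [w for w in lexicon if is_neighbor(word, w)]

lexicon = ["cat", "bat", "cot", "cap", "dog", "cart"]
print(neighborhood("cat", lexicon))  # → ['bat', 'cot', 'cap']
```

The same comparison applied to phonemic transcriptions rather than letters yields phonological neighborhoods, and running it over two languages' lexicons gives the cross-linguistic counts the abstract discusses.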
Spatiotemporal property and predictability of large-scale human mobility
NASA Astrophysics Data System (ADS)
Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin
2018-04-01
Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential for human behavior prediction and recommendation, and for control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
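The two ingredients named in the abstract, preferential return and exploration, can be sketched as a simple simulation. The exploration probability form `rho * S**(-gamma)`, the parameter values, and the Gaussian draw for the per-individual exploration tendency are illustrative assumptions, not the authors' calibrated model.

```python
import random

def simulate_individual(steps, rho=0.6, gamma=0.2, rng=None):
    """Toy preferential-return/exploration walk over abstract location ids."""
    rng = rng or random.Random(0)
    visits = {0: 1}                  # location id -> visit count
    next_new = 1
    for _ in range(steps):
        s = len(visits)              # distinct locations visited so far
        if rng.random() < rho * s ** (-gamma):
            visits[next_new] = 1     # explore a brand-new location
            next_new += 1
        else:
            # preferential return: revisit a location weighted by past visits
            locs, counts = zip(*visits.items())
            visits[rng.choices(locs, weights=counts)[0]] += 1
    return visits

# Per-individual heterogeneity: exploration tendency drawn from a Gaussian,
# clipped to [0, 1], echoing the abstract's distributional assumption.
rho_i = min(max(random.Random(42).gauss(0.6, 0.1), 0.0), 1.0)
trajectory = simulate_individual(1000, rho=rho_i)
print(len(trajectory), "distinct locations visited in 1000 moves")
```

Preferential return makes a few locations accumulate most visits, which is what produces the heavy-tailed visitation statistics such models are built to reproduce.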
Bioinformatics by Example: From Sequence to Target
NASA Astrophysics Data System (ADS)
Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj
2002-12-01
With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The presence of these additional bioinformatics tools stirs great anxiety among experimental researchers (as well as pedagogues), since they are now faced with a wider and deeper knowledge base spanning differing disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets those individuals who are interested in using computational methods in their teaching or research. By analyzing a real-life, pharmaceutical, multicomponent, target-based example, the reader will experience this fascinating new discipline.
Korte, Andrew R.; Stopka, Sylwia A.; Morris, Nicholas; ...
2016-07-11
The unique challenges presented by metabolomics have driven the development of new mass spectrometry (MS)-based techniques for small molecule analysis. We have previously demonstrated silicon nanopost arrays (NAPA) to be an effective substrate for laser desorption ionization (LDI) of small molecules for MS. However, the utility of NAPA-LDI-MS for a wide range of metabolite classes has not been investigated. Here we apply NAPA-LDI-MS to the large-scale acquisition of high-resolution mass spectra and tandem mass spectra from a collection of metabolite standards covering a range of compound classes including amino acids, nucleotides, carbohydrates, xenobiotics, lipids, and other classes. In untargeted analysis of metabolite standard mixtures, detection was achieved for 374 compounds and useful MS/MS spectra were obtained for 287 compounds, without individual optimization of ionization or fragmentation conditions. Metabolite detection was evaluated in the context of 31 metabolic pathways, and NAPA-LDI-MS was found to provide detection for 63% of investigated pathway metabolites. Individual, targeted analysis of the 20 common amino acids provided detection of 100% of the investigated compounds, demonstrating that improved coverage is possible through optimization and targeting of individual analytes or analyte classes. In direct analysis of aqueous and organic extracts from human serum samples, spectral features were assigned to a total of 108 small metabolites and lipids. Glucose and amino acids were quantitated within their physiological concentration ranges. Finally, the broad coverage demonstrated by this large-scale screening experiment opens the door for use of NAPA-LDI-MS in numerous metabolite analysis applications.
A scalable strategy for high-throughput GFP tagging of endogenous human proteins.
Leonetti, Manuel D; Sekine, Sayaka; Kamiyama, Daichi; Weissman, Jonathan S; Huang, Bo
2016-06-21
A central challenge of the postgenomic era is to comprehensively characterize the cellular role of the ∼20,000 proteins encoded in the human genome. To systematically study protein function in a native cellular background, libraries of human cell lines expressing proteins tagged with a functional sequence at their endogenous loci would be very valuable. Here, using electroporation of Cas9 nuclease/single-guide RNA ribonucleoproteins and taking advantage of a split-GFP system, we describe a scalable method for the robust, scarless, and specific tagging of endogenous human genes with GFP. Our approach requires no molecular cloning and allows a large number of cell lines to be processed in parallel. We demonstrate the scalability of our method by targeting 48 human genes and show that the resulting GFP fluorescence correlates with protein expression levels. We next present how our protocols can be easily adapted for the tagging of a given target with GFP repeats, critically enabling the study of low-abundance proteins. Finally, we show that our GFP tagging approach allows the biochemical isolation of native protein complexes for proteomic studies. Taken together, our results pave the way for the large-scale generation of endogenously tagged human cell lines for the proteome-wide analysis of protein localization and interaction networks in a native cellular context.
ERIC Educational Resources Information Center
Janssen, Rianne; Crauwels, Marion
2011-01-01
A large-scale paper-and-pencil assessment of the attainment targets of environmental studies with a focus on the subject area nature was held in primary education in Flanders (Belgium). The tests on different subfields of nature, i.e. the human body, healthcare, organisms, ecosystems, environmental care and non-living nature, were administered to…
Centrifuge impact cratering experiments: Scaling laws for non-porous targets
NASA Technical Reports Server (NTRS)
Schmidt, Robert M.
1987-01-01
A geotechnical centrifuge was used to investigate large body impacts onto planetary surfaces. At elevated gravity, it is possible to match various dimensionless similarity parameters which were shown to govern large scale impacts. Observations of crater growth and target flow fields have provided detailed and critical tests of a complete and unified scaling theory for impact cratering. Scaling estimates were determined for nonporous targets. Scaling estimates for large scale cratering in rock proposed previously by others have assumed that the crater radius is proportional to powers of the impactor energy and gravity, with no additional dependence on impact velocity. The size scaling laws determined from ongoing centrifuge experiments differ from earlier ones in three respects. First, a distinct dependence on impact velocity is recognized, even at constant impactor energy. Second, the present energy exponent for low porosity targets, like competent rock, is lower than earlier estimates. Third, the gravity exponent is recognized here as being related to both the energy and the velocity exponents.
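The kind of pi-group scaling used in such centrifuge work can be sketched numerically. The gravity-scaled size pi_2 = g·a/U² and a power-law coupling to the scaled radius are standard forms; the constant K1, the exponent beta, and the densities below are illustrative placeholders, not the paper's fitted values for nonporous targets.

```python
import math

def crater_radius(a, U, g, rho_i=3000.0, rho_t=2700.0, K1=1.0, beta=0.22):
    """Gravity-regime crater radius (m) for impactor radius a (m), speed U
    (m/s), surface gravity g (m/s^2), and densities (kg/m^3). Sketch only."""
    m = rho_i * (4.0 / 3.0) * math.pi * a ** 3   # impactor mass
    pi_2 = g * a / U ** 2                        # gravity-scaled size
    pi_R = K1 * pi_2 ** (-beta)                  # assumed power-law scaling
    return pi_R * (m / rho_t) ** (1.0 / 3.0)

# Equal kinetic energy, different velocity (m ~ a^3, so a -> 4^(1/3)*a when
# U is halved): the radii differ, illustrating the abstract's point that
# energy and gravity alone do not determine crater size.
fast = crater_radius(a=1.0, U=4000.0, g=1.62)
slow = crater_radius(a=4.0 ** (1.0 / 3.0), U=2000.0, g=1.62)
print(fast, slow)
```

Because pi_R decreases with pi_2, the model also reproduces the centrifuge rationale itself: raising g at fixed impactor conditions shrinks the scaled crater, letting small laboratory shots stand in for large planetary impacts.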
Investigators from the National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) who comprehensively analyzed 95 human colorectal tumor samples, have determined how gene alterations identified in previous analyses of the same samples are expressed at the protein level. The integration of proteomic and genomic data, or proteogenomics, provides a more comprehensive view of the biological features that drive cancer than genomic analysis alone and may help identify the most important targets for cancer detection and intervention.
A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.
Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E
2015-01-01
One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.
Enabling Large-Scale Design, Synthesis and Validation of Small Molecule Protein-Protein Antagonists
Koes, David; Khoury, Kareem; Huang, Yijun; Wang, Wei; Bista, Michal; Popowicz, Grzegorz M.; Wolf, Siglinde; Holak, Tad A.; Dömling, Alexander; Camacho, Carlos J.
2012-01-01
Although there is no shortage of potential drug targets, there are only a handful known low-molecular-weight inhibitors of protein-protein interactions (PPIs). One problem is that current efforts are dominated by low-yield high-throughput screening, whose rigid framework is not suitable for the diverse chemotypes present in PPIs. Here, we developed a novel pharmacophore-based interactive screening technology that builds on the role anchor residues, or deeply buried hot spots, have in PPIs, and redesigns these entry points with anchor-biased virtual multicomponent reactions, delivering tens of millions of readily synthesizable novel compounds. Application of this approach to the MDM2/p53 cancer target led to high hit rates, resulting in a large and diverse set of confirmed inhibitors, and co-crystal structures validate the designed compounds. Our unique open-access technology promises to expand chemical space and the exploration of the human interactome by leveraging in-house small-scale assays and user-friendly chemistry to rationally design ligands for PPIs with known structure. PMID:22427896
Large-scale prediction of ADAR-mediated effective human A-to-I RNA editing.
Yao, Li; Wang, Heming; Song, Yuanyuan; Dai, Zhen; Yu, Hao; Yin, Ming; Wang, Dongxu; Yang, Xin; Wang, Jinlin; Wang, Tiedong; Cao, Nan; Zhu, Jimin; Shen, Xizhong; Song, Guangqi; Zhao, Yicheng
2017-08-10
Adenosine-to-inosine (A-to-I) editing by adenosine deaminase acting on RNA (ADAR) proteins is one of the most frequent modifications during post- and co-transcription. To facilitate the assignment of biological functions to specific editing sites, we designed an automatic online platform to annotate A-to-I RNA editing sites in pre-mRNA splicing signals, microRNAs (miRNAs) and miRNA target untranslated regions (3' UTRs) from human (Homo sapiens) high-throughput sequencing data and predict their effects based on large-scale bioinformatic analysis. After analysing a large number of previously reported RNA editing events and high-throughput RNA sequencing data from normal human tissues, >60 000 potentially effective RNA editing events on functional genes were found. The RNA Editing Plus platform is available for free at https://www.rnaeditplus.org/, and we believe our platform, which integrates multiple optimized methods, will facilitate further studies of A-to-I editing-mediated post-transcriptional regulation. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
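The core signal behind A-to-I detection is that sequencers read inosine as guanosine, so candidate sites appear as A (genome) to G (RNA reads) mismatches. The sketch below shows only that core idea with toy data; the thresholds and data structures are illustrative assumptions, and the RNA Editing Plus platform applies far more filtering (known SNPs, strand, splicing context) than shown here.

```python
def candidate_editing_sites(reference, pileups, min_depth=10, min_g_frac=0.1):
    """reference: genomic sequence string; pileups: dict mapping a position
    to the string of RNA read bases observed there. Returns (pos, G fraction)
    for positions that look like A-to-I (A->G) editing candidates."""
    sites = []
    for pos, bases in pileups.items():
        if reference[pos] != "A" or len(bases) < min_depth:
            continue  # only reference adenosines with enough coverage
        g_frac = bases.count("G") / len(bases)
        if g_frac >= min_g_frac:
            sites.append((pos, round(g_frac, 2)))
    return sites

ref = "TTACGGATAG"
pileups = {2: "GGGAAAAAAA", 8: "AAAAAAAAAA"}   # position 2 shows A->G reads
print(candidate_editing_sites(ref, pileups))   # → [(2, 0.3)]
```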
River Food Web Response to Large-Scale Riparian Zone Manipulations
Wootton, J. Timothy
2012-01-01
Conservation programs often focus on select species, leading to management plans based on the autecology of the focal species, but multiple ecosystem components can be affected both by the environmental factors impacting, and the management targeting, focal species. These broader effects can have indirect impacts on target species through the web of interactions within ecosystems. For example, human activity can strongly alter riparian vegetation, potentially impacting both economically-important salmonids and their associated river food web. In an Olympic Peninsula river, Washington state, USA, replicated large-scale riparian vegetation manipulations implemented with the long-term (>40 yr) goal of improving salmon habitat did not affect water temperature, nutrient limitation or habitat characteristics, but reduced canopy cover, causing reduced energy input via leaf litter, increased incident solar radiation (UV and PAR) and increased algal production compared to controls. In response, benthic algae, most insect taxa, and juvenile salmonids increased in manipulated areas. Stable isotope analysis revealed a predominant contribution of algal-derived energy to salmonid diets in manipulated reaches. The experiment demonstrates that riparian management targeting salmonids strongly affects river food webs via changes in the energy base, illustrates how species-based management strategies can have unanticipated indirect effects on the target species via the associated food web, and supports ecosystem-based management approaches for restoring depleted salmonid stocks. PMID:23284786
Pullara, Filippo; Guerrero-Santoro, Jennifer; Calero, Monica; Zhang, Qiangmin; Peng, Ye; Spåhr, Henrik; Kornberg, Guy L.; Cusimano, Antonella; Stevenson, Hilary P.; Santamaria-Suarez, Hugo; Reynolds, Shelley L.; Brown, Ian S.; Monga, Satdarshan P.S.; Van Houten, Bennett; Rapić-Otrin, Vesna; Calero, Guillermo; Levine, Arthur S.
2014-01-01
Expression of recombinant proteins in bacterial or eukaryotic systems often results in aggregation rendering them unavailable for biochemical or structural studies. Protein aggregation is a costly problem for biomedical research. It forces research laboratories and the biomedical industry to search for alternative, more soluble, non-human proteins and limits the number of potential “druggable” targets. In this study we present a highly reproducible protocol that introduces the systematic use of an extensive number of detergents to solubilize aggregated proteins expressed in bacterial and eukaryotic systems. We validate the usefulness of this protocol by solubilizing traditionally difficult human protein targets to milligram quantities and confirm their biological activity. We use this method to solubilize monomeric or multimeric components of multi-protein complexes and demonstrate its efficacy to reconstitute large cellular machines. This protocol works equally well on cytosolic, nuclear and membrane proteins and can be easily adapted to a high throughput format. PMID:23137940
Thogmartin, Wayne E.; Crimmins, Shawn M.; Pearce, Jennie
2014-01-01
Large-scale planning for the conservation of species is often hindered by a poor understanding of factors limiting populations. In regions with declining wildlife populations, it is critical that objective metrics of conservation success are developed to ensure that conservation actions achieve desired results. Using spatially explicit estimates of bird abundance, we evaluated several management alternatives for conserving bird populations in the Prairie Hardwood Transition of the United States. We designed landscapes conserving species at 50% of their current predicted abundance as well as landscapes attempting to achieve species population targets (which often required the doubling of current abundance). Conserving species at reduced (half of current) abundance led to few conservation conflicts. However, because of extensive modification of the landscape to suit human use, strategies for achieving regional population targets for forest bird species would be difficult under even ideal circumstances, and even more so if maintenance of grassland bird populations is also desired. Our results indicated that large-scale restoration of agricultural lands to native grassland and forest habitats may be the most productive conservation action for increasing bird population sizes but the level of landscape transition required to approach target bird population sizes may be societally unacceptable.
Bellamy, Chloe; Altringham, John
2015-01-01
Conservation increasingly operates at the landscape scale. For this to be effective, we need landscape scale information on species distributions and the environmental factors that underpin them. Species records are becoming increasingly available via data centres and online portals, but they are often patchy and biased. We demonstrate how such data can yield useful habitat suitability models, using bat roost records as an example. We analysed the effects of environmental variables at eight spatial scales (500 m - 6 km) on roost selection by eight bat species (Pipistrellus pipistrellus, P. pygmaeus, Nyctalus noctula, Myotis mystacinus, M. brandtii, M. nattereri, M. daubentonii, and Plecotus auritus) using the presence-only modelling software MaxEnt. Modelling was carried out on a selection of 418 data centre roost records from the Lake District National Park, UK. Target group pseudoabsences were selected to reduce the impact of sampling bias. Multi-scale models, combining variables measured at their best performing spatial scales, were used to predict roosting habitat suitability, yielding models with useful predictive abilities. Small areas of deciduous woodland consistently increased roosting habitat suitability, but other habitat associations varied between species and scales. Pipistrellus were positively related to built environments at small scales, and depended on large-scale woodland availability. The other, more specialist, species were highly sensitive to human-altered landscapes, avoiding even small rural towns. The strength of many relationships at large scales suggests that bats are sensitive to habitat modifications far from the roost itself. The fine resolution, large extent maps will aid targeted decision-making by conservationists and planners. We have made available an ArcGIS toolbox that automates the production of multi-scale variables, to facilitate the application of our methods to other taxa and locations. 
Habitat suitability modelling has the potential to become a standard tool for supporting landscape-scale decision-making as relevant data and open source, user-friendly, and peer-reviewed software become widely available.
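The multi-scale variables described above, habitat summarized within buffers of several radii around each site, can be sketched simply. This is a toy illustration assuming a binary landcover grid and square buffers; the grid values, radii, and function name are illustrative, not the authors' ArcGIS toolbox.

```python
def buffer_proportion(grid, row, col, radius):
    """Proportion of 1-cells (e.g. woodland) within a square buffer of the
    given cell radius around (row, col), clipped at the grid edges."""
    cells = [grid[r][c]
             for r in range(max(0, row - radius), min(len(grid), row + radius + 1))
             for c in range(max(0, col - radius), min(len(grid[0]), col + radius + 1))]
    return sum(cells) / len(cells)

woodland = [[0, 0, 0, 0],
            [0, 1, 1, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0]]
# the same candidate roost site summarized at two spatial scales
print(buffer_proportion(woodland, 1, 1, 1),   # ≈ 0.444 at radius 1
      buffer_proportion(woodland, 1, 1, 2))   # 0.25 at radius 2
```

Computing such proportions at each of the eight scales, then keeping the best-performing scale per variable, is the kind of preprocessing that feeds a presence-only model like MaxEnt.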
Lam, Siew Hong; Mathavan, Sinnakarupan; Tong, Yan; Li, Haixia; Karuturi, R. Krishna Murthy; Wu, Yilian; Vega, Vinsensius B.; Liu, Edison T.; Gong, Zhiyuan
2008-01-01
The ability to perform large-scale, expression-based chemogenomics on whole adult organisms, as in invertebrate models (worm and fly), is highly desirable for a vertebrate model but its feasibility and potential has not been demonstrated. We performed expression-based chemogenomics on the whole adult organism of a vertebrate model, the zebrafish, and demonstrated its potential for large-scale predictive and discovery chemical biology. Focusing on two classes of compounds with wide implications to human health, polycyclic (halogenated) aromatic hydrocarbons [P(H)AHs] and estrogenic compounds (ECs), we generated robust prediction models that can discriminate compounds of the same class from those of different classes in two large independent experiments. The robust expression signatures led to the identification of biomarkers for potent aryl hydrocarbon receptor (AHR) and estrogen receptor (ER) agonists, respectively, and were validated in multiple targeted tissues. Knowledge-based data mining of human homologs of zebrafish genes revealed highly conserved chemical-induced biological responses/effects, health risks, and novel biological insights associated with AHR and ER that could be inferred to humans. Thus, our study presents an effective, high-throughput strategy of capturing molecular snapshots of chemical-induced biological states of a whole adult vertebrate that provides information on biomarkers of effects, deregulated signaling pathways, and possible affected biological functions, perturbed physiological systems, and increased health risks. These findings place zebrafish in a strategic position to bridge the wide gap between cell-based and rodent models in chemogenomics research and applications, especially in preclinical drug discovery and toxicology. PMID:18618001
Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis
NASA Astrophysics Data System (ADS)
Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi
2017-03-01
Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate a more quantitatively validated study for the current and future histopathology image analysis field.
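The payoff of synthesis-based evaluation is that the true nucleus mask is known exactly, so overlap metrics can be computed at whole-slide scale without a pathologist. A minimal sketch using the Dice coefficient on toy binary masks (the masks and metric choice are illustrative; the paper's evaluation is richer than a single pixel-level score):

```python
def dice(mask_a, mask_b):
    """Dice coefficient between two binary masks (flat 0/1 sequences)."""
    inter = sum(a & b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / total if total else 1.0

truth = [0, 1, 1, 1, 0, 0, 1, 0]   # synthesized ground-truth nucleus pixels
pred  = [0, 1, 1, 0, 0, 1, 1, 0]   # output of the segmentation under test
print(dice(truth, pred))           # → 0.75
```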
On the relationship between human search strategies, conspicuity, and search performance
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2005-05-01
We determined the relationship between search performance with a limited field of view (FOV) and several scanning- and scene parameters in human observer experiments. The observers (38 trained army scouts) searched through a large search sector for a target (a camouflaged person) on a heath. From trial to trial the target appeared at a different location. With a joystick the observers scanned through a panoramic image (displayed on a PC-monitor) while the scan path was registered. Four conditions were run differing in sensor type (visual or thermal infrared) and window size (large or small). In conditions with a small window size the zoom option could be used. Detection performance was highly dependent on zoom factor and deteriorated when scan speed increased beyond a threshold value. Moreover, the distribution of scan speeds scales with the threshold speed. This indicates that the observers are aware of their limitations and choose a (near) optimal search strategy. We found no correlation between the fraction of detected targets and overall search time for the individual observers, indicating that both are independent measures of individual search performance. Search performance (fraction detected, total search time, time in view for detection) was found to be strongly related to target conspicuity. Moreover, we found the same relationship between search performance and conspicuity for visual and thermal targets. This indicates that search performance can be predicted directly by conspicuity regardless of the sensor type.
NASA Astrophysics Data System (ADS)
van der Bogert, C. H.; Hiesinger, H.; Dundas, C. M.; Krüger, T.; McEwen, A. S.; Zanetti, M.; Robinson, M. S.
2017-12-01
Recent work on dating Copernican-aged craters, using Lunar Reconnaissance Orbiter (LRO) Camera data, re-encountered a curious discrepancy in crater size-frequency distribution (CSFD) measurements that was observed, but not understood, during the Apollo era. For example, at Tycho, Copernicus, and Aristarchus craters, CSFDs of impact melt deposits give significantly younger relative and absolute model ages (AMAs) than impact ejecta blankets, although these two units formed during one impact event, and would ideally yield coeval ages at the resolution of the CSFD technique. We investigated the effects of contrasting target properties on CSFDs and their resultant relative and absolute model ages for coeval lunar impact melt and ejecta units. We counted craters with diameters through the transition from strength- to gravity-scaling on two large impact melt deposits at Tycho and King craters, and we used pi-group scaling calculations to model the effects of differing target properties on final crater diameters for five different theoretical lunar targets. The new CSFD for the large King Crater melt pond bridges the gap between the discrepant CSFDs within a single geologic unit. Thus, the observed trends in the impact melt CSFDs support the occurrence of target property effects, rather than self-secondary and/or field secondary contamination. The CSFDs generated from the pi-group scaling calculations show that targets with higher density and effective strength yield smaller crater diameters than weaker targets, such that the relative ages of the former are lower relative to the latter. Consequently, coeval impact melt and ejecta units will have discrepant apparent ages. Target property differences also affect the resulting slope of the CSFD, with stronger targets exhibiting shallower slopes, so that the final crater diameters may differ more greatly at smaller diameters. 
Besides their application to age dating, the CSFDs may provide additional information about the characteristics of the target. For example, the transition diameter from strength- to gravity-scaling could provide a tool for investigating the relative strengths of different geologic units. The magnitude of the offset between the impact melt and ejecta isochrons may also provide information about the relative target properties and/or exposure/degradation ages of the two units. Robotic or human sampling of coeval units on the Moon could provide a direct test of the importance and magnitude of target property effects on CSFDs.
Van der Bogert, Carolyn H.; Hiesinger, Harald; Dundas, Colin M.; Kruger, T.; McEwen, Alfred S.; Zanetti, Michael; Robinson, Mark S.
2017-01-01
Recent work on dating Copernican-aged craters, using Lunar Reconnaissance Orbiter (LRO) Camera data, re-encountered a curious discrepancy in crater size-frequency distribution (CSFD) measurements that was observed, but not understood, during the Apollo era. For example, at Tycho, Copernicus, and Aristarchus craters, CSFDs of impact melt deposits give significantly younger relative and absolute model ages (AMAs) than impact ejecta blankets, although these two units formed during one impact event, and would ideally yield coeval ages at the resolution of the CSFD technique. We investigated the effects of contrasting target properties on CSFDs and their resultant relative and absolute model ages for coeval lunar impact melt and ejecta units. We counted craters with diameters through the transition from strength- to gravity-scaling on two large impact melt deposits at Tycho and King craters, and we used pi-group scaling calculations to model the effects of differing target properties on final crater diameters for five different theoretical lunar targets. The new CSFD for the large King Crater melt pond bridges the gap between the discrepant CSFDs within a single geologic unit. Thus, the observed trends in the impact melt CSFDs support the occurrence of target property effects, rather than self-secondary and/or field secondary contamination. The CSFDs generated from the pi-group scaling calculations show that targets with higher density and effective strength yield smaller crater diameters than weaker targets, such that the relative ages of the former are lower relative to the latter. Consequently, coeval impact melt and ejecta units will have discrepant apparent ages. Target property differences also affect the resulting slope of the CSFD, with stronger targets exhibiting shallower slopes, so that the final crater diameters may differ more greatly at smaller diameters. 
Besides their application to age dating, the CSFDs may provide additional information about the characteristics of the target. For example, the transition diameter from strength- to gravity-scaling could provide a tool for investigating the relative strengths of different geologic units. The magnitude of the offset between the impact melt and ejecta isochrons may also provide information about the relative target properties and/or exposure/degradation ages of the two units. Robotic or human sampling of coeval units on the Moon could provide a direct test of the importance and magnitude of target property effects on CSFDs.
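The pi-group scaling mentioned above combines a gravity-scaled size (π2) and a strength-scaled size (π3) into a dimensionless crater diameter. The sketch below is a simplified, illustrative combined strength/gravity form, not the paper's fitted model: the constants K1 and mu and the way the two regimes are blended are assumptions for illustration only. It does reproduce the qualitative result in the abstract that stronger targets yield smaller final craters.

```python
import math

def crater_diameter(m, rho_t, g, U, Y, K1=1.0, mu=0.55):
    """Illustrative pi-group crater scaling (simplified combined
    strength/gravity form; K1 and mu are placeholder constants).

    m     -- impactor mass (kg), assumed same density as target
    rho_t -- target density (kg/m^3)
    g     -- surface gravity (m/s^2)
    U     -- impact speed (m/s)
    Y     -- target effective strength (Pa)
    """
    # impactor radius assuming a sphere of target density
    a = (3.0 * m / (4.0 * math.pi * rho_t)) ** (1.0 / 3.0)
    pi2 = 1.61 * g * a / U**2        # gravity-scaled size
    pi3 = Y / (rho_t * U**2)         # strength-scaled size
    pi_D = K1 * (pi2 + pi3 ** ((2.0 + mu) / 2.0)) ** (-mu / (2.0 + mu))
    return pi_D * (m / rho_t) ** (1.0 / 3.0)  # final crater diameter (m)

# A stronger (higher-Y) target gives a smaller crater, as the abstract notes:
d_weak = crater_diameter(m=100.0, rho_t=2000.0, g=1.62, U=10e3, Y=1e4)
d_strong = crater_diameter(m=100.0, rho_t=2000.0, g=1.62, U=10e3, Y=1e7)
```

In the strength regime the π3 term dominates and crater size becomes insensitive to gravity; in the gravity regime the reverse holds, which is why the strength-to-gravity transition diameter carries information about the target, as the abstract suggests.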
Genome-Wide Identification of CBX2 Targets: Insights in the Human Sex Development Network
Eid, Wassim; Opitz, Lennart
2015-01-01
Chromobox homolog 2 (CBX2) is a chromatin modifier that plays an important role in sexual development and its disorders (disorders of sex development [DSD]), yet the exact rank and function of human CBX2 in this pathway remain unclear. Here, we performed large-scale mapping and analysis of in vivo target loci of the protein CBX2 in Sertoli-like NT-2D1 cells, using the DNA adenine methyltransferase identification technique. We identified close to 1600 direct targets for CBX2. Intriguingly, validation of selected candidate genes using qRT-PCR in cells overexpressing CBX2, or in which CBX2 had been knocked down, indicated that several CBX2-responsive genes encode proteins involved in DSD. We further validated these effects on the candidate genes using a mutated CBX2 known to cause DSD in a human patient. Overall, our findings suggest that the role of CBX2 in the sex development cascade is to stimulate the male pathway and concurrently inhibit the female pathway. These data provide fundamental insights into the potential etiology of DSD. PMID:25569159
Mase, Yuri; Ishibashi, Osamu; Ishikawa, Tomoko; Takizawa, Takami; Kiguchi, Kazushige; Ohba, Takashi; Katabuchi, Hidetaka; Takeshita, Toshiyuki; Takizawa, Toshihiro
2012-10-01
MicroRNAs (miRNAs) are noncoding small RNAs that play important roles in a variety of physiological and pathological events. In this study, we performed large-scale profiling of EIF2C2-bound miRNAs in 3 human granulosa-derived cell lines (ie, KGN, HSOGT, and GC1a) by high-throughput sequencing and found that miR-21 accounted for more than 80% of EIF2C2-bound miRNAs, suggesting that it was enriched in the RNA-induced silencing complex (RISC) and played a functional role in human granulosa cell (GC) lines. We also found high expression levels of miR-21 in primary human GCs. Assuming that miR-21 target mRNAs are enriched in RISC, we performed cDNA cloning of EIF2C2-bound mRNAs in KGN cells. We identified COL4A1 mRNA as a miR-21 target in the GC lines. These data suggest that miR-21 is involved in the regulation of the synthesis of COL4A1, a component of the basement membrane surrounding the GC layer and granulosa-embedded extracellular structure.
TARGET Publication Guidelines | Office of Cancer Genomics
Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are
GWASeq: targeted re-sequencing follow up to GWAS.
Salomon, Matthew P; Li, Wai Lok Sibon; Edlund, Christopher K; Morrison, John; Fortini, Barbara K; Win, Aung Ko; Conti, David V; Thomas, Duncan C; Duggan, David; Buchanan, Daniel D; Jenkins, Mark A; Hopper, John L; Gallinger, Steven; Le Marchand, Loïc; Newcomb, Polly A; Casey, Graham; Marjoram, Paul
2016-03-03
For the last decade the conceptual framework of the Genome-Wide Association Study (GWAS) has dominated the investigation of human disease and other complex traits. While GWAS have been successful in identifying a large number of variants associated with various phenotypes, the overall amount of heritability explained by these variants remains small. This raises the question of how best to follow up on a GWAS, localize causal variants accounting for GWAS hits, and, as a consequence, explain more of the so-called "missing" heritability. Advances in high-throughput sequencing technologies now allow for the efficient and cost-effective collection of vast amounts of fine-scale genomic data to complement GWAS. We investigate these issues using a colon cancer dataset. After QC, our data consisted of 1993 cases and 899 controls. Using marginal tests of association, we identify 10 variants distributed among six targeted regions that are significantly associated with colorectal cancer, eight of which are novel to this study. Additionally, we perform so-called 'SNP-set' tests of association and identify two sets of variants that implicate both common and rare variants in the etiology of colorectal cancer. Here we present a large-scale targeted re-sequencing resource focusing on genomic regions implicated in colorectal cancer susceptibility previously identified in several GWAS, which aims to 1) provide fine-scale targeted sequencing data for fine-mapping and 2) provide data resources to address methodological questions regarding the design of sequencing-based follow-up studies to GWAS. Additionally, we show that this strategy successfully identifies novel variants associated with colorectal cancer susceptibility and can implicate both common and rare variants.
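A marginal single-variant test of the kind used above is, in its simplest form, a Pearson chi-square test on a 2x2 table of allele counts in cases versus controls. The sketch below is a generic allelic test for illustration, not the study's exact model (which likely adjusted for covariates):

```python
def allelic_chi2(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Pearson chi-square statistic for a 2x2 allele-count table
    (rows: case/control; columns: alternate/reference allele)."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    row_totals = [sum(r) for r in table]
    col_totals = [case_alt + ctrl_alt, case_ref + ctrl_ref]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# A balanced table gives 0; a strongly skewed table gives a large statistic.
balanced = allelic_chi2(50, 50, 50, 50)
skewed = allelic_chi2(80, 20, 20, 80)
```

With 1 degree of freedom, a statistic above roughly 3.84 corresponds to p < 0.05 before any multiple-testing correction; GWAS follow-up studies apply far stricter thresholds.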
Centrifuge impact cratering experiments: Scaling laws for non-porous targets
NASA Technical Reports Server (NTRS)
Schmidt, Robert M.
1987-01-01
This research is a continuation of an ongoing program whose objective is to perform experiments and to develop scaling relationships for large body impacts onto planetary surfaces. The development of the centrifuge technique has been pioneered by the present investigator and is used to provide experimental data for actual target materials of interest. With both powder and gas guns mounted on a rotor arm, it is possible to match various dimensionless similarity parameters, which have been shown to govern the behavior of large scale impacts. Current work is directed toward the determination of scaling estimates for nonporous targets. The results are presented in summary form.
European large-scale farmland investments and the land-water-energy-food nexus
NASA Astrophysics Data System (ADS)
Siciliano, Giuseppina; Rulli, Maria Cristina; D'Odorico, Paolo
2017-12-01
The escalating human demand for food, water, energy, fibres and minerals has resulted in increasing commercial pressures on land and water resources, which are partly reflected by the recent increase in transnational land investments. Studies have shown that many of the land-water issues associated with land acquisitions are directly related to the areas of energy and food production. This paper explores the land-water-energy-food nexus in relation to large-scale farmland investments pursued by investors from European countries. The analysis is based on a "resource assessment approach" which evaluates the linkages between land acquisitions for agricultural (including both energy and food production) and forestry purposes, and the availability of land and water in the target countries. To that end, the water appropriated by agricultural and forestry production is quantitatively assessed and its impact on water resource availability is analysed. The analysis is meant to provide useful information to investors from EU countries and policy makers on aspects of resource acquisition, scarcity, and access to promote responsible land investments in the target countries.
A robust close-range photogrammetric target extraction algorithm for size and type variant targets
NASA Astrophysics Data System (ADS)
Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert
2016-05-01
The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.
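The bimodal-partitioning idea above can be illustrated with a classic between-class-variance threshold (Otsu's method), which splits a bimodal intensity histogram at the intensity that best separates the two modes. This is a generic sketch of that principle, not the Photo-G program's actual algorithm:

```python
def otsu_threshold(pixels, levels=256):
    """Threshold maximizing between-class variance of a bimodal histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_b = 0          # background weight (pixel count below threshold)
    sum_b = 0.0      # background intensity sum
    best_var, threshold = -1.0, 0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        between = w_b * w_f * (mean_b - mean_f) ** 2
        if between > best_var:
            best_var, threshold = between, t
    return threshold

# Dark photogrammetric targets (~10) on a bright background (~200) split cleanly:
pixels = [10] * 400 + [200] * 100
t = otsu_threshold(pixels)
```

Because the split depends only on the shape of the intensity distribution, a threshold of this kind is invariant to target position and orientation, consistent with the invariances the abstract reports.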
Shandas, Vivek; Voelkel, Jackson; Rao, Meenakshi; George, Linda
2016-01-01
Reducing exposure to degraded air quality is essential for building healthy cities. Although air quality and population vary at fine spatial scales, current regulatory and public health frameworks assess human exposures using county- or city-scales. We build on a spatial analysis technique, dasymetric mapping, for allocating urban populations that, together with emerging fine-scale measurements of air pollution, addresses three objectives: (1) evaluate the role of spatial scale in estimating exposure; (2) identify urban communities that are disproportionately burdened by poor air quality; and (3) estimate reduction in mobile sources of pollutants due to local tree-planting efforts using nitrogen dioxide. Our results show a maximum value of 197% difference between cadastrally-informed dasymetric system (CIDS) and standard estimations of population exposure to degraded air quality for small spatial extent analyses, and a lack of substantial difference for large spatial extent analyses. These results provide the foundation for improving policies for managing air quality, and targeting mitigation efforts to address challenges of environmental justice. PMID:27527205
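Dasymetric mapping, as used above, redistributes a census unit's population across land-cover classes in proportion to each class's area and an assumed relative population density. A minimal sketch (the class names and density weights here are hypothetical, not the CIDS parameters):

```python
def dasymetric_allocate(unit_population, class_areas, density_weights):
    """Split one census unit's population across land-cover classes.

    class_areas     -- {class: area of that class within the unit}
    density_weights -- {class: assumed relative population density}
    """
    denom = sum(area * density_weights[c] for c, area in class_areas.items())
    return {c: unit_population * area * density_weights[c] / denom
            for c, area in class_areas.items()}

# Residential land absorbs most of the unit's population; parks absorb none:
alloc = dasymetric_allocate(
    1000,
    {"residential": 2.0, "industrial": 1.0, "park": 1.0},
    {"residential": 10.0, "industrial": 1.0, "park": 0.0},
)
```

Pairing the reallocated population with fine-scale pollution surfaces is what lets small-extent exposure estimates diverge (by up to the 197% the abstract reports) from estimates that spread population uniformly over the census unit.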
Harnessing the genome for characterization of GPCRs in cancer pathogenesis
Feigin, Michael E.
2014-01-01
G-protein coupled receptors (GPCRs) mediate numerous physiological processes and represent the targets for a vast array of therapeutics for diseases ranging from depression to hypertension to reflux. Despite the recognition that GPCRs can act as oncogenes and tumor suppressors by regulating oncogenic signaling networks, few drugs targeting GPCRs are utilized in cancer therapy. Recent large-scale genome-wide analyses of multiple human tumors have uncovered novel GPCRs altered in cancer. However, the work of determining which GPCRs from these lists are drivers of tumorigenesis, and hence valid therapeutic targets, remains a formidable challenge. In this review I will highlight recent studies providing evidence that GPCRs are relevant targets for cancer therapy through their effects on known cancer signaling pathways, tumor progression, invasion and metastasis, and the microenvironment. Furthermore, I will explore how genomic analysis is beginning to shine a light on GPCRs as therapeutic targets in the age of personalized medicine. PMID:23927072
Targeting PTPRK-RSPO3 colon tumours promotes differentiation and loss of stem-cell function.
Storm, Elaine E; Durinck, Steffen; de Sousa e Melo, Felipe; Tremayne, Jarrod; Kljavin, Noelyn; Tan, Christine; Ye, Xiaofen; Chiu, Cecilia; Pham, Thinh; Hongo, Jo-Anne; Bainbridge, Travis; Firestein, Ron; Blackwood, Elizabeth; Metcalfe, Ciara; Stawiski, Eric W; Yauch, Robert L; Wu, Yan; de Sauvage, Frederic J
2016-01-07
Colorectal cancer remains a major unmet medical need, prompting large-scale genomics efforts in the field to identify molecular drivers for which targeted therapies might be developed. We previously reported the identification of recurrent translocations in R-spondin genes present in a subset of colorectal tumours. Here we show that targeting RSPO3 in PTPRK-RSPO3-fusion-positive human tumour xenografts inhibits tumour growth and promotes differentiation. Notably, genes expressed in the stem-cell compartment of the intestine were among those most sensitive to anti-RSPO3 treatment. This observation, combined with functional assays, suggests that a stem-cell compartment drives PTPRK-RSPO3 colorectal tumour growth and indicates that the therapeutic targeting of stem-cell properties within tumours may be a clinically relevant approach for the treatment of colorectal tumours.
Mina, Michael J
2017-06-01
Interactions between pathogens and commensal microbes are major contributors to health and disease. Infectious diseases, however, are most often considered independent, viewed within a one-host, one-pathogen paradigm, and, by extension, the interventions used to treat and prevent them are measured and evaluated within this same paradigm. Vaccines, especially live vaccines, by stimulating immune responses or directly interacting with other microbes, can alter the environment in which they act, with effects that span across pathogen species. Live attenuated influenza vaccines, for example, while safe, increase upper respiratory tract bacterial carriage density of important human commensal pathogens like Streptococcus pneumoniae and Staphylococcus aureus. Further, by altering the ecological niche and dynamics of phylogenetically distinct microbes within the host, vaccines may unintentionally affect transmission of non-vaccine-targeted pathogens. Thus, vaccine effects may span across species and across scales, from the individual to the population level. In keeping with traditional vaccine herd effects that indirectly protect even unvaccinated individuals by reducing population prevalence of vaccine-targeted pathogens, we call these cross-species, cross-scale effects "generalized herd-effects". As opposed to traditional herd-effects, "generalized" relaxes the assumption that the effect occurs at the level of the vaccine-target pathogen, and "herd effect" implies, as usual, that the effects indirectly impact the population at large, including unvaccinated bystanders. Unlike traditional herd-effects that decrease population prevalence of the vaccine target, generalized herd-effects may decrease or increase prevalence of, and disease caused by, the off-target pathogen.
LAIV, for example, by increasing pneumococcal density in the upper respiratory tract of vaccine recipients, especially children, may increase pneumococcal transmission and prevalence, leading to excess pneumococcal invasive disease in the population, especially among the elderly and others most susceptible to pneumococcal disease. However, these effects may also be beneficial, for example the large reductions in all-cause mortality noted following measles vaccines. Here we discuss evidence for these novel vaccine effects and suggest that vaccine monitoring and evaluation programs should consider generalized herd effects to appreciate the full impacts of vaccines, beneficial or detrimental, across species and scales that are inevitably hiding in plain sight, affecting human health and disease. © 2017 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
How the amygdala affects emotional memory by altering brain network properties.
Hermans, Erno J; Battaglia, Francesco P; Atsak, Piray; de Voogd, Lycia D; Fernández, Guillén; Roozendaal, Benno
2014-07-01
The amygdala has long been known to play a key role in supporting memory for emotionally arousing experiences. For example, classical fear conditioning depends on neural plasticity within this anterior medial temporal lobe region. Beneficial effects of emotional arousal on memory, however, are not restricted to simple associative learning. Our recollection of emotional experiences often includes rich representations of, e.g., spatiotemporal context, visceral states, and stimulus-response associations. Critically, such memory features are known to bear heavily on regions elsewhere in the brain. These observations led to the modulation account of amygdala function, which postulates that amygdala activation enhances memory consolidation by facilitating neural plasticity and information storage processes in its target regions. Rodent work in past decades has identified the most important brain regions and neurochemical processes involved in these modulatory actions, and neuropsychological and neuroimaging work in humans has produced a large body of convergent data. Importantly, recent methodological developments make it increasingly realistic to monitor neural interactions underlying such modulatory effects as they unfold. For instance, functional connectivity network modeling in humans has demonstrated how information exchanges between the amygdala and specific target regions occur within the context of large-scale neural network interactions. Furthermore, electrophysiological and optogenetic techniques in rodents are beginning to make it possible to quantify and even manipulate such interactions with millisecond precision. In this paper we will discuss that these developments will likely lead to an updated view of the amygdala as a critical nexus within large-scale networks supporting different aspects of memory processing for emotionally arousing experiences. Copyright © 2014 Elsevier Inc. All rights reserved.
Sherman, Sean P; Pyle, April D
2013-01-01
Differentiated cells from human embryonic stem cells (hESCs) provide an unlimited source of cells for use in regenerative medicine. The recent derivation of human induced pluripotent cells (hiPSCs) provides a potential supply of pluripotent cells that avoid immune rejection and could provide patient-tailored therapy. In addition, the use of pluripotent cells for drug screening could enable routine toxicity testing and evaluation of underlying disease mechanisms. However, prior to establishment of patient specific cells for cell therapy it is important to understand the basic regulation of cell fate decisions in hESCs. One critical issue that hinders the use of these cells is the fact that hESCs survive poorly upon dissociation, which limits genetic manipulation because of poor cloning efficiency of individual hESCs, and hampers production of large-scale culture of hESCs. To address the problems associated with poor growth in culture and our lack of understanding of what regulates hESC signaling, we successfully developed a screening platform that allows for large scale screening for small molecules that regulate survival. In this work we developed the first large scale platform for hESC screening using laser scanning cytometry and were able to validate this platform by identifying the pro-survival molecule HA-1077. These small molecules provide targets for both improving our basic understanding of hESC survival as well as a tool to improve our ability to expand and genetically manipulate hESCs for use in regenerative applications.
NASA Astrophysics Data System (ADS)
Thorslund, J.; Jarsjo, J.; Destouni, G.
2017-12-01
The quality of freshwater resources is increasingly impacted by human activities. Humans also extensively change the structure of landscapes, which may alter natural hydrological processes. To manage and maintain freshwater of good quality, it is critical to understand how pollutants are released into, transported within, and transformed by the hydrological system. Some key scientific questions include: What are the net downstream impacts of pollutants across different hydroclimatic and human disturbance conditions, and on different scales? What are the functions within and between components of the landscape, such as wetlands, in mitigating pollutant load delivery to downstream recipients? We explore these questions by synthesizing results from several relevant case studies of intensely human-impacted hydrological systems. These case study sites have been specifically evaluated in terms of the net impact of human activities on pollutant input to the aquatic system, as well as flow-path distributions through wetlands as a potential ecosystem service of pollutant mitigation. Results show that although individual wetlands have high retention capacity, efficient net retention was not always achieved at the larger landscape scale. Evidence suggests that the function of wetlands as mitigation solutions to pollutant loads is largely controlled by large-scale parallel and circular flow paths, through which multiple wetlands are interconnected in the landscape. To achieve net mitigation effects at large scale, a large fraction of the polluted large-scale flows must be transported through multiple connected wetlands. Although such large-scale flow interactions are critical for assessing water pollution spreading and fate through the landscape, our synthesis shows a frequent lack of knowledge at such scales.
We suggest ways forward for addressing the mismatch between the large scales at which key pollutant pressures and water quality changes take place and the relatively small scale at which most studies and implementations are currently made. These suggestions can help bridge critical knowledge gaps, as needed for improving water quality predictions and mitigation solutions under human and environmental changes.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large-enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students not only to succeed in acquiring content, but also to practice important 21st-century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large-enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
Terai, Asuka; Nakagawa, Masanori
2007-08-01
The purpose of this paper is to construct a model that represents the human process of understanding metaphors, focusing specifically on similes of the form "an A like B". Generally speaking, human beings are able to generate and understand many sorts of metaphors. This study constructs the model on the basis of a probabilistic knowledge structure for concepts computed from a statistical analysis of a large-scale corpus. Consequently, the model is able to cover the many kinds of metaphors that human beings can generate. Moreover, the model implements the dynamic process of metaphor understanding by using a neural network with dynamic interactions. Finally, the validity of the model is confirmed by comparing model simulations with the results of a psychological experiment.
Human, vector and parasite Hsp90 proteins: A comparative bioinformatics analysis.
Faya, Ngonidzashe; Penkler, David L; Tastan Bishop, Özlem
2015-01-01
The treatment of protozoan parasitic diseases is challenging, and thus identification and analysis of new drug targets is important. Parasites survive within host organisms, and some need intermediate hosts to complete their life cycle. Changing host environment puts stress on parasites, and often adaptation is accompanied by the expression of large amounts of heat shock proteins (Hsps). Among Hsps, Hsp90 proteins play an important role in stress environments. Yet, there has been little computational research on Hsp90 proteins to analyze them comparatively as potential parasitic drug targets. Here, an attempt was made to gain detailed insights into the differences between host, vector and parasitic Hsp90 proteins by large-scale bioinformatics analysis. A total of 104 Hsp90 sequences were divided into three groups based on their cellular localizations; namely cytosolic, mitochondrial and endoplasmic reticulum (ER). Further, the parasitic proteins were divided according to the type of parasite (protozoa, helminth and ectoparasite). Primary sequence analysis, phylogenetic tree calculations, motif analysis and physicochemical properties of Hsp90 proteins suggested that despite the overall structural conservation of these proteins, parasitic Hsp90 proteins have unique features which differentiate them from human ones, thus encouraging the idea that protozoan Hsp90 proteins should be further analyzed as potential drug targets.
2015-10-24
[Fragmented record] To obtain a broader understanding of the effects of dichlorvos on liver metabolism, a genome-wide analysis of gene expression in zebrafish was performed (whole-genome transcript analysis per condition), and another set of fish was fixed for histological evaluation (n = 5/condition). We determined the target. The fragment also cites: The zebrafish reference genome sequence and its relationship to the human genome. Nature. 2013;496(7446):498-503; and ref. 21, Linney E, Upchurch L, Donerly S.
Mutagenesis and phenotyping resources in zebrafish for studying development and human disease
Varshney, Gaurav Kumar
2014-01-01
The zebrafish (Danio rerio) is an important model organism for studying development and human disease. The zebrafish has an excellent reference genome, and the functions of hundreds of genes have been tested using both forward and reverse genetic approaches. Recent years have seen an increasing number of large-scale mutagenesis projects, and the number of mutants or gene knockouts in zebrafish has increased rapidly, including, for the first time, conditional knockout technologies. In addition, targeted mutagenesis techniques such as zinc finger nucleases, transcription activator-like effector nucleases, and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) systems have all been shown to effectively target zebrafish genes, and the first germline homologous recombination has been reported, further expanding the utility and power of zebrafish genetics. Given this explosion of mutagenesis resources, it is now possible to perform systematic, high-throughput phenotype analysis of all zebrafish gene knockouts. PMID:24162064
2003-12-01
operations run the full gamut from large-scale, theater-wide combat, as witnessed in Operation Iraqi Freedom, to small-scale operations against terrorists, to operations
Chen, Chen; Liu, Xiaohui; Zheng, Weimin; Zhang, Lei; Yao, Jun; Yang, Pengyuan
2014-04-04
To completely annotate the human genome, the task of identifying and characterizing proteins that currently lack mass spectrometry (MS) evidence is inevitable and urgent. In this study, as the first effort to screen missing proteins at large scale, we developed an approach based on SDS-PAGE followed by liquid chromatography-multiple reaction monitoring (LC-MRM) for screening those missing proteins with only a single peptide hit in the previous liver proteome data set. Proteins extracted from normal human liver were separated by SDS-PAGE and digested in split gel slices, and the resulting digests were then subjected to LC-scheduled MRM analysis. The MRM assays were developed using synthesized crude peptides for the target peptides. In total, the expression of 57 target proteins was confirmed in normal human liver tissues from 185 MRM assays. Among the 57 confirmed one-hit wonders, 50 proteins are in the minimally redundant set of the PeptideAtlas database, and 7 proteins, involved in various biological processes, previously had no MS-based information at all. We conclude that our SDS-PAGE-MRM workflow can be a powerful approach to screen missing or poorly characterized proteins in different samples and to quantify them when detected. The MRM raw data have been uploaded to ISB/SRM Atlas/PASSEL (PXD000648).
Systematic Identification of Combinatorial Drivers and Targets in Cancer Cell Lines
Tabchy, Adel; Eltonsy, Nevine; Housman, David E.; Mills, Gordon B.
2013-01-01
There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance. PMID:23577104
Food security through large scale investments in agriculture
NASA Astrophysics Data System (ADS)
Rulli, M.; D'Odorico, P.
2013-12-01
Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s, Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food: the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which raise the crop yields of potentially highly productive agricultural lands by introducing more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large-scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large-scale land acquisitions.
We show that at gap closure up to about 290-470 million people could be fed by crops grown on this land, compared with the 200-300 million people who can be supported at current crop yields. These numbers raise some concern because many of the target countries exhibit high malnourishment levels. If used for domestic consumption, the crops harvested on the acquired land could ensure food security for the local populations.
Where to put things? Spatial land management to sustain biodiversity and economic returns
Expanding human population and economic growth have led to large-scale conversion of natural habitat to human-dominated landscapes with consequent large-scale declines in biodiversity. Conserving biodiversity, while at the same time meeting expanding human needs, is an issue of u...
Scene-Aware Adaptive Updating for Visual Tracking via Correlation Filters
Zhang, Sirou; Qiao, Xiaoya
2017-01-01
In recent years, visual object tracking has been widely used in military guidance, human-computer interaction, road traffic, scene monitoring, and many other fields. Tracking algorithms based on correlation filters have shown good performance in terms of accuracy and tracking speed. However, their performance is not satisfactory in scenes with scale variation, deformation, and occlusion. In this paper, we propose a scene-aware adaptive updating mechanism for visual tracking via a kernel correlation filter (KCF). First, a low-complexity scale estimation method is presented, in which weighted responses at five candidate scales are used to determine the final target scale. Then, an adaptive updating mechanism is presented based on scene classification. We classify video scenes into four categories by video content analysis. According to the target scene, we exploit the adaptive updating mechanism to update the kernel correlation filter, improving the robustness of the tracker, especially in scenes with scale variation, deformation, and occlusion. We evaluate our tracker on the CVPR2013 benchmark. The results obtained with the proposed algorithm improve on those of the KCF tracker by 33.3%, 15%, 6%, 21.9%, and 19.8% in scenes with scale variation, partial or long-term large-area occlusion, deformation, fast motion, and out-of-view, respectively. PMID:29140311
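The five-scale step can be sketched as a weighted argmax over candidate scales: evaluate the filter's peak response at each scale, damp the non-unit scales slightly, and keep the best. A minimal sketch; the scale factors and weights below are illustrative assumptions, not the paper's values:

```python
def estimate_scale(response_peak,
                   scales=(0.95, 0.975, 1.0, 1.025, 1.05),
                   weights=(0.9, 0.95, 1.0, 0.95, 0.9)):
    """Pick the candidate scale with the highest weighted response.

    response_peak(s) -> peak value of the correlation-filter response when
    the search patch is resampled by factor s. Weights below 1.0 penalize
    scale changes, so the tracker keeps its current scale unless another
    scale's response is clearly stronger.
    """
    best_score, best_scale = max(
        (w * response_peak(s), s) for s, w in zip(scales, weights))
    return best_scale
```

Damping non-unit scales suppresses frame-to-frame scale jitter, at the cost of reacting one frame late to genuine scale changes.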
Living laboratory: whole-genome sequencing as a learning healthcare enterprise.
Angrist, M; Jamal, L
2015-04-01
With the proliferation of affordable large-scale human genomic data come profound and vexing questions about management of such data and their clinical uncertainty. These issues challenge the view that genomic research on human beings can (or should) be fully segregated from clinical genomics, either conceptually or practically. Here, we argue that the sharp distinction between clinical care and research is especially problematic in the context of large-scale genomic sequencing of people with suspected genetic conditions. Core goals of both enterprises (e.g. understanding genotype-phenotype relationships; generating an evidence base for genomic medicine) are more likely to be realized at a population scale if both those ordering and those undergoing sequencing for diagnostic reasons are routinely and longitudinally studied. Rather than relying on expensive and lengthy randomized clinical trials and meta-analyses, we propose leveraging nascent clinical-research hybrid frameworks into a broader, more permanent instantiation of exploratory medical sequencing. Such an investment could enlighten stakeholders about the real-life challenges posed by whole-genome sequencing, such as establishing the clinical actionability of genetic variants, returning 'off-target' results to families, developing effective service delivery models and monitoring long-term outcomes. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Three-dimensional hydrodynamic simulations of OMEGA implosions
NASA Astrophysics Data System (ADS)
Igumenshchev, I. V.; Michel, D. T.; Shah, R. C.; Campbell, E. M.; Epstein, R.; Forrest, C. J.; Glebov, V. Yu.; Goncharov, V. N.; Knauer, J. P.; Marshall, F. J.; McCrory, R. L.; Regan, S. P.; Sangster, T. C.; Stoeckl, C.; Schmitt, A. J.; Obenschain, S.
2017-05-01
The effects of large-scale (with Legendre modes ≲10) asymmetries in OMEGA direct-drive implosions caused by laser illumination nonuniformities (beam-power imbalance and beam mispointing and mistiming), target offset, and variation in target-layer thickness were investigated using the low-noise, three-dimensional Eulerian hydrodynamic code ASTER. Simulations indicate that these asymmetries can significantly degrade the implosion performance. The most important sources of the asymmetries are the target offsets (~10 to 20 μm), beam-power imbalance (σrms ~ 10%), and variations (~5%) in target-layer thickness. Large-scale asymmetries distort implosion cores, resulting in a reduced hot-spot confinement and an increased residual kinetic energy of implosion targets. The ion temperature inferred from the width of simulated neutron spectra is influenced by bulk fuel motion in the distorted hot spot and can result in up to a ~1-keV increase in apparent temperature. Similar temperature variations along different lines of sight are observed. Demonstrating hydrodynamic equivalence to ignition designs on OMEGA requires a reduction in large-scale target and laser-imposed nonuniformities, minimizing target offset, and employing highly efficient mid-adiabat (α = 4) implosion designs, which mitigate cross-beam energy transfer and suppress short-wavelength Rayleigh-Taylor growth.
NASA Astrophysics Data System (ADS)
Song, Z. N.; Sui, H. G.
2018-04-01
High-resolution remote sensing images carry important strategic information, especially for quickly finding time-sensitive targets such as airplanes, ships, and cars. Often the first problem is to rapidly judge whether a particular target is present anywhere in a large, arbitrary remote sensing image, rather than to detect it in a given image. Finding time-sensitive targets in a huge image poses two great challenges: 1) complex backgrounds lead to high miss and false-alarm rates when detecting tiny objects in large-scale images; 2) unlike traditional image retrieval, the task is not merely to compare the similarity of image blocks, but to quickly find specific targets in a huge image. This paper, taking airplanes as an example, presents an effective method for searching for aircraft targets in large-scale optical remote sensing images. First, an improved visual attention model that utilizes saliency detection and a line segment detector is used to quickly locate suspected regions in a large and complicated remote sensing image. Then, for each region, and without a region proposal step, a single neural network that predicts bounding boxes and class probabilities directly from full images in one evaluation is adopted to search for small airplane objects. Unlike sliding-window and region-proposal-based techniques, the network sees the entire image (region) during training and test time, so it implicitly encodes contextual information about classes as well as their appearance. Experimental results show that the proposed method quickly identifies airplanes in large-scale images.
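The saliency stage of such an attention model can be sketched with the classic spectral-residual method (my choice for illustration; the paper's improved visual attention model also adds a line segment detector):

```python
import numpy as np

def spectral_residual_saliency(img):
    """Spectral-residual saliency map (after Hou & Zhang, 2007).

    img: 2-D float array (grayscale). Bright pixels in the returned map mark
    candidate regions worth passing on to a detector.
    """
    f = np.fft.fft2(img)
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)
    # Local 3x3 box average of the log amplitude (edge-padded).
    h, w = img.shape
    p = np.pad(log_amp, 1, mode='edge')
    avg = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    residual = log_amp - avg           # the "spectral residual"
    sal = np.abs(np.fft.ifft2(np.exp(residual) * np.exp(1j * phase))) ** 2
    return sal / sal.max()             # normalize to [0, 1]
```

Thresholding the map and taking connected components would then yield the suspected regions handed to the detector.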
Cumulative Damage in Strength-Dominated Collisions of Rocky Asteroids: Rubble Piles and Brick Piles
NASA Technical Reports Server (NTRS)
Housen, Kevin
2009-01-01
Laboratory impact experiments were performed to investigate the conditions that produce large-scale damage in rock targets. Aluminum cylinders (6.3 mm diameter) impacted basalt cylinders (69 mm diameter) at speeds ranging from 0.7 to 2.0 km/s. Diagnostics included measurements of the largest fragment mass, velocities of the largest remnant and large fragments ejected from the periphery of the target, and X-ray computed tomography imaging to inspect some of the impacted targets for internal damage. Significant damage to the target occurred when the kinetic energy per unit target mass exceeded roughly 1/4 of the energy required for catastrophic shattering (where the target is reduced to one-half its original mass). Scaling laws based on a rate-dependent strength were developed that provide a basis for extrapolating the results to larger strength-dominated collisions. The threshold specific energy for widespread damage was found to scale with event size in the same manner as that for catastrophic shattering. Therefore, the factor of four difference between the two thresholds observed in the lab also applies to larger collisions. The scaling laws showed that for a sequence of collisions that are similar in that they produce the same ratio of largest fragment mass to original target mass, the fragment velocities decrease with increasing event size. As a result, rocky asteroids a couple hundred meters in diameter should retain their large ejecta fragments in a jumbled rubble-pile state. For somewhat larger bodies, the ejection velocities are sufficiently low that large fragments are essentially retained in place, possibly forming ordered "brick-pile" structures.
Stempler, Shiri; Yizhak, Keren; Ruppin, Eytan
2014-01-01
Accumulating evidence links numerous abnormalities in cerebral metabolism with the progression of Alzheimer's disease (AD), beginning in its early stages. Here, we integrate transcriptomic data from AD patients with a genome-scale computational human metabolic model to characterize the altered metabolism in AD, and employ state-of-the-art metabolic modelling methods to predict metabolic biomarkers and drug targets in AD. The metabolic descriptions derived are first tested and validated on a large scale against existing AD proteomics and metabolomics data. Our analysis shows a significant decrease in the activity of several key metabolic pathways, including the carnitine shuttle, folate metabolism and mitochondrial transport. We predict several metabolic biomarkers of AD progression in the blood and the CSF, including succinate and prostaglandin D2. Vitamin D and steroid metabolism pathways are enriched with predicted drug targets that could mitigate the metabolic alterations observed. Taken together, this study provides the first network-wide view of the metabolic alterations associated with AD progression. Most importantly, it offers a set of new metabolic leads for the diagnosis of AD and its treatment. PMID:25127241
Samoylenko, Anatoly; Hossain, Jubayer Al; Mennerich, Daniela; Kellokumpu, Sakari; Hiltunen, Jukka Kalervo
2013-01-01
Reactive oxygen species (ROS) exert various biological effects and contribute to signaling events during physiological and pathological processes. Enhanced levels of ROS are highly associated with different tumors, a Western lifestyle, and a nutritional regime. The supplementation of food with traditional antioxidants was shown to be protective against cancer in a number of studies both in vitro and in vivo. However, recent large-scale human trials in well-nourished populations did not confirm the beneficial role of antioxidants in cancer, whereas there is a well-established connection between longevity of several human populations and increased amount of antioxidants in their diets. Although our knowledge about ROS generators, ROS scavengers, and ROS signaling has improved, knowledge about the direct link between nutrition, ROS levels, and cancer is limited. These limitations are partly due to the lack of standardized reliable ROS measurement methods, easily usable biomarkers, knowledge of ROS action in cellular compartments, and individual genetic predispositions. The current review summarizes ROS formation due to nutrition with respect to macronutrients and antioxidant micronutrients in the context of cancer, and discusses signaling mechanisms, the biomarkers used, and their limitations, along with large-scale human trials. Antioxid. Redox Signal. 19, 2157–2196. PMID:23458328
Control of fluxes in metabolic networks
Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu
2016-01-01
Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. PMID:27197218
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mortensen, Holly M., E-mail: mortensen.holly@epa.gov; Euling, Susan Y.
Response to environmental chemicals can vary widely among individuals and between population groups. In human health risk assessment, data on susceptibility can be utilized by deriving risk levels based on a study of a susceptible population and/or an uncertainty factor may be applied to account for the lack of information about susceptibility. Defining genetic susceptibility in response to environmental chemicals across human populations is an area of interest in the NAS' new paradigm of toxicity pathway-based risk assessment. Data from high-throughput/high-content (HT/HC), including -omics (e.g., genomics, transcriptomics, proteomics, metabolomics) technologies, have been integral to the identification and characterization of drug target and disease loci, and have been successfully utilized to inform the mechanism of action for numerous environmental chemicals. Large-scale population genotyping studies may help to characterize levels of variability across human populations at identified target loci implicated in response to environmental chemicals. By combining mechanistic data for a given environmental chemical with next-generation sequencing data that provides human population variation information, one can begin to characterize differential susceptibility due to genetic variability to environmental chemicals within and across genetically heterogeneous human populations. The integration of such data sources will be informative to human health risk assessment.
Human Disease-Drug Network Based on Genomic Expression Profiles
Hu, Guanghui; Agarwal, Pankaj
2009-01-01
Background: Drug repositioning offers the possibility of faster development times and reduced risks in drug discovery. With the rapid development of high-throughput technologies and the ever-increasing accumulation of whole genome-level datasets, an increasing number of diseases and drugs can be comprehensively characterized by the changes they induce in gene expression, proteins, metabolites and phenotypes. Methodology/Principal Findings: We performed a systematic, large-scale analysis of genomic expression profiles of human diseases and drugs to create a disease-drug network. A network of 170,027 significant interactions was extracted from the ∼24.5 million comparisons between ∼7,000 publicly available transcriptomic profiles. The network includes 645 disease-disease, 5,008 disease-drug, and 164,374 drug-drug relationships. At least 60% of the disease-disease pairs were in the same disease area as determined by the Medical Subject Headings (MeSH) disease classification tree. The remaining pairs can drive a molecular-level nosology by discovering relationships between seemingly unrelated diseases, such as a connection between bipolar disorder and hereditary spastic paraplegia, and a connection between actinic keratosis and cancer. Among the 5,008 disease-drug links, connections with negative scores suggest new indications for existing drugs, such as the use of some antimalaria drugs for Crohn's disease, and a variety of existing drugs for Huntington's disease; while the positive-scoring connections can aid in drug side effect identification, such as tamoxifen's undesired carcinogenic property. From the ∼37K drug-drug relationships, we discover relationships that aid in target and pathway deconvolution, such as 1) KCNMA1 as a potential molecular target of lobeline, and 2) both apoptotic DNA fragmentation and G2/M DNA damage checkpoint regulation as potential pathway targets of daunorubicin.
Conclusions/Significance: We have automatically generated thousands of disease and drug expression profiles using GEO datasets, and constructed a large-scale disease-drug network for effective and efficient drug repositioning as well as drug target/pathway identification. PMID:19657382
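The scoring convention above (negative = drug reverses the disease signature, positive = drug mimics it) can be illustrated with a minimal signed-overlap score between up- and down-regulated gene sets; this is a simplification for illustration, not the study's actual statistic:

```python
def signature_score(disease_up, disease_down, drug_up, drug_down):
    """Signed overlap of a disease and a drug expression signature.

    Arguments are sets of gene IDs. Positive: the drug mimics the disease
    signature (potential side-effect flag). Negative: the drug reverses it
    (repositioning candidate).
    """
    mimic = len(disease_up & drug_up) + len(disease_down & drug_down)
    reverse = len(disease_up & drug_down) + len(disease_down & drug_up)
    n = len(disease_up | disease_down) or 1   # avoid division by zero
    return (mimic - reverse) / n
```

A drug that up-regulates exactly the genes a disease down-regulates (and vice versa) scores -1, the strongest repositioning signal under this toy metric.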
El Bali, Latifa; Diman, Aurélie; Bernard, Alfred; Roosens, Nancy H. C.; De Keersmaecker, Sigrid C. J.
2014-01-01
Human genomic DNA extracted from urine could be an interesting tool for large-scale public health studies involving characterization of genetic variations or DNA biomarkers as a result of the simple and noninvasive collection method. These studies, involving many samples, require a rapid, easy, and standardized extraction protocol. Moreover, for practicability, there is a necessity to collect urine at a moment different from the first void and to store it appropriately until analysis. The present study compared seven commercial kits to select the most appropriate urinary human DNA extraction procedure for epidemiological studies. DNA yield has been determined using different quantification methods: two classical, i.e., NanoDrop and PicoGreen, and two species-specific real-time quantitative (q)PCR assays, as DNA extracted from urine contains, besides human, microbial DNA also, which largely contributes to the total DNA yield. In addition, the kits giving a good yield were also tested for the presence of PCR inhibitors. Further comparisons were performed regarding the sampling time and the storage conditions. Finally, as a proof-of-concept, an important gene related to smoking has been genotyped using the developed tools. We could select one well-performing kit for the human DNA extraction from urine suitable for molecular diagnostic real-time qPCR-based assays targeting genetic variations, applicable to large-scale studies. In addition, successful genotyping was possible using DNA extracted from urine stored at −20°C for several months, and an acceptable yield could also be obtained from urine collected at different moments during the day, which is particularly important for public health studies. PMID:25365790
Impact of US and Canadian precursor regulation on methamphetamine purity in the United States.
Cunningham, James K; Liu, Lon-Mu; Callaghan, Russell
2009-03-01
Reducing drug purity is a major, but largely unstudied, goal of drug suppression. This study examines whether US methamphetamine purity was impacted by the suppression policy of US and Canadian precursor chemical regulation. Design: autoregressive integrated moving average (ARIMA) intervention time-series analysis. Setting: continental United States and Hawaii (1985-May 2005). Interventions: US federal regulations targeting precursors, ephedrine and pseudoephedrine, in forms used by large-scale producers were implemented in November 1989, August 1995 and October 1997. US regulations targeting precursors in forms used by small-scale producers (e.g. over-the-counter medications) were implemented in October 1996 and October 2001. Canada implemented federal precursor regulations in January 2003 and July 2003 and an essential chemical (e.g. acetone) regulation in January 2004. Measurements: monthly median methamphetamine purity series. Findings: US regulations targeting large-scale producers were associated with purity declines of 16-67 points; those targeting small-scale producers had little or no impact. Canada's precursor regulations were associated with purity increases of 13-15 points, while its essential chemical regulation was associated with a 13-point decrease. Hawaii's purity was consistently high, and appeared to vary little with the 1990s/2000s regulations. Conclusions: US precursor regulations targeting large-scale producers were associated with substantial decreases in continental US methamphetamine purity, while regulations targeting over-the-counter medications had little or no impact. Canada's essential chemical regulation was also associated with a decrease in continental US purity. However, Canada's precursor regulations were associated with purity increases: these regulations may have impacted primarily producers of lower-quality methamphetamine, leaving higher-purity methamphetamine on the market by default.
Hawaii's well-known preference for 'ice' (high-purity methamphetamine) may have helped to constrain purity there to a high, attenuated range, possibly limiting its sensitivity to precursor regulation.
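The core of an intervention time-series analysis is a step regressor that switches on at the policy date; a minimal OLS version (ignoring the ARIMA error structure the study models, and using synthetic data rather than the study's series) can be sketched as:

```python
import numpy as np

def step_effect(series, t0):
    """Estimate the level shift at time t0: OLS of the series on a constant
    plus a step regressor 1[t >= t0]. Returns the step coefficient."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([np.ones(len(y)),
                         (np.arange(len(y)) >= t0).astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Synthetic monthly purity series: a 20-point drop at month 60 plus noise.
rng = np.random.default_rng(0)
months = np.arange(120)
purity = 80.0 - 20.0 * (months >= 60) + rng.normal(0, 2, 120)
eff = step_effect(purity, 60)   # close to the built-in -20
```

The full ARIMA machinery additionally models autocorrelated errors, which mainly corrects the standard errors attached to such an estimate.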
Reaching the global target to reduce stunting: an investment framework
Shekar, Meera; D’Alimonte, Mary R; Rogers, Hilary E; Eberwein, Julia Dayton; Akuoku, Jon Kweku; Pereira, Audrey; Soe-Lin, Shan; Hecht, Robert
2017-01-01
Abstract Childhood stunting, being short for one’s age, has life-long consequences for health, human capital and economic growth. Being stunted in early childhood is associated with slower cognitive development, reduced schooling attainment and adult incomes decreased by 5–53%. The World Health Assembly has endorsed global nutrition targets including one to reduce the number of stunted children under five by 40% by 2025. The target has been included in the Sustainable Development Goals (SDG target 2.2). This paper estimates the cost of achieving this target and develops scenarios for generating the necessary financing. We focus on a key intervention package for stunting (KIPS) with strong evidence of effectiveness. Annual scale-up costs for the period of 2016–25 were estimated for a sample of 37 high burden countries and extrapolated to all low and middle income countries. The Lives Saved Tool was used to model the impact of the scale-up on stunting prevalence. We analysed data on KIPS budget allocations and expenditure by governments, donors and households to derive a global baseline financing estimate. We modelled two financing scenarios, a ‘business as usual’, which extends the current trends in domestic and international financing for nutrition through 2025, and another that proposes increases in financing from all sources under a set of burden-sharing rules. The 10-year financial need to scale up KIPS is US$49.5 billion. Under ‘business as usual’, this financial need is not met and the global stunting target is not reached. To reach the target, current financing will have to increase from US$2.6 billion to US$7.4 billion a year on average. Reaching the stunting target is feasible but will require large coordinated investments in KIPS and a supportive enabling environment. The example of HIV scale-up over 2001–11 is instructive in identifying the factors that could drive such a global response to childhood stunting. PMID:28453717
Poss, Zachary C; Ebmeier, Christopher C; Odell, Aaron T; Tangpeerachaikul, Anupong; Lee, Thomas; Pelish, Henry E; Shair, Matthew D; Dowell, Robin D; Old, William M; Taatjes, Dylan J
2016-04-12
Cortistatin A (CA) is a highly selective inhibitor of the Mediator kinases CDK8 and CDK19. Using CA, we now report a large-scale identification of Mediator kinase substrates in human cells (HCT116). We identified over 16,000 quantified phosphosites including 78 high-confidence Mediator kinase targets within 64 proteins, including DNA-binding transcription factors and proteins associated with chromatin, DNA repair, and RNA polymerase II. Although RNA-seq data correlated with Mediator kinase targets, the effects of CA on gene expression were limited and distinct from CDK8 or CDK19 knockdown. Quantitative proteome analyses, tracking around 7,000 proteins across six time points (0-24 hr), revealed that CA selectively affected pathways implicated in inflammation, growth, and metabolic regulation. Contrary to expectations, increased turnover of Mediator kinase targets was not generally observed. Collectively, these data support Mediator kinases as regulators of chromatin and RNA polymerase II activity and suggest their roles extend beyond transcription to metabolism and DNA repair. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Large-Scale Culture and Genetic Modification of Human Natural Killer Cells for Cellular Therapy.
Lapteva, Natalia; Parihar, Robin; Rollins, Lisa A; Gee, Adrian P; Rooney, Cliona M
2016-01-01
Recent advances in methods for the ex vivo expansion of human natural killer (NK) cells have facilitated the use of these powerful immune cells in clinical protocols. Further, the ability to genetically modify primary human NK cells following rapid expansion allows targeting and enhancement of their immune function. We have successfully adapted an expansion method for primary NK cells from peripheral blood mononuclear cells or from apheresis products in gas permeable rapid expansion devices (G-Rexes). Here, we describe an optimized protocol for rapid and robust NK cell expansion as well as a method for highly efficient retroviral transduction of these ex vivo expanded cells. These methodologies are good manufacturing practice (GMP) compliant and could be used for clinical-grade product manufacturing.
Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.
Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne
2018-01-01
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
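The directed-communication idea can be caricatured as a per-neuron routing table: each spike is sent only to the ranks that host its targets, so with sparse connectivity most ranks receive nothing. A toy sketch, not NEST's actual data structures:

```python
from collections import defaultdict

class SparseRouter:
    """Toy sketch of directed spike communication between compute nodes.

    Instead of broadcasting every spike to all ranks (an all-gather), each
    rank records, per local neuron, the set of ranks hosting its targets,
    and sends the spike only there.
    """
    def __init__(self):
        self.targets = defaultdict(set)   # local neuron id -> target ranks

    def connect(self, src_neuron, target_rank):
        self.targets[src_neuron].add(target_rank)

    def route(self, spikes):
        """spikes: iterable of local neuron ids that fired this step.
        Returns a dict rank -> list of spiking neuron ids to send."""
        outbox = defaultdict(list)
        for nid in spikes:
            for rank in self.targets[nid]:
                outbox[rank].append(nid)
        return dict(outbox)
```

At brain scale the per-neuron target-rank sets are tiny relative to the total rank count, which is exactly the sparsity such a scheme exploits.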
Pharmacological approaches to the challenge of treatment-resistant depression
Ionescu, Dawn F.; Rosenbaum, Jerrold F.; Alpert, Jonathan E.
2015-01-01
Although monoaminergic antidepressants revolutionized the treatment of Major Depressive Disorder (MDD) over a half-century ago, approximately one third of depressed patients experience treatment-resistant depression (TRD). Such patients account for a disproportionately large burden of disease, as evidenced by increased disability, cost, human suffering, and suicide. This review addresses the definition, causes, evaluation, and treatment of unipolar TRD, as well as the major treatment strategies, including optimization, augmentation, combination, and switch therapies. Evidence for these options, as outlined in this review, is mainly focused on large-scale trials or meta-analyses. Finally, we briefly review emerging targets for antidepressant drug discovery and the novel effects of rapidly acting antidepressants, with a focus on ketamine. PMID:26246787
Megabase sequencing of human genome by ordered-shotgun-sequencing (OSS) strategy
NASA Astrophysics Data System (ADS)
Chen, Ellson Y.
1997-05-01
So far, we have used the OSS strategy to sequence over 2 megabases of DNA in large-insert clones from regions of human X chromosomes with different characteristic levels of GC content. The method starts by randomly fragmenting a BAC, YAC or PAC to 8-12 kb pieces and subcloning those into lambda phage. Insert-ends of these clones are sequenced and overlapped to create a partial map. Complete sequencing is then done on a minimal tiling path of selected subclones, recursively focusing on those at the edges of contigs to facilitate mergers of clones across the entire target. To reduce manual labor, PCR processes have been adapted to prepare sequencing templates throughout the entire operation. The streamlined process can thus lend itself to further automation. The OSS approach is suitable for large-scale genomic sequencing, providing considerable flexibility in the choice of subclones or regions for more or less intensive sequencing. For example, subclones containing contaminating host cell DNA or cloning vector can be recognized and ignored with minimal sequencing effort; regions overlapping a neighboring clone already sequenced need not be redone; and segments containing tandem repeats or long repetitive sequences can be spotted early on and targeted for additional attention.
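The contig-building step of the OSS strategy (overlapping insert-end reads into a partial map) can be sketched with a naive greedy assembler. This is an illustrative Python toy assuming exact, error-free suffix/prefix overlaps; real assemblers must handle sequencing errors and repeats.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that equals a prefix of b."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def merge_reads(reads, min_len=3):
    """Greedy layout: repeatedly merge the pair of reads with the
    largest suffix/prefix overlap until no overlap >= min_len remains."""
    reads = list(reads)
    while True:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    k = overlap(a, b, min_len)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        if k == 0:
            return reads
        merged = reads[i] + reads[j][k:]
        reads = [r for n, r in enumerate(reads) if n not in (i, j)] + [merged]

# Three toy 10-mers from one region; greedy merging reconstructs it.
contigs = merge_reads(["ATTAGACCTG", "CCTGCCGGAA", "AGACCTGCCG"])
```

Recursively sequencing subclones at contig edges, as the abstract describes, corresponds to preferentially extending the reads that terminate a contig until neighbors merge.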
D'Aiuto, Leonardo; Zhi, Yun; Kumar Das, Dhanjit; Wilcox, Madeleine R; Johnson, Jon W; McClain, Lora; MacDonald, Matthew L; Di Maio, Roberto; Schurdak, Mark E; Piazza, Paolo; Viggiano, Luigi; Sweet, Robert; Kinchington, Paul R; Bhattacharjee, Ayantika G; Yolken, Robert; Nimgaonkar, Vishwajit L
2014-01-01
Induced pluripotent stem cell (iPSC)-based technologies offer an unprecedented opportunity to perform high-throughput screening of novel drugs for neurological and neurodegenerative diseases. Such screenings require a robust and scalable method for generating large numbers of mature, differentiated neuronal cells. Currently available methods based on differentiation of embryoid bodies (EBs) or directed differentiation of adherent culture systems are either expensive or are not scalable. We developed a protocol for large-scale generation of neuronal stem cells (NSCs)/early neural progenitor cells (eNPCs) and their differentiation into neurons. Our scalable protocol allows robust and cost-effective generation of NSCs/eNPCs from iPSCs. Following culture in neurobasal medium supplemented with B27 and BDNF, NSCs/eNPCs differentiate predominantly into vesicular glutamate transporter 1 (VGLUT1) positive neurons. Targeted mass spectrometry analysis demonstrates that iPSC-derived neurons express ligand-gated channels and other synaptic proteins and whole-cell patch-clamp experiments indicate that these channels are functional. The robust and cost-effective differentiation protocol described here for large-scale generation of NSCs/eNPCs and their differentiation into neurons paves the way for automated high-throughput screening of drugs for neurological and neurodegenerative diseases.
MetaRanker 2.0: a web server for prioritization of genetic variation data
Pers, Tune H.; Dworzyński, Piotr; Thomas, Cecilia Engel; Lage, Kasper; Brunak, Søren
2013-01-01
MetaRanker 2.0 is a web server for prioritization of common and rare frequency genetic variation data. Based on heterogeneous data sets including genetic association data, protein–protein interactions, large-scale text-mining data, copy number variation data and gene expression experiments, MetaRanker 2.0 prioritizes the protein-coding part of the human genome to shortlist candidate genes for targeted follow-up studies. MetaRanker 2.0 is made freely available at www.cbs.dtu.dk/services/MetaRanker-2.0. PMID:23703204
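One common way to combine heterogeneous evidence layers for gene prioritization, in the spirit of the MetaRanker approach (the server's actual scoring model may differ), is to convert each layer's gene scores into rank-based p-values and combine them with Fisher's method. A hedged Python sketch with toy scores and placeholder gene names:

```python
import math

def ranks_to_p(scores):
    """Convert one evidence layer's gene scores to rank-based
    p-values (higher score -> smaller p)."""
    order = sorted(scores, key=scores.get, reverse=True)
    n = len(order)
    return {g: (i + 1) / (n + 1) for i, g in enumerate(order)}

def fisher_combine(layers):
    """Combine per-layer p-values per gene with Fisher's method
    (-2 * sum of log p); genes missing from a layer get p = 1.
    Returns genes sorted by combined evidence, strongest first."""
    genes = set().union(*layers)
    combined = {}
    for g in genes:
        ps = [layer.get(g, 1.0) for layer in layers]
        combined[g] = -2 * sum(math.log(p) for p in ps)
    return sorted(genes, key=combined.get, reverse=True)

# Two toy layers, e.g. an association layer and an expression layer.
assoc = ranks_to_p({"TP53": 9.0, "BRCA2": 5.0, "GENE3": 1.0})
expr = ranks_to_p({"TP53": 2.0, "BRCA2": 1.0, "GENE3": 0.5})
shortlist = fisher_combine([assoc, expr])
```

A gene supported consistently across layers rises to the top of the shortlist, which is the behavior a prioritization server needs when feeding targeted follow-up studies.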
NASA Astrophysics Data System (ADS)
Hansen, A. L.; Donnelly, C.; Refsgaard, J. C.; Karlsson, I. B.
2018-01-01
This paper describes a modeling approach proposed to simulate the impact of local-scale, spatially targeted N-mitigation measures for the Baltic Sea Basin. Spatially targeted N-regulations aim at exploiting the considerable spatial differences in the natural N-reduction taking place in groundwater and surface water. While such measures can be simulated using local-scale physically-based catchment models, use of such detailed models for the 1.8 million km2 Baltic Sea basin is not feasible due to constraints on input data and computing power. Large-scale models that are able to simulate the Baltic Sea basin, on the other hand, do not have adequate spatial resolution to simulate some of the field-scale measures. Our methodology combines knowledge and results from two local-scale physically-based MIKE SHE catchment models, the large-scale and more conceptual E-HYPE model, and auxiliary data in order to enable E-HYPE to simulate how spatially targeted regulation of agricultural practices may affect N-loads to the Baltic Sea. We conclude that the use of E-HYPE with this upscaling methodology enables the simulation of the impact on N-loads of applying a spatially targeted regulation at the Baltic Sea basin scale to the correct order-of-magnitude. The E-HYPE model together with the upscaling methodology therefore provides a sound basis for large-scale policy analysis; however, we do not expect it to be sufficiently accurate to be useful for the detailed design of local-scale measures.
Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.
Teki, Sundeep; Kumar, Sukhbinder; Griffiths, Timothy D
2016-01-01
The human auditory system is adept at detecting sound sources of interest from a complex mixture of several other simultaneous sounds. The ability to selectively attend to the speech of one speaker whilst ignoring other speakers and background noise is of vital biological significance: the capacity to make sense of complex 'auditory scenes' is significantly impaired in aging populations as well as those with hearing loss. We investigated this problem by designing a synthetic signal, termed the 'stochastic figure-ground' stimulus, that captures essential aspects of complex sounds in the natural environment. Previously, we showed that under controlled laboratory conditions, young listeners sampled from the university subject pool (n = 10) performed very well in detecting targets embedded in the stochastic figure-ground signal. Here, we presented a modified version of this cocktail party paradigm as a 'game' featured in a smartphone app (The Great Brain Experiment) and obtained data from a large population with diverse demographic patterns (n = 5148). Despite differences in paradigms and experimental settings, the observed target-detection performance by users of the app was robust and consistent with our previous results from the psychophysical study. Our results highlight the potential use of smartphone apps in capturing robust large-scale auditory behavioral data from normal healthy volunteers, which can also be extended to study auditory deficits in clinical populations with hearing impairments and central auditory disorders.
Manoharan, Lokeshwaran; Kushwaha, Sandeep K.; Hedlund, Katarina; Ahrén, Dag
2015-01-01
Microbial enzyme diversity is a key to understand many ecosystem processes. Whole metagenome sequencing (WMG) obtains information on functional genes, but it is costly and inefficient due to the large amount of sequencing required. In this study, we have applied a captured metagenomics technique for functional genes in soil microorganisms, as an alternative to WMG. Large-scale targeting of functional genes, coding for enzymes related to organic matter degradation, was applied to two agricultural soil communities through captured metagenomics. Captured metagenomics uses custom-designed, hybridization-based oligonucleotide probes that enrich functional genes of interest in metagenomic libraries, where only probe-bound DNA fragments are sequenced. The captured metagenomes were highly enriched with targeted genes while maintaining their target diversity, and their taxonomic distribution correlated well with traditional ribosomal sequencing. The captured metagenomes were highly enriched with genes related to organic matter degradation; at least five times more than similar, publicly available soil WMG projects. This target enrichment technique also preserves the functional representation of the soils, thereby facilitating comparative metagenomics projects. Here, we present the first study that applies the captured metagenomics approach at a large scale, and this novel method allows deep investigation of central ecosystem processes by studying functional gene abundances. PMID:26490729
Chatterjee, Gourab; Singh, Prashant Kumar; Robinson, A P L; Blackman, D; Booth, N; Culfa, O; Dance, R J; Gizzi, L A; Gray, R J; Green, J S; Koester, P; Kumar, G Ravindra; Labate, L; Lad, Amit D; Lancaster, K L; Pasley, J; Woolsey, N C; Rajeev, P P
2017-08-21
The transport of hot, relativistic electrons produced by the interaction of an intense petawatt laser pulse with a solid has garnered interest due to its potential application in the development of innovative x-ray sources and ion-acceleration schemes. We report on spatially and temporally resolved measurements of megagauss magnetic fields at the rear of a 50-μm thick plastic target, irradiated by a multi-picosecond petawatt laser pulse at an incident intensity of ~10^20 W/cm^2. The pump-probe polarimetric measurements with micron-scale spatial resolution reveal the dynamics of the magnetic fields generated by the hot electron distribution at the target rear. An annular magnetic field profile was observed ~5 ps after the interaction, indicating a relatively smooth hot electron distribution at the rear-side of the plastic target. This is contrary to previous time-integrated measurements, which infer that such targets will produce highly structured hot electron transport. We measured large-scale filamentation of the hot electron distribution at the target rear only at later time-scales of ~10 ps, resulting in a commensurate large-scale filamentation of the magnetic field profile. Three-dimensional hybrid simulations corroborate our experimental observations and demonstrate a beam-like hot electron transport at initial time-scales that may be attributed to the local resistivity profile at the target rear.
How institutions shaped the last major evolutionary transition to large-scale human societies
Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent
2016-01-01
What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937
Experimental Simulations of Large-Scale Collisions
NASA Technical Reports Server (NTRS)
Housen, Kevin R.
2002-01-01
This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.
Lo, Yu-Chen; Senese, Silvia; Li, Chien-Ming; Hu, Qiyang; Huang, Yong; Damoiseaux, Robert; Torres, Jorge Z.
2015-01-01
Target identification is one of the most critical steps following cell-based phenotypic chemical screens aimed at identifying compounds with potential uses in cell biology and for developing novel disease therapies. Current in silico target identification methods, including chemical similarity database searches, are limited to single or sequential ligand analysis that have limited capabilities for accurate deconvolution of a large number of compounds with diverse chemical structures. Here, we present CSNAP (Chemical Similarity Network Analysis Pulldown), a new computational target identification method that utilizes chemical similarity networks for large-scale chemotype (consensus chemical pattern) recognition and drug target profiling. Our benchmark study showed that CSNAP can achieve an overall higher accuracy (>80%) of target prediction with respect to representative chemotypes in large (>200) compound sets, in comparison to the SEA approach (60–70%). Additionally, CSNAP is capable of integrating with biological knowledge-based databases (Uniprot, GO) and high-throughput biology platforms (proteomic, genetic, etc.) for system-wide drug target validation. To demonstrate the utility of the CSNAP approach, we combined CSNAP's target prediction with experimental ligand evaluation to identify the major mitotic targets of hit compounds from a cell-based chemical screen, and we highlight novel compounds targeting microtubules, an important cancer therapeutic target. The CSNAP method is freely available and can be accessed from the CSNAP web server (http://services.mbi.ucla.edu/CSNAP/). PMID:25826798
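The core of a chemical similarity network of the kind CSNAP builds can be sketched as follows: compounds are nodes, edges connect fingerprint pairs above a Tanimoto similarity threshold, and connected components approximate chemotypes. This is a minimal Python illustration with toy set-based fingerprints and a hypothetical threshold, not CSNAP's actual pipeline:

```python
def tanimoto(a, b):
    """Tanimoto similarity between two fingerprint bit sets."""
    return len(a & b) / len(a | b)

def chemotype_clusters(fps, threshold=0.5):
    """Build a chemical similarity network: nodes are compounds,
    edges connect pairs with Tanimoto >= threshold; connected
    components approximate chemotypes (consensus chemical patterns)."""
    names = list(fps)
    adj = {n: set() for n in names}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if tanimoto(fps[a], fps[b]) >= threshold:
                adj[a].add(b)
                adj[b].add(a)
    seen, comps = set(), []
    for n in names:                    # depth-first component sweep
        if n not in seen:
            stack, comp = [n], set()
            while stack:
                x = stack.pop()
                if x not in comp:
                    comp.add(x)
                    stack.extend(adj[x] - comp)
            seen |= comp
            comps.append(comp)
    return comps

# Toy fingerprints: c1 and c2 share features, c3 is unrelated.
fps = {"c1": {1, 2, 3}, "c2": {1, 2, 4}, "c3": {7, 8, 9}}
comps = chemotype_clusters(fps, threshold=0.5)
```

In a target-profiling setting, each component would then be annotated against compounds of known target, so that a whole chemotype, rather than a single ligand, votes on the likely target.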
Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays
NASA Astrophysics Data System (ADS)
Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko
The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produces overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset, with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real-time with, and was guided by, researchers in the remote visual analytics laboratory, swiftly sifting through incoming crowdsourced data to identify target locations flagged as viable archaeological sites.
Lim, Hansaim; Poleksic, Aleksandar; Yao, Yuan; Tong, Hanghang; He, Di; Zhuang, Luke; Meng, Patrick; Xie, Lei
2016-10-01
Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. The off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs, and providing new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one that we are proposing here or computationally too intensive, thereby limiting their capability for large-scale off-target identification. In addition, the performances of most machine learning based algorithms have been mainly evaluated to predict off-target interactions in the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in terms of detecting off-targets across gene families on a proteome scale. Here, we are presenting a fast and accurate off-target prediction method, REMAP, which is based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested in a reliable, extensive, and cross-gene family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable: it can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence. Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and side effect prediction. The software and benchmark are available at https://github.com/hansaimlim/REMAP.
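The one-class collaborative filtering idea behind REMAP can be sketched with weighted alternating least squares on a binary drug-target matrix: observed interactions get full weight, unobserved pairs a low weight (treated as weakly negative), and low-rank factors fill in likely off-targets. This simplified numpy sketch omits REMAP's dual regularization (the chemical- and protein-similarity terms), and all parameter values are illustrative:

```python
import numpy as np

def one_class_wals(R, rank=2, w_obs=1.0, w_miss=0.1, lam=0.1,
                   iters=50, seed=0):
    """Weighted ALS for one-class collaborative filtering: observed
    drug-target pairs (R == 1) carry weight w_obs, unobserved pairs a
    low weight w_miss. Returns the dense predicted score matrix."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = rng.normal(scale=0.1, size=(n, rank))   # drug factors
    V = rng.normal(scale=0.1, size=(m, rank))   # target factors
    W = np.where(R > 0, w_obs, w_miss)
    I = lam * np.eye(rank)
    for _ in range(iters):
        for i in range(n):                      # update drug factors
            Wi = np.diag(W[i])
            U[i] = np.linalg.solve(V.T @ Wi @ V + I, V.T @ Wi @ R[i])
        for j in range(m):                      # update target factors
            Wj = np.diag(W[:, j])
            V[j] = np.linalg.solve(U.T @ Wj @ U + I, U.T @ Wj @ R[:, j])
    return U @ V.T

# Toy block matrix: drugs 0-1 bind targets 0-1, drugs 2-3 bind
# targets 2-3; the true pair (1, 1) is held out as a zero.
R = np.array([[1, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)
P = one_class_wals(R)
```

The held-out in-block pair (1, 1) should receive a higher predicted score than cross-block pairs such as (1, 3), which is exactly the off-target ranking behavior the method relies on.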
A saliency-based approach to detection of infrared target
NASA Astrophysics Data System (ADS)
Chen, Yanfei; Sang, Nong; Dan, Zhiping
2013-10-01
Automatic target detection in infrared images is an active research area in defense technology. We propose a new saliency-based infrared target detection model in this paper, based on the fact that the human focus of attention is directed towards the relevant target, which conveys the most promising information. For a given image, the convolution of the image's log amplitude spectrum with a low-pass Gaussian kernel of an appropriate scale is equivalent to an image saliency detector in the frequency domain. At the same time, extracted orientation and shape features are combined into a saliency map in the spatial domain. Our proposed model detects salient targets from a final saliency map, generated by integrating the saliency maps from the frequency and spatial domains. Finally, the size of each salient target is obtained by maximizing the entropy of the final saliency map. Experimental results show that the proposed model can highlight both small and large salient regions in infrared images, as well as suppress repeated distractors in cluttered images. In addition, its detection efficiency is significantly improved.
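The frequency-domain branch described above, smoothing the log amplitude spectrum with a low-pass Gaussian while preserving phase, can be sketched in a few lines of numpy. This is an illustrative reading of that idea, not the authors' exact formulation (their model additionally fuses spatial-domain orientation/shape maps and an entropy-based size estimate), and the kernel scale is an arbitrary choice:

```python
import numpy as np

def gaussian_kernel(h, w, sigma):
    """Centered 2-D Gaussian kernel, normalized to sum to 1."""
    y, x = np.mgrid[:h, :w]
    g = np.exp(-(((y - (h - 1) / 2) ** 2 + (x - (w - 1) / 2) ** 2)
                 / (2 * sigma ** 2)))
    return g / g.sum()

def spectral_saliency(img, sigma=3.0):
    """Smooth the log-amplitude spectrum with a low-pass Gaussian,
    keep the original phase, and invert the transform; the squared
    magnitude highlights spectrally atypical (salient) regions."""
    F = np.fft.fft2(img)
    log_amp = np.log1p(np.abs(F))
    phase = np.angle(F)
    k = gaussian_kernel(*img.shape, sigma)
    # Circular convolution of the log-amplitude spectrum with the kernel.
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(log_amp)
                                    * np.fft.fft2(np.fft.ifftshift(k))))
    sal = np.abs(np.fft.ifft2(np.expm1(smoothed) * np.exp(1j * phase))) ** 2
    return sal / sal.max()

# Toy infrared scene: a small bright target on a dark background.
img = np.zeros((32, 32))
img[10:14, 10:14] = 1.0
sal = spectral_saliency(img)
```

Because the phase spectrum is left untouched, the reconstruction stays spatially anchored to the original scene while amplitude smoothing suppresses repeated background structure.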
Walter, Vonn; Patel, Nirali M.; Eberhard, David A.; Hayward, Michele C.; Salazar, Ashley H.; Jo, Heejoon; Soloway, Matthew G.; Wilkerson, Matthew D.; Parker, Joel S.; Yin, Xiaoying; Zhang, Guosheng; Siegel, Marni B.; Rosson, Gary B.; Earp, H. Shelton; Sharpless, Norman E.; Gulley, Margaret L.; Weck, Karen E.
2015-01-01
The recent FDA approval of the MiSeqDx platform provides a unique opportunity to develop targeted next generation sequencing (NGS) panels for human disease, including cancer. We have developed a scalable, targeted panel-based assay termed UNCseq, which involves a NGS panel of over 200 cancer-associated genes and a standardized downstream bioinformatics pipeline for detection of single nucleotide variations (SNV) as well as small insertions and deletions (indel). In addition, we developed a novel algorithm, NGScopy, designed for samples with sparse sequencing coverage to detect large-scale copy number variations (CNV), similar to human SNP Array 6.0 as well as small-scale intragenic CNV. Overall, we applied this assay to 100 snap-frozen lung cancer specimens lacking same-patient germline DNA (07–0120 tissue cohort) and validated our results against Sanger sequencing, SNP Array, and our recently published integrated DNA-seq/RNA-seq assay, UNCqeR, where RNA-seq of same-patient tumor specimens confirmed SNV detected by DNA-seq, if RNA-seq coverage depth was adequate. In addition, we applied the UNCseq assay on an independent lung cancer tumor tissue collection with available same-patient germline DNA (11–1115 tissue cohort) and confirmed mutations using assays performed in a CLIA-certified laboratory. We conclude that UNCseq can identify SNV, indel, and CNV in tumor specimens lacking germline DNA in a cost-efficient fashion. PMID:26076459
Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.
Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina
2014-01-01
Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels. Their effect on food security is to reduce humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon deemed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87% of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally with implications for food security, poverty levels and urbanization. While it is important to note that our study incorporates a number of assumptions and limitations, it provides a much needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.
The future of management: The NASA paradigm
NASA Technical Reports Server (NTRS)
Harris, Philip R.
1992-01-01
Prototypes of 21st century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just technological and managerial, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and for space-based programs with managers there. Large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.
Targeting channels and transporters in protozoan parasite infections
NASA Astrophysics Data System (ADS)
Meier, Anna; Erler, Holger; Beitz, Eric
2018-03-01
Infectious diseases caused by pathogenic protozoa are among the most significant causes of death in humans. Therapeutic options are scarce and massively challenged by the emergence of resistant parasite strains. Many of the current anti-parasite drugs target soluble enzymes, generate unspecific oxidative stress, or act by an unresolved mechanism within the parasite. In recent years, collections of drug-like compounds derived from large-scale phenotypic screenings, such as the malaria or pathogen box, have been made available to researchers free of charge, boosting the identification of novel promising targets. Remarkably, several of the compound hits have been found to inhibit membrane proteins at the periphery of the parasites, i.e. channels and transporters for ions and metabolites. In this review, we will focus on the progress made on targeting channels and transporters at different levels and the potential for use against infections with apicomplexan parasites, mainly Plasmodium spp. (malaria) and Toxoplasma gondii (toxoplasmosis), with kinetoplastids Trypanosoma brucei (sleeping sickness), Trypanosoma cruzi (Chagas disease) and Leishmania spp. (leishmaniasis), and the amoeba Entamoeba histolytica (amoebiasis).
Molecular inversion probe assay for allelic quantitation
Ji, Hanlee; Welch, Katrina
2010-01-01
Molecular inversion probe (MIP) technology has been demonstrated to be a robust platform for large-scale dual genotyping and copy number analysis. Applications in human genomic and genetic studies include the possibility of running dual germline genotyping and combined copy number variation ascertainment. MIPs analyze large numbers of specific genetic target sequences in parallel, relying on interrogation of a barcode tag, rather than direct hybridization of genomic DNA to an array. The MIP approach does not replace, but is complementary to many of the copy number technologies being performed today. Some specific advantages of MIP technology include: Less DNA required (37 ng vs. 250 ng), DNA quality less important, more dynamic range (amplifications detected up to copy number 60), allele specific information “cleaner” (less SNP crosstalk/contamination), and quality of markers better (fewer individual MIPs versus SNPs needed to identify copy number changes). MIPs can be considered a candidate gene (targeted whole genome) approach and can find specific areas of interest that otherwise may be missed with other methods. PMID:19488872
Xu, Deshun; Wu, Xiaofang; Han, Jiankang; Chen, Liping; Ji, Lei; Yan, Wei; Shen, Yuehua
2015-12-01
Vibrio parahaemolyticus is a marine seafood-borne pathogen that causes gastrointestinal disorders in humans. In this study, we developed a cross-priming amplification (CPA) assay coupled with vertical flow (VF) visualization for rapid and sensitive detection of V. parahaemolyticus. This assay correctly detected all target strains (n = 13) and none of the non-target strains (n = 27). Small concentrations of V. parahaemolyticus (1.8 CFU/mL for pure cultures and 18 CFU/g for reconstituted samples) were detected within 1 h. CPA-VF can be applied at a large scale and can be used to detect V. parahaemolyticus strains rapidly in seafood and environmental samples, being especially useful in the field.
Bidlingmaier, Scott; Su, Yang; Liu, Bin
2015-01-01
Using phage antibody display, large libraries can be generated and screened to identify monoclonal antibodies with affinity for target antigens. However, while library size and diversity is an advantage of the phage display method, there is limited ability to quantitatively enrich for specific binding properties such as affinity. One way of overcoming this limitation is to combine the scale of phage display selections with the flexibility and quantitativeness of FACS-based yeast surface display selections. In this chapter we describe protocols for generating yeast surface antibody display libraries using phage antibody display selection outputs as starting material and FACS-based enrichment of target antigen-binding clones from these libraries. These methods should be widely applicable for the identification of monoclonal antibodies with specific binding properties.
Segmental Duplications and Copy-Number Variation in the Human Genome
Sharp, Andrew J.; Locke, Devin P.; McGrath, Sean D.; Cheng, Ze; Bailey, Jeffrey A.; Vallente, Rhea U.; Pertz, Lisa M.; Clark, Royden A.; Schwartz, Stuart; Segraves, Rick; Oseroff, Vanessa V.; Albertson, Donna G.; Pinkel, Daniel; Eichler, Evan E.
2005-01-01
The human genome contains numerous blocks of highly homologous duplicated sequence. This higher-order architecture provides a substrate for recombination and recurrent chromosomal rearrangement associated with genomic disease. However, an assessment of the role of segmental duplications in normal variation has not yet been made. On the basis of the duplication architecture of the human genome, we defined a set of 130 potential rearrangement hotspots and constructed a targeted bacterial artificial chromosome (BAC) microarray (with 2,194 BACs) to assess copy-number variation in these regions by array comparative genomic hybridization. Using our segmental duplication BAC microarray, we screened a panel of 47 normal individuals, who represented populations from four continents, and we identified 119 regions of copy-number polymorphism (CNP), 73 of which were previously unreported. We observed an equal frequency of duplications and deletions, as well as a 4-fold enrichment of CNPs within hotspot regions, compared with control BACs (P < .000001), which suggests that segmental duplications are a major catalyst of large-scale variation in the human genome. Importantly, segmental duplications themselves were also significantly enriched >4-fold within regions of CNP. Almost without exception, CNPs were not confined to a single population, suggesting that these either are recurrent events, having occurred independently in multiple founders, or were present in early human populations. Our study demonstrates that segmental duplications define hotspots of chromosomal rearrangement, likely acting as mediators of normal variation as well as genomic disease, and it suggests that the consideration of genomic architecture can significantly improve the ascertainment of large-scale rearrangements. 
Our specialized segmental duplication BAC microarray and associated database of structural polymorphisms will provide an important resource for the future characterization of human genomic disorders. PMID:15918152
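The hotspot-enrichment claim above is the kind of result a one-sided hypergeometric (Fisher-type) test supports. A minimal sketch with made-up counts chosen only to illustrate a roughly 4-fold enrichment; these are not the study's actual data:

```python
from math import comb

def hypergeom_tail(N, K, n, k):
    """One-sided P(X >= k): draw n regions from N BACs, K of which lie in hotspots."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical counts (illustrative only): 2194 BACs total, 200 in hotspot
# regions, 119 CNP regions of which 43 fall in hotspots.
N, K, n, k = 2194, 200, 119, 43
p = hypergeom_tail(N, K, n, k)
fold = (k / n) / (K / N)  # observed vs. expected hotspot fraction
print(f"fold enrichment ~ {fold:.1f}, one-sided P = {p:.2e}")
```

With these toy counts the enrichment comes out near 4-fold and the tail probability is vanishingly small, mirroring the qualitative shape of the reported P < .000001.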
Telecommunications technology and rural education in the United States
NASA Technical Reports Server (NTRS)
Perrine, J. R.
1975-01-01
The rural sector of the US is examined from the point of view of whether telecommunications technology can augment the development of rural education. Migratory farm workers and American Indians were the target groups which were examined as examples of groups with special needs in rural areas. The general rural population and the target groups were examined to identify problems and to ascertain specific educational needs. Educational projects utilizing telecommunications technology in target group settings were discussed. Large scale regional ATS-6 satellite-based experimental educational telecommunications projects were described. Costs and organizational factors were also examined for large scale rural telecommunications projects.
Sex differences in virtual navigation influenced by scale and navigation experience.
Padilla, Lace M; Creem-Regehr, Sarah H; Stefanucci, Jeanine K; Cashdan, Elizabeth A
2017-04-01
The Morris water maze is a spatial abilities test adapted from the animal spatial cognition literature and has been studied in the context of sex differences in humans. This is because its standard design, which manipulates proximal (close) and distal (far) cues, applies to human navigation. However, virtual Morris water mazes test navigation skills on a scale that is vastly smaller than natural human navigation. Many researchers have argued that navigating in large and small scales is fundamentally different, and small-scale navigation might not simulate natural human navigation. Other work has suggested that navigation experience could influence spatial skills. To address the question of how individual differences influence navigational abilities in differently scaled environments, we employed both a large- (146.4 m in diameter) and a traditional- (36.6 m in diameter) scaled virtual Morris water maze along with a novel measure of navigation experience (lifetime mobility). We found sex differences on the small maze in the distal cue condition only, but in both cue-conditions on the large maze. Also, individual differences in navigation experience modulated navigation performance on the virtual water maze, showing that higher mobility was related to better performance with proximal cues for only females on the small maze, but for both males and females on the large maze.
Comparison of human and algorithmic target detection in passive infrared imagery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Hutchinson, Meredith
2003-09-01
We have designed an experiment that compares the performance of human observers and a scale-insensitive target detection algorithm that uses pixel-level information for the detection of ground targets in passive infrared imagery. The test database contains targets near clutter whose detectability ranged from easy to very difficult. Results indicate that human observers detect more "easy-to-detect" targets, and with far fewer false alarms, than the algorithm. For "difficult-to-detect" targets, human and algorithm detection rates are considerably degraded, and algorithm false alarms become excessive. Analysis of detections as a function of observer confidence shows that algorithm confidence attribution does not correspond to human attribution and does not adequately correlate with correct detections. The best target detection score for any human observer was 84%, as compared to 55% for the algorithm at the same false alarm rate. At 81%, the maximum detection score for the algorithm, the same human observer had 6 false alarms per frame as compared to 29 for the algorithm. Detector ROC curves and observer-confidence analysis benchmark the algorithm and provide insights into algorithm deficiencies and possible paths to improvement.
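The trade-off the abstract reports, detection rate against false alarms per frame at a matched operating point, can be sketched as a threshold sweep over confidence-scored detections. The scores, labels, and frame counts below are hypothetical:

```python
# Hypothetical confidence-scored detections; label 1 = true target, 0 = clutter hit.
detections = [(0.95, 1), (0.90, 1), (0.85, 0), (0.80, 1), (0.70, 0),
              (0.60, 1), (0.55, 0), (0.40, 1), (0.30, 0), (0.20, 0)]
n_targets = 5   # total targets present across all frames (toy value)
n_frames = 2    # frames in the toy test set

def operating_point(threshold):
    """Detection rate and false alarms per frame at one confidence threshold."""
    hits = sum(1 for score, label in detections if score >= threshold and label == 1)
    fas = sum(1 for score, label in detections if score >= threshold and label == 0)
    return hits / n_targets, fas / n_frames

rate, fa = operating_point(0.5)
print(f"detection rate {rate:.0%}, {fa:.1f} false alarms/frame")
```

Sweeping the threshold traces out the ROC-style curve used in the paper to compare human and algorithm operating points.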
Control of fluxes in metabolic networks.
Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu
2016-07-01
Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. © 2016 Basler et al.; Published by Cold Spring Harbor Laboratory Press.
Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.
Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian
2014-07-01
We introduce a new dataset, Human3.6M, of 3.6 Million accurate 3D Human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher capacity, more complex models with our large dataset, is substantially vaster and should stimulate future research. The dataset together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.
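Pose-estimation results on datasets like Human3.6M are commonly summarized by mean per-joint position error (MPJPE), the average Euclidean distance between predicted and ground-truth joint positions. A minimal sketch with toy three-joint poses (real Human3.6M evaluations use the full joint set):

```python
import math

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance over joints."""
    assert len(pred) == len(gt)
    return sum(math.dist(p, g) for p, g in zip(pred, gt)) / len(pred)

# Toy 3-joint poses in millimetres; a full skeleton has many more joints.
gt = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (100.0, 200.0, 0.0)]
pred = [(3.0, 4.0, 0.0), (100.0, 0.0, 5.0), (100.0, 212.0, 0.0)]
print(f"MPJPE = {mpjpe(pred, gt):.1f} mm")
```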
Value-focused framework for defining landscape-scale conservation targets
Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.
2016-01-01
Conservation of natural resources can be challenging in a rapidly changing world and require collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance or minimize loss of natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can help with site-specific up to landscape-scale conservation planning. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and providing early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context, and also anticipate the benefits of multi-agency and -organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process, defining the conservation targets themselves, as well as developing management and monitoring strategies for evaluation of their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.
The influence of cognitive load on spatial search performance.
Longstaffe, Kate A; Hood, Bruce M; Gilchrist, Iain D
2014-01-01
During search, executive function enables individuals to direct attention to potential targets, remember locations visited, and inhibit distracting information. In the present study, we investigated these executive processes in large-scale search. In our tasks, participants searched a room containing an array of illuminated locations embedded in the floor. The participants' task was to press the switches at the illuminated locations on the floor so as to locate a target that changed color when pressed. The perceptual salience of the search locations was manipulated by having some locations flashing and some static. Participants were more likely to search at flashing locations, even when they were explicitly informed that the target was equally likely to be at any location. In large-scale search, attention was captured by the perceptual salience of the flashing lights, leading to a bias to explore these targets. Despite this failure of inhibition, participants were able to restrict returns to previously visited locations, a measure of spatial memory performance. Participants were more able to inhibit exploration to flashing locations when they were not required to remember which locations had previously been visited. A concurrent digit-span memory task further disrupted inhibition during search, as did a concurrent auditory attention task. These experiments extend a load theory of attention to large-scale search, which relies on egocentric representations of space. High cognitive load on working memory leads to increased distractor interference, providing evidence for distinct roles for the executive subprocesses of memory and inhibition during large-scale search.
NASA Astrophysics Data System (ADS)
Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei
2017-10-01
In high precision and large-scale coordinate measurement, one commonly used approach to determine the coordinate of a target point is utilizing the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance and miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report one design of OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg in vertical and 360 deg in horizontal. As the heart of our OSPD, the aspheric lens is designed in a geometric model and optimized by LightTools Software, which enables the reflection of a wide-angle incident light beam into the single-point photodiode. The performance of home-made OSPD is characterized with working distances from 1 to 13 m and further analyzed utilizing developed a geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensor, indoor global positioning system, and optical wireless communication systems.
Multiscale infrared and visible image fusion using gradient domain guided image filtering
NASA Astrophysics Data System (ADS)
Zhu, Jin; Jin, Weiqi; Li, Li; Han, Zhenghao; Wang, Xia
2018-03-01
For better surveillance with infrared and visible imaging, a novel hybrid multiscale decomposition fusion method using gradient domain guided image filtering (HMSD-GDGF) is proposed in this study. In this method, hybrid multiscale decomposition with guided image filtering and gradient domain guided image filtering is first applied to the source images; the weight maps at each scale are then obtained using a saliency-detection technique and filtering, with three different fusion rules applied at different scales. The three types of fusion rules are for the small-scale detail level, the large-scale detail level, and the base level. Finally, the target becomes more salient and can be more easily detected in the fusion result, with the detail information of the scene being fully displayed. Experimental comparisons with state-of-the-art fusion methods show that HMSD-GDGF has clear advantages in fidelity of salient information (including structural similarity, brightness, and contrast), preservation of edge features, and human visual perception. Therefore, visual effects can be improved by using the proposed HMSD-GDGF method.
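The base/detail idea behind such multiscale fusion schemes can be sketched in one dimension: a moving average stands in for the (gradient domain) guided filter, base layers are averaged, and the more salient detail is kept. This is a drastic simplification of HMSD-GDGF, for illustration only:

```python
def smooth(signal, radius=1):
    """Moving average: a crude stand-in for (gradient domain) guided filtering."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def fuse(sig_a, sig_b):
    """Two-scale fusion: average the base layers, keep the more salient detail."""
    base_a, base_b = smooth(sig_a), smooth(sig_b)
    det_a = [x - b for x, b in zip(sig_a, base_a)]
    det_b = [x - b for x, b in zip(sig_b, base_b)]
    base = [(a + b) / 2 for a, b in zip(base_a, base_b)]
    detail = [a if abs(a) >= abs(b) else b for a, b in zip(det_a, det_b)]
    return [b + d for b, d in zip(base, detail)]

ir = [0.1, 0.1, 0.9, 0.1, 0.1]   # hot target spike (toy "infrared" row)
vis = [0.5, 0.5, 0.5, 0.5, 0.5]  # flat background (toy "visible" row)
fused = fuse(ir, vis)
print(fused)
```

The spike from the infrared row survives in the fused output while the background level comes from both sources, which is the qualitative behavior the paper's fusion rules aim for.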
Reaching the global target to reduce stunting: an investment framework.
Shekar, Meera; Kakietek, Jakub; D'Alimonte, Mary R; Rogers, Hilary E; Eberwein, Julia Dayton; Akuoku, Jon Kweku; Pereira, Audrey; Soe-Lin, Shan; Hecht, Robert
2017-06-01
Childhood stunting, being short for one's age, has life-long consequences for health, human capital and economic growth. Being stunted in early childhood is associated with slower cognitive development, reduced schooling attainment and adult incomes decreased by 5-53%. The World Health Assembly has endorsed global nutrition targets including one to reduce the number of stunted children under five by 40% by 2025. The target has been included in the Sustainable Development Goals (SDG target 2.2). This paper estimates the cost of achieving this target and develops scenarios for generating the necessary financing. We focus on a key intervention package for stunting (KIPS) with strong evidence of effectiveness. Annual scale-up costs for the period of 2016-25 were estimated for a sample of 37 high burden countries and extrapolated to all low and middle income countries. The Lives Saved Tool was used to model the impact of the scale-up on stunting prevalence. We analysed data on KIPS budget allocations and expenditure by governments, donors and households to derive a global baseline financing estimate. We modelled two financing scenarios, a 'business as usual', which extends the current trends in domestic and international financing for nutrition through 2025, and another that proposes increases in financing from all sources under a set of burden-sharing rules. The 10-year financial need to scale up KIPS is US$49.5 billion. Under 'business as usual', this financial need is not met and the global stunting target is not reached. To reach the target, current financing will have to increase from US$2.6 billion to US$7.4 billion a year on average. Reaching the stunting target is feasible but will require large coordinated investments in KIPS and a supportive enabling environment. The example of HIV scale-up over 2001-11 is instructive in identifying the factors that could drive such a global response to childhood stunting. © The Author 2017. 
Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
NASA Technical Reports Server (NTRS)
Bodkin, Richard J.; Cheatwood, F. M.; Dillman, Robert A; Dinonno, John M.; Hughes, Stephen J.; Lucy, Melvin H.
2016-01-01
As HIAD technology progresses from the 3-m diameter experimental scale to as large as 20-m diameter for human Mars entry, the mass penalties of carrying compressed gas have led the HIAD team to research current state-of-the-art gas generator approaches. Summarized below are several technologies identified in this survey, along with some of the pros and cons with respect to supporting large-scale HIAD applications.
Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.
Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M
2017-02-02
Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.
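Median scaling, one of the baselines mixnorm is compared against, aligns each batch's per-metabolite median to the overall median. Note how values below detection (here `None`) are simply skipped rather than modeled, which is the batch-specific truncation weakness mixnorm addresses. A sketch with hypothetical log-abundances for a single metabolite:

```python
from statistics import median

def median_scale(batches):
    """Per-batch median scaling for one metabolite: shift each batch so its
    median (log) abundance matches the overall median. Truncated values
    (None, below detection) are skipped, not modeled."""
    all_values = [v for batch in batches for v in batch if v is not None]
    target = median(all_values)
    normalized = []
    for batch in batches:
        observed = [v for v in batch if v is not None]
        shift = target - median(observed)
        normalized.append([None if v is None else v + shift for v in batch])
    return normalized

# Hypothetical log-abundances; None = below the batch's detection threshold.
batch1 = [10.0, 10.5, 11.0, None]
batch2 = [12.0, 12.5, 13.0, 12.5]  # systematically higher: a batch effect
norm1, norm2 = median_scale([batch1, batch2])
print(norm1, norm2)
```

After scaling, the two batches share a common median, but the truncated observation carries no information into the shift estimate, illustrating why explicit truncation modeling helps.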
Real-life helping behaviours in North America: A genome-wide association approach
Fieder, Martin
2018-01-01
In humans, prosocial behaviour is essential for social functioning. Twin studies suggest this distinct human trait to be partly hardwired. In the last decade, research on the genetics of prosocial behaviour focused on neurotransmitters and neuropeptides, such as oxytocin, dopamine, and their respective pathways. Recent trends towards large-scale medical studies targeting the genetic basis of complex diseases such as Alzheimer's disease and schizophrenia pave the way for new directions in behavioural genetics as well. Based on data from 10,713 participants of the American Health and Retirement Study, we estimated the heritability of helping behaviour (the proportion of its total variance explained by 1.2 million single nucleotide polymorphisms) to be 11%. Both fixed models and mixed linear models identified rs11697300, an intergenic variant on chromosome 20, as a candidate variant moderating this particular helping behaviour. We assume that this so far undescribed region is worth further investigation in association with human prosocial behaviour. PMID:29324852
Bhoir, Siddhant; Shaik, Althaf; Thiruvenkatam, Vijay; Kirubakaran, Sivapriya
2018-03-19
Human Tousled-like kinases (TLKs) are highly conserved serine/threonine protein kinases responsible for cell proliferation, DNA repair, and genome surveillance. Their possible involvement in cancer via efficient DNA repair mechanisms has made them clinically relevant molecular targets for anticancer therapy. Innovative approaches in chemical biology have played a key role in validating the importance of kinases as molecular targets. However, the detailed understanding of the protein structure and the mechanisms of protein-drug interaction through biochemical and biophysical techniques demands a method for the production of an active protein of exceptional stability and purity on a large scale. We have designed a bacterial expression system to express and purify biologically active, wild-type Human Tousled-like Kinase 1B (hTLK1B) by co-expression with the protein phosphatase from bacteriophage λ. We have obtained remarkably high amounts of the soluble and homogeneously dephosphorylated form of biologically active hTLK1B with our unique, custom-built vector design strategy. The recombinant hTLK1B can be used for structural studies and may further facilitate the development of new TLK inhibitors for anticancer therapy using a structure-based drug design approach.
Ma, Yingfei; Madupu, Ramana; Karaoz, Ulas; Nossa, Carlos W.; Yang, Liying; Yooseph, Shibu; Yachimski, Patrick S.; Brodie, Eoin L.; Nelson, Karen E.
2014-01-01
Human papillomavirus (HPV) causes a number of neoplastic diseases in humans. Here, we show a complex normal HPV community in a cohort of 103 healthy human subjects, by metagenomics analysis of the shotgun sequencing data generated from the NIH Human Microbiome Project. The overall HPV prevalence was 68.9% and was highest in the skin (61.3%), followed by the vagina (41.5%), mouth (30%), and gut (17.3%). Of the 109 HPV types as well as additional unclassified types detected, most were undetectable by the widely used commercial kits targeting the vaginal/cervical HPV types. These HPVs likely represent true HPV infections rather than transitory exposure because of strong organ tropism and persistence of the same HPV types in repeat samples. Coexistence of multiple HPV types was found in 48.1% of the HPV-positive samples. Networking between HPV types, cooccurrence or exclusion, was detected in vaginal and skin samples. Large contigs assembled from short HPV reads were obtained from several samples, confirming their genuine HPV origin. This first large-scale survey of HPV using a shotgun sequencing approach yielded a comprehensive map of HPV infections among different body sites of healthy human subjects. IMPORTANCE: This nonbiased survey indicates that the HPV community in healthy humans is much more complex than previously defined by widely used kits that are target selective for only a few high- and low-risk HPV types for cervical cancer. The importance of nononcogenic viruses in a mixed HPV infection could lie in stimulating or inhibiting a coexisting oncogenic virus via viral interference or immune cross-reaction. Knowledge gained from this study will be helpful in guiding the design of future epidemiological and clinical studies to determine the impact of nononcogenic HPV types on the outcome of HPV infections. PMID:24522917
Study of multi-functional precision optical measuring system for large scale equipment
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi
2017-10-01
The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. Measuring geometric parameters such as size, attitude, and position therefore requires a measurement system that combines high precision, multiple functions, and portability. However, existing measuring instruments such as the laser tracker, the total station, and photogrammetry systems mostly serve a single function and require station moving, among other shortcomings. A laser tracker must work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. A total station is mainly used for outdoor surveying and mapping and can hardly achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range multi-point measurement, but its measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can both work by scanning the measurement path and measure a cooperative target by tracking. The system is based on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system, and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures high-accuracy measurement, and the two-dimensional angle measurement module provides precise angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale, high-end equipment.
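The core computation such a system performs is fixing a target point from one absolute distance and two angles. A minimal single-station sketch of the spherical-to-Cartesian conversion (real instruments add axis calibration and multi-station adjustment):

```python
import math

def polar_to_cartesian(distance, azimuth_deg, elevation_deg):
    """Single-station 3D point from an absolute distance, a horizontal
    (azimuth) angle, and a vertical (elevation) angle."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z

# A 10 m range reading at 30 deg azimuth, 45 deg elevation.
x, y, z = polar_to_cartesian(10.0, 30.0, 45.0)
print(f"x={x:.3f} m, y={y:.3f} m, z={z:.3f} m")
```

Coordinate accuracy then depends directly on the distance and angle modules described above, which is why both are singled out as key technologies.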
A Community Standard Format for the Representation of Protein Affinity Reagents*
Gloriam, David E.; Orchard, Sandra; Bertinetti, Daniela; Björling, Erik; Bongcam-Rudloff, Erik; Borrebaeck, Carl A. K.; Bourbeillon, Julie; Bradbury, Andrew R. M.; de Daruvar, Antoine; Dübel, Stefan; Frank, Ronald; Gibson, Toby J.; Gold, Larry; Haslam, Niall; Herberg, Friedrich W.; Hiltke, Tara; Hoheisel, Jörg D.; Kerrien, Samuel; Koegl, Manfred; Konthur, Zoltán; Korn, Bernhard; Landegren, Ulf; Montecchi-Palazzi, Luisa; Palcy, Sandrine; Rodriguez, Henry; Schweinsberg, Sonja; Sievert, Volker; Stoevesandt, Oda; Taussig, Michael J.; Ueffing, Marius; Uhlén, Mathias; van der Maarel, Silvère; Wingren, Christer; Woollard, Peter; Sherman, David J.; Hermjakob, Henning
2010-01-01
Protein affinity reagents (PARs), most commonly antibodies, are essential reagents for protein characterization in basic research, biotechnology, and diagnostics as well as the fastest growing class of therapeutics. Large numbers of PARs are available commercially; however, their quality is often uncertain. In addition, currently available PARs cover only a fraction of the human proteome, and their cost is prohibitive for proteome scale applications. This situation has triggered several initiatives involving large scale generation and validation of antibodies, for example the Swedish Human Protein Atlas and the German Antibody Factory. Antibodies targeting specific subproteomes are being pursued by members of Human Proteome Organisation (plasma and liver proteome projects) and the United States National Cancer Institute (cancer-associated antigens). ProteomeBinders, a European consortium, aims to set up a resource of consistently quality-controlled protein-binding reagents for the whole human proteome. An ultimate PAR database resource would allow consumers to visit one on-line warehouse and find all available affinity reagents from different providers together with documentation that facilitates easy comparison of their cost and quality. However, in contrast to, for example, nucleotide databases among which data are synchronized between the major data providers, current PAR producers, quality control centers, and commercial companies all use incompatible formats, hindering data exchange. Here we propose Proteomics Standards Initiative (PSI)-PAR as a global community standard format for the representation and exchange of protein affinity reagent data. The PSI-PAR format is maintained by the Human Proteome Organisation PSI and was developed within the context of ProteomeBinders by building on a mature proteomics standard format, PSI-molecular interaction, which is a widely accepted and established community standard for molecular interaction data. 
Further information and documentation are available on the PSI-PAR web site. PMID:19674966
Davis, Matthew L; Scott Gayzik, F
2016-10-01
Biofidelity response corridors developed from post-mortem human subjects are commonly used in the design and validation of anthropomorphic test devices and computational human body models (HBMs). Typically, corridors are derived from a diverse pool of biomechanical data and later normalized to a target body habitus. The objective of this study was to use morphed computational HBMs to compare the ability of various scaling techniques to scale response data from a reference to a target anthropometry. HBMs are ideally suited for this type of study since they uphold the assumptions of equal density and modulus that are implicit in scaling method development. In total, six scaling procedures were evaluated, four from the literature (equal-stress equal-velocity, ESEV, and three variations of impulse momentum) and two which are introduced in the paper (ESEV using a ratio of effective masses, ESEV-EffMass, and a kinetic energy approach). In total, 24 simulations were performed, representing both pendulum and full body impacts for three representative HBMs. These simulations were quantitatively compared using the International Organization for Standardization (ISO) ISO-TS18571 standard. Based on these results, ESEV-EffMass achieved the highest overall similarity score (indicating that it is most proficient at scaling a reference response to a target). Additionally, ESEV was found to perform poorly for two degree-of-freedom (DOF) systems. However, the results also indicated that no single technique was clearly the most appropriate for all scenarios.
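Under the equal-density, equal-modulus assumptions the study notes, ESEV reduces to a single geometric factor: the cube root of the mass ratio. A sketch of the commonly stated ESEV scale factors (the paper's variants, such as ESEV-EffMass, modify this basic scheme):

```python
def esev_factors(m_ref, m_target):
    """Equal-stress equal-velocity scale factors from a reference to a target mass.
    With equal density and modulus, geometry scales as the cube root of the
    mass ratio; this is a common statement of the method, not the paper's
    ESEV-EffMass or impulse-momentum variants."""
    lam = (m_target / m_ref) ** (1.0 / 3.0)
    return {
        "length": lam,            # characteristic dimensions and deflections
        "time": lam,              # equal velocity: time scales with length
        "acceleration": 1.0 / lam,
        "force": lam ** 2,        # equal stress: force scales with area
    }

# Scale a 76 kg reference response to a 100 kg target occupant.
factors = esev_factors(76.0, 100.0)
print({name: round(value, 3) for name, value in factors.items()})
```

Multiplying a reference force, deflection, or time history by the corresponding factor produces the scaled response that would then be compared against the target simulation.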
Neurogenomics and the role of a large mutational target on rapid behavioral change.
Stanley, Craig E; Kulathinal, Rob J
2016-11-08
Behavior, while complex and dynamic, is among the most diverse, derived, and rapidly evolving traits in animals. The highly labile nature of heritable behavioral change is observed in such evolutionary phenomena as the emergence of converged behaviors in domesticated animals, the rapid evolution of preferences, and the routine development of ethological isolation between diverging populations and species. In fact, it is believed that nervous system development and its potential to evolve a seemingly infinite array of behavioral innovations played a major role in the successful diversification of metazoans, including our own human lineage. However, unlike other rapidly evolving functional systems such as sperm-egg interactions and immune defense, the genetic basis of rapid behavioral change remains elusive. Here we propose that the rapid divergence and widespread novelty of innate and adaptive behavior is primarily a function of its genomic architecture. Specifically, we hypothesize that the broad diversity of behavioral phenotypes present at micro- and macroevolutionary scales is promoted by a disproportionately large mutational target of neurogenic genes. We present evidence that these large neuro-behavioral targets are significant and ubiquitous in animal genomes and suggest that behavior's novelty and rapid emergence are driven by a number of factors including more selection on a larger pool of variants, a greater role of phenotypic plasticity, and/or unique molecular features present in large genes. We briefly discuss the origins of these large neurogenic genes, as they relate to the remarkable diversity of metazoan behaviors, and highlight key consequences on both behavioral traits and neurogenic disease across, respectively, evolutionary and ontogenetic time scales. Current approaches to studying the genetic mechanisms underlying rapid phenotypic change primarily focus on identifying signatures of Darwinian selection in protein-coding regions. 
In contrast, the large mutational target hypothesis places genomic architecture and a larger allelic pool at the forefront of rapid evolutionary change, particularly in genetic systems that are polygenic and regulatory in nature. Genomic data from brain and neural tissues in mammals as well as a preliminary survey of neurogenic genes from comparative genomic data support this hypothesis while rejecting both positive and relaxed selection on proteins or higher mutation rates. In mammals and invertebrates, neurogenic genes harbor larger protein-coding regions and possess a richer regulatory repertoire of miRNA targets and transcription factor binding sites. Overall, neurogenic genes cover a disproportionately large genomic fraction, providing a sizeable substrate for evolutionary, genetic, and molecular mechanisms to act upon. Readily available comparative and functional genomic data provide unexplored opportunities to test whether a distinct neurogenomic architecture can promote rapid behavioral change via several mechanisms unique to large genes, and which components of this large footprint are uniquely metazoan. The large mutational target hypothesis highlights the eminent roles of mutation and functional genomic architecture in generating rapid developmental and evolutionary change. It has broad implications on our understanding of the genetics of complex adaptive traits such as behavior by focusing on the importance of mutational input, from SNPs to alternative transcripts to transposable elements, on driving evolutionary rates of functional systems. Such functional divergence has important implications in promoting behavioral isolation across short- and long-term timescales. Due to genome-scaled polygenic adaptation, the large target effect also contributes to our inability to identify adapted behavioral candidate genes. 
The presence of large neurogenic genes, particularly in the mammalian brain and other neural tissues, further offers emerging insight into the etiology of neurodevelopmental and neurodegenerative diseases. The well-known correlation between neurological spectrum disorders in children and paternal age may simply be a direct result of aging fathers accumulating mutations across these large neurodevelopmental genes. The large mutational target hypothesis can also explain the rapid evolution of other functional systems covering a large genomic fraction such as male fertility and its preferential association with hybrid male sterility among closely related taxa. Overall, a focus on mutational potential may increase our power in understanding the genetic basis of complex phenotypes such as behavior while filling a general gap in understanding their evolution.
Lucas, Peter W; Philip, Swapna M; Al-Qeoud, Dareen; Al-Draihim, Nuha; Saji, Sreeja; van Casteren, Adam
2016-01-01
Mammalian enamel, the contact dental tissue, is something of an enigma. It is almost entirely made of hydroxyapatite, yet exhibits very different mechanical behavior from a homogeneous block of the same mineral. Recent approaches suggest that its hierarchical composite form, similar to other biological hard tissues, leads to a mechanical performance that depends very much on the scale of measurement. The stiffness of the material is predicted to be highest at the nanoscale, being sacrificed to produce a high toughness at the largest scale, that is, at the level of the tooth crown itself. Yet because virtually all this research has been conducted only on human (or sometimes "bovine") enamel, there has been little regard for structural variation of the tissue considered as an evolutionary adaptation to diet. What is mammalian enamel optimized for? We suggest that there are competing selective pressures. We suggest that the structural characteristics that optimize enamel to resist large-scale fractures, such as crown failures, are very different from those that resist wear (small-scale fracture). While enamel is always designed for damage tolerance, this may be suboptimal in the enamel of some species, including modern humans (which have been the target of most investigations), in order to counteract wear. The experimental part of this study introduces novel techniques that help to assess resistance at the nanoscale. © 2015 Wiley Periodicals, Inc.
Wang, He-Xing; Wang, Bin; Zhou, Ying; Jiang, Qing-Wu
2014-12-01
A rapid and sensitive method for the screening and selective quantification of antibiotics in urine by two-dimensional ultraperformance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry was developed. This method allowed the injection of 200 μL of urine extract. The 200-μL injection volume used in this method increased the absolute sensitivity for target antibiotics in solvent by an average of 13.3 times (range 8.4 to 28.5 times) compared with the conventional 10-μL injection volume. A 96-well solid phase extraction procedure was established to eliminate the contamination of the chromatographic column resulting from the large-volume injection and to increase the throughput of sample preparation. Fourteen target antibiotics from six common categories (β-lactams, quinolones, tetracyclines, macrolides, sulfonamides, and chloramphenicols) were selected as model compounds, and a database containing an additional 74 antibiotics was compiled for posttarget screening. The limit of detection of the target antibiotics, defined as a signal-to-noise ratio of 3, ranged from 0.04 to 1.99 ng/mL. The mean interday recoveries ranged between 79.6% and 121.3%, with a relative standard deviation of 2.9% to 18.3% at three spiking levels of 20 ng/mL, 50 ng/mL, and 100 ng/mL. This method was successfully applied to 60 real urine samples from schoolchildren aged 8-11 years, and four target antibiotics (azithromycin, sulfadiazine, trimethoprim, and oxytetracycline) and two posttarget antibiotics (sulfadimidine and cefaclor) were found in the urine samples. This method can be used as a large-scale biomonitoring tool for assessing exposure of the human population to antibiotics.
House, Thomas; Hall, Ian; Danon, Leon; Keeling, Matt J
2010-02-14
In the event of a release of a pathogen such as smallpox, which is human-to-human transmissible and has high associated mortality, a key question is how best to deploy containment and control strategies. Given the general uncertainty surrounding this issue, mathematical modelling has played an important role in informing the likely optimal response, in particular defining the conditions under which mass-vaccination would be appropriate. In this paper, we consider two key questions currently unanswered in the literature: firstly, what is the optimal spatial scale for intervention; and secondly, how sensitive are results to the modelling assumptions made about the pattern of human contacts? Here we develop a novel mathematical model for smallpox that incorporates both information on individual contact structure (which is important if the effects of contact tracing are to be captured accurately) and large-scale patterns of movement across a range of spatial scales in Great Britain. Analysis of this model confirms previous work suggesting that a locally targeted 'ring' vaccination strategy is optimal, and that this conclusion is actually quite robust for different socio-demographic and epidemiological assumptions. Our method allows for intuitive understanding of the reasons why national mass vaccination is typically predicted to be suboptimal. As such, we present a general framework for fast calculation of expected outcomes during the attempted control of diverse emerging infections; this is particularly important given that parameters would need to be interactively estimated and modelled in any release scenario.
Preparation of highly multiplexed small RNA sequencing libraries.
Persson, Helena; Søkilde, Rolf; Pirona, Anna Chiara; Rovira, Carlos
2017-08-01
MicroRNAs (miRNAs) are ~22-nucleotide-long small non-coding RNAs that regulate the expression of protein-coding genes by base pairing to partially complementary target sites, preferentially located in the 3´ untranslated region (UTR) of target mRNAs. The expression and function of miRNAs have been extensively studied in human disease, as well as the possibility of using these molecules as biomarkers for prognostication and treatment guidance. To identify and validate miRNAs as biomarkers, their expression must be screened in large collections of patient samples. Here, we develop a scalable protocol for the rapid and economical preparation of a large number of small RNA sequencing libraries using dual indexing for multiplexing. Combined with the use of off-the-shelf reagents, more samples can be sequenced simultaneously on large-scale sequencing platforms at a considerably lower cost per sample. Sample preparation is simplified by pooling libraries prior to gel purification, which allows for the selection of a narrow size range while minimizing sample variation. A comparison with publicly available data from benchmarking of miRNA analysis platforms showed that this method captures absolute and differential expression as effectively as commercially available alternatives.
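The dual-indexing idea behind this multiplexing protocol can be sketched as a simple lookup from index pairs to samples; this is a minimal illustration of demultiplexing in general, not the authors' protocol, and all index sequences and sample names below are invented:

```python
# Minimal sketch of dual-index demultiplexing, assuming exact index matches.
# Index sequences and sample names are hypothetical illustrations; real
# pipelines also tolerate mismatches and read indices from sequencer output.

SAMPLE_SHEET = {
    ("ACGTACGT", "TGCATGCA"): "sample_01",
    ("ACGTACGT", "GGAATTCC"): "sample_02",
    ("TTCCGGAA", "TGCATGCA"): "sample_03",
}

def demultiplex(reads):
    """Assign each read to a sample by its (i7, i5) index pair.

    `reads` is an iterable of (i7_index, i5_index, sequence) tuples;
    reads whose index pair is not in the sheet go to 'undetermined'.
    """
    bins = {}
    for i7, i5, seq in reads:
        sample = SAMPLE_SHEET.get((i7, i5), "undetermined")
        bins.setdefault(sample, []).append(seq)
    return bins

reads = [
    ("ACGTACGT", "TGCATGCA", "TTAGGC"),
    ("ACGTACGT", "GGAATTCC", "CCGATA"),
    ("AAAAAAAA", "CCCCCCCC", "GGGTTT"),
]
bins = demultiplex(reads)
```

Because two independent indexes are combined, n i7 and m i5 sequences distinguish up to n × m samples, which is what makes pooling libraries before gel purification practical.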
Sugahara, Daisuke; Kaji, Hiroyuki; Sugihara, Kazushi; Asano, Masahide; Narimatsu, Hisashi
2012-01-01
Model organisms containing a deletion or mutation in a glycosyltransferase gene exhibit various physiological abnormalities, suggesting that specific glycan motifs on certain proteins play important roles in vivo. Identification of the target proteins of glycosyltransferase isozymes is the key to understanding the roles of glycans. Here, we demonstrated the proteome-scale identification of the target proteins specific for a glycosyltransferase isozyme, β1,4-galactosyltransferase-I (β4GalT-I). Although β4GalT-I is the most characterized glycosyltransferase, its distinctive contribution to β1,4-galactosylation has hardly been described so far. We identified a large number of candidate target proteins specific to β4GalT-I by comparative analysis of β4GalT-I-deleted and wild-type mice using an LC/MS-based technique with isotope-coded glycosylation site-specific tagging (IGOT) of lectin-captured N-glycopeptides. Our approach to identifying target proteins on a proteome scale reveals common features and trends in the target proteins, which facilitate understanding of the mechanism that controls assembly of a particular glycan motif on specific proteins. PMID:23002422
Zarei, Najmeh; Vaziri, Behrouz; Shokrgozar, Mohammad Ali; Mahdian, Reza; Fazel, Ramin; Khalaj, Vahid
2014-12-01
Single-chain variable fragments (scFvs) have recently emerged as attractive candidates in targeted immunotherapy of various malignancies. The anti-CD22 scFv is able to target CD22 on the B cell surface and is considered a promising molecule in targeted immunotherapy of B cell malignancies. The recombinant anti-CD22 scFv has been successfully expressed in Escherichia coli; however, the insufficient production yield has been a major bottleneck for its therapeutic application. The methylotrophic yeast Pichia pastoris has become a highly popular expression host for the production of a wide variety of recombinant proteins, including antibody fragments. In this study, we used the Pichia expression system to express a humanized scFv antibody against CD22. The full-length humanized scFv gene was codon optimized, cloned into pPICZαA, and expressed in the GS115 strain. The maximum production level of the scFv (25 mg/L) was achieved at a methanol concentration of 1%, pH 6.0, an inoculum density of OD600 = 3, and an induction time of 72 h. The correlation between scFv gene dosage and expression level was also investigated by real-time PCR, and the results confirmed the presence of such a correlation up to five gene copies. Immunofluorescence and flow cytometry studies and Biacore analysis demonstrated binding to CD22 on the surface of the human lymphoid cell line Raji and to recombinant soluble CD22, respectively. Taken together, the presented data suggest that Pichia pastoris is an efficient host for the large-scale production of anti-CD22 scFv as a promising carrier for targeted drug delivery in the treatment of CD22(+) B cell malignancies.
EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.
The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...
Nakamura, Kenji; Hirayama-Kurogi, Mio; Ito, Shingo; Kuno, Takuya; Yoneyama, Toshihiro; Obuchi, Wataru; Terasaki, Tetsuya; Ohtsuki, Sumio
2016-08-01
The purpose of the present study was to examine simultaneously the absolute protein amounts of 152 membrane and membrane-associated proteins, including 30 metabolizing enzymes and 107 transporters, in pooled microsomal fractions of human liver, kidney, and intestine by means of SWATH-MS with stable isotope-labeled internal standard peptides, and to compare the results with those obtained by MRM/SRM and high resolution (HR)-MRM/PRM. The protein expression levels of 27 metabolizing enzymes, 54 transporters, and six other membrane proteins were quantitated by SWATH-MS; other targets were below the lower limits of quantitation. Most of the values determined by SWATH-MS differed by less than 50% from those obtained by MRM/SRM or HR-MRM/PRM. Various metabolizing enzymes were expressed more abundantly in liver microsomes than in other microsomes. Ten, thirteen, and eight transporters listed as important for drugs by the International Transporter Consortium were quantified in liver, kidney, and intestinal microsomes, respectively. Our results indicate that SWATH-MS enables large-scale multiplex absolute protein quantification while retaining quantitative capability similar to that of MRM/SRM or HR-MRM/PRM. SWATH-MS is expected to be a useful methodology in the context of drug development for elucidating the molecular mechanisms of drug absorption, metabolism, and excretion in the human body based on protein profile information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Highly multiplexed targeted proteomics using precise control of peptide retention time.
Gallien, Sebastien; Peterman, Scott; Kiyonami, Reiko; Souady, Jamal; Duriez, Elodie; Schoen, Alan; Domon, Bruno
2012-04-01
Large-scale proteomics applications using SRM analysis on triple quadrupole mass spectrometers present new challenges to LC-MS/MS experimental design. Despite the automation of building large-scale LC-SRM methods, the increased number of targeted peptides can compromise the balance between sensitivity and selectivity. To accommodate large target numbers, time-scheduled SRM transition acquisition is performed. Previously published results demonstrated that incorporation of a well-characterized set of synthetic peptides enables chromatographic characterization of the elution profile for most endogenous peptides. We have extended this application of peptide trainer kits not only to build SRM methods but also to facilitate real-time elution profile characterization that enables automated adjustment of the scheduled detection windows. Incorporation of dynamic retention time adjustment facilitates targeted assays lasting several days without the need for constant supervision. This paper provides an overview of how the dynamic retention correction approach identifies and corrects for commonly observed LC variations. This adjustment dramatically improves robustness in targeted discovery experiments as well as routine quantification experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
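The core of dynamic retention-time correction can be sketched as fitting a drift model from reference ("trainer") peptides and re-centring the scheduled windows; this is an illustrative linear-drift sketch under invented retention times and peptide names, not the vendor's actual algorithm:

```python
# Hedged sketch of dynamic retention-time window adjustment: reference
# peptides with known library RTs are observed in real time, a linear
# drift model observed = a * library + b is fitted by least squares, and
# scheduled detection windows for endogenous peptides are re-centred.
# All peptide names and RT values are illustrative.

def fit_drift(library_rt, observed_rt):
    """Least-squares fit of observed = a * library + b from references."""
    n = len(library_rt)
    mx = sum(library_rt) / n
    my = sum(observed_rt) / n
    sxx = sum((x - mx) ** 2 for x in library_rt)
    sxy = sum((x - mx) * (y - my) for x, y in zip(library_rt, observed_rt))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def adjust_windows(scheduled, a, b, width=2.0):
    """Re-centre each scheduled window around the drift-corrected RT."""
    return {pep: (a * rt + b - width / 2, a * rt + b + width / 2)
            for pep, rt in scheduled.items()}

# Reference peptides eluting ~0.5 min later than the library predicts:
a, b = fit_drift([10.0, 20.0, 30.0], [10.5, 20.5, 30.5])
windows = adjust_windows({"PEPTIDEK": 15.0}, a, b)
```

The design point is that shifting windows instead of widening them preserves the sensitivity/selectivity balance that scheduling was meant to protect.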
A Scalable Approach for Protein False Discovery Rate Estimation in Large Proteomic Data Sets.
Savitski, Mikhail M; Wilhelm, Mathias; Hahne, Hannes; Kuster, Bernhard; Bantscheff, Marcus
2015-09-01
Calculating the number of confidently identified proteins and estimating false discovery rate (FDR) is a challenge when analyzing very large proteomic data sets such as entire human proteomes. Biological and technical heterogeneity in proteomic experiments further adds to the challenge, and there are strong differences of opinion regarding the conceptual validity of a protein FDR and no consensus regarding the methodology for protein FDR determination. There are also limitations inherent to the widely used classic target-decoy strategy that become particularly apparent when analyzing very large data sets and lead to a strong over-representation of decoy identifications. In this study, we investigated the merits of the classic, as well as a novel target-decoy-based protein FDR estimation approach, taking advantage of a heterogeneous data collection comprised of ∼19,000 LC-MS/MS runs deposited in ProteomicsDB (https://www.proteomicsdb.org). The "picked" protein FDR approach treats target and decoy sequences of the same protein as a pair rather than as individual entities and chooses either the target or the decoy sequence depending on which receives the highest score. We investigated the performance of this approach in combination with q-value based peptide scoring to normalize sample-, instrument-, and search engine-specific differences. The "picked" target-decoy strategy performed best when protein scoring was based on the best peptide q-value for each protein, yielding a stable number of true positive protein identifications over a wide range of q-value thresholds. We show that this simple and unbiased strategy eliminates a conceptual issue in the commonly used "classic" protein FDR approach that causes overprediction of false-positive protein identifications in large data sets. 
The approach scales from small to very large data sets without losing performance, consistently increases the number of true-positive protein identifications and is readily implemented in proteomics analysis software. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
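The "picked" pairing logic described above fits in a few lines: each target/decoy pair contributes only its higher-scoring member, and FDR is then estimated as decoys over targets above a threshold. The scores, pairs, and threshold below are invented for illustration; real implementations derive protein scores from peptide q-values rather than arbitrary numbers:

```python
# Minimal sketch of the "picked" target-decoy protein FDR idea: for each
# target/decoy protein pair, keep only the higher-scoring member, then
# estimate FDR above a score threshold as decoys / targets.

def picked_fdr(pairs, threshold):
    """pairs: list of (target_score, decoy_score) per protein pair.

    Returns (n_targets, n_decoys, fdr_estimate) above the threshold.
    """
    targets = decoys = 0
    for t, d in pairs:
        score, is_target = (t, True) if t >= d else (d, False)  # pick winner
        if score >= threshold:
            if is_target:
                targets += 1
            else:
                decoys += 1
    fdr = decoys / targets if targets else 0.0
    return targets, decoys, fdr

# Three pairs won by the target, one by the decoy; all winners pass 5.0:
pairs = [(9.0, 1.0), (8.0, 2.0), (3.0, 7.0), (6.0, 5.0)]
t, d, fdr = picked_fdr(pairs, threshold=5.0)
```

Because a pair can never contribute both sequences, the decoy count cannot outgrow the target count as the data set grows, which is the over-representation problem the classic strategy suffers from.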
Causal Inferences with Large Scale Assessment Data: Using a Validity Framework
ERIC Educational Resources Information Center
Rutkowski, David; Delandshere, Ginette
2016-01-01
To answer the calls for stronger evidence by the policy community, educational researchers and their associated organizations increasingly demand more studies that can yield causal inferences. International large scale assessments (ILSAs) have been targeted as rich data sources for causal research. It is in this context that we take up a…
Reflections on the Increasing Relevance of Large-Scale Professional Development
ERIC Educational Resources Information Center
Krainer, Konrad
2015-01-01
This paper focuses on commonalities and differences of three approaches to large-scale professional development (PD) in mathematics education, based on two studies from Germany and one from the United States of America. All three initiatives break new ground in improving PD targeted at educating "multipliers", and in all three cases…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schanen, Michel; Marin, Oana; Zhang, Hong
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
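The binomial memory-checkpointing component mentioned above can be illustrated by the classic capacity bound from reversal theory: with s checkpoints and at most r forward recomputation sweeps, up to C(s + r, s) time steps can be reversed. This sketch only evaluates that bound; it is not the authors' two-level disk/memory scheme itself:

```python
# Sketch of the binomial checkpointing capacity bound: `snaps` in-memory
# checkpoints and at most `sweeps` recomputation sweeps suffice to reverse
# C(snaps + sweeps, snaps) time steps of an adjoint computation.
from math import comb

def max_reversible_steps(snaps, sweeps):
    """Max number of time steps reversible with the given budget."""
    return comb(snaps + sweeps, snaps)

def min_sweeps(snaps, steps):
    """Smallest number of recomputation sweeps needed for `steps` steps."""
    r = 0
    while max_reversible_steps(snaps, r) < steps:
        r += 1
    return r
```

For example, 3 checkpoints and 3 sweeps cover C(6, 3) = 20 steps, so the recomputation cost grows only logarithmically-like in the step count; the disk level in the paper's scheme then absorbs what memory alone cannot hold.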
NASA Astrophysics Data System (ADS)
Block, B.; Ferretti, F.; White, T.; De Leo, G.; Hazen, E. L.; Bograd, S. J.
2016-12-01
Anthropogenic impacts on marine predators have been examined within exclusive economic zones, but few data sets have enabled assessment of human fishing impacts on the high seas. By combining large electronic tagging databases archiving mobile predator movements (e.g. Tagging of Pacific Pelagics, TAG A Giant, Animal Telemetry Network) with global fishing catch and fishing effort data derived from satellite tracks of vessels on the high seas (AIS), a better understanding of human use and exploitation at a global scale can be obtained. This capacity to combine the movements of mobile ocean predators (tunas, sharks, billfishes) with analyses of their human predators' behaviors, via examination of global fishing fleet activities, is unprecedented, owing to the new access researchers are garnering to these big satellite-derived AIS databases. Global Fishing Watch is one example of such a data provider: it now makes accessible AIS data from the global community of maritime vessels and, together with researchers, has developed new algorithms that delineate distinct types of fishing vessel behaviors (longliner, purse seiner) and effort. When combined with satellite tagging data of mobile apex predators, oceanographic preferences, records of fishing fleet catches, targeted species, and the economic drivers of fisheries, new quantitative insights can be gained about the catch reporting of fleets and the pelagic species targeted at a global scale. Research communities can now also examine how humans behave on the high seas and potentially improve how fish stocks, such as tunas, billfishes, and sharks, are exploited. The capacity to gather information on diverse human fishing fleets and behaviors remotely should provide a wealth of new tools that can potentially be applied toward the resource management efforts surrounding these global fishing fleets. This type of information is essential for prioritizing regions of conservation concern for megafauna swimming in our oceans.
A network of epigenetic regulators guides developmental haematopoiesis in vivo.
Huang, Hsuan-Ting; Kathrein, Katie L; Barton, Abby; Gitlin, Zachary; Huang, Yue-Hua; Ward, Thomas P; Hofmann, Oliver; Dibiase, Anthony; Song, Anhua; Tyekucheva, Svitlana; Hide, Winston; Zhou, Yi; Zon, Leonard I
2013-12-01
The initiation of cellular programs is orchestrated by key transcription factors and chromatin regulators that activate or inhibit target gene expression. To generate a compendium of chromatin factors that establish the epigenetic code during developmental haematopoiesis, a large-scale reverse genetic screen was conducted targeting orthologues of 425 human chromatin factors in zebrafish. A set of chromatin regulators was identified that target different stages of primitive and definitive blood formation, including factors not previously implicated in haematopoiesis. We identified 15 factors that regulate development of primitive erythroid progenitors and 29 factors that regulate development of definitive haematopoietic stem and progenitor cells. These chromatin factors are associated with SWI/SNF and ISWI chromatin remodelling, SET1 methyltransferase, CBP-p300-HBO1-NuA4 acetyltransferase, HDAC-NuRD deacetylase, and Polycomb repressive complexes. Our work provides a comprehensive view of how specific chromatin factors and their associated complexes play a major role in the establishment of haematopoietic cells in vivo.
Steckel, Michael; Molina-Arcas, Miriam; Weigelt, Britta; Marani, Michaela; Warne, Patricia H; Kuznetsov, Hanna; Kelly, Gavin; Saunders, Becky; Howell, Michael; Downward, Julian; Hancock, David C
2012-01-01
Oncogenic mutations in RAS genes are very common in human cancer, resulting in cells with well-characterized selective advantages, but also less well-understood vulnerabilities. We have carried out a large-scale loss-of-function screen to identify genes that are required by KRAS-transformed colon cancer cells, but not by derivatives lacking this oncogene. Top-scoring genes were then tested in a larger panel of KRAS mutant and wild-type cancer cells. Cancer cells expressing oncogenic KRAS were found to be highly dependent on the transcription factor GATA2 and the DNA replication initiation regulator CDC6. Extending this analysis using a collection of drugs with known targets, we found that cancer cells with mutant KRAS showed selective addiction to proteasome function, as well as synthetic lethality with topoisomerase inhibition. Combination targeting of these functions caused improved killing of KRAS mutant cells relative to wild-type cells. These observations suggest novel targets and new ways of combining existing therapies for optimal effect in RAS mutant cancers, which are traditionally seen as being highly refractory to therapy. PMID:22613949
Karanth, Kota Ullas; Gopalaswamy, Arjun M.; Kumar, Narayanarao Samba; Vaidyanathan, Srinivas; Nichols, James D.; MacKenzie, Darryl I.
2011-01-01
1. Assessing spatial distributions of threatened large carnivores at landscape scales poses formidable challenges because of their rarity and elusiveness. As a consequence of logistical constraints, investigators typically rely on sign surveys. Most survey methods, however, do not explicitly address the central problem of imperfect detections of animal signs in the field, leading to underestimates of true habitat occupancy and distribution. 2. We assessed habitat occupancy for a tiger Panthera tigris metapopulation across a c. 38,000-km2 landscape in India, employing a spatially replicated survey to explicitly address imperfect detections. Ecological predictions about tiger presence were confronted with sign detection data generated from occupancy sampling of 205 sites, each of 188 km2. 3. A recent occupancy model that considers Markovian dependency among sign detections on spatial replicates performed better than the standard occupancy model (ΔAIC = 184.9). A formulation of this model that fitted the data best showed that density of ungulate prey and levels of human disturbance were key determinants of local tiger presence. Model averaging resulted in a replicate-level detection probability for signs of p̂ = 0.17 (0.17) and a tiger habitat occupancy estimate of ψ̂ = 0.665 (0.0857), or 14,076 (1,814) km2 of the 21,167 km2 of potential habitat. In contrast, a traditional presence-versus-absence approach underestimated occupancy by 47%. Maps of probabilities of local site occupancy clearly identified tiger source populations at higher densities and matched observed tiger density variations, suggesting their potential utility for population assessments at landscape scales. 4. Synthesis and applications. Landscape-scale sign surveys can efficiently assess large carnivore spatial distributions and elucidate the factors governing their local presence, provided ecological and observation processes are both explicitly modelled. 
Occupancy sampling using spatial replicates can be used to reliably and efficiently identify tiger population sources and help monitor metapopulations. Our results reinforce earlier findings that prey depletion and human disturbance are key drivers of local tiger extinctions and tigers can persist even in human-dominated landscapes through effective protection of source populations. Our approach facilitates efficient targeting of tiger conservation interventions and, more generally, provides a basis for the reliable integration of large carnivore monitoring data between local and landscape scales.
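Why ignoring imperfect detection underestimates occupancy can be shown with a much simpler independence sketch than the paper's Markovian model: with per-replicate detection probability p and K replicates, an occupied site yields at least one detection with probability p* = 1 − (1 − p)^K, so naive occupancy ≈ ψ · p*. The numbers below are invented, not the paper's estimates:

```python
# Illustrative sketch of detection-corrected occupancy under the simplest
# assumption of independent replicates (the paper's replicate model is
# more sophisticated). psi = true occupancy, p = per-replicate detection
# probability, k = number of spatial replicates per site.

def naive_occupancy(psi, p, k):
    """Expected fraction of sites with >= 1 detection (the naive estimate)."""
    p_star = 1 - (1 - p) ** k
    return psi * p_star

def corrected_occupancy(naive, p, k):
    """Back out true occupancy from the naive estimate."""
    p_star = 1 - (1 - p) ** k
    return naive / p_star

naive = naive_occupancy(0.6, 0.2, 5)          # what presence/absence reports
psi_hat = corrected_occupancy(naive, 0.2, 5)  # recovers the true 0.6
```

With p = 0.2 and five replicates, p* ≈ 0.67, so a presence-versus-absence survey would report roughly 0.40 occupancy for a landscape where true occupancy is 0.60, the same direction of bias (about one-third underestimation) the abstract reports.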
Automatic three-dimensional measurement of large-scale structure based on vision metrology.
Zhu, Zhaokun; Guan, Banglei; Zhang, Xiaohu; Li, Daokui; Yu, Qifeng
2014-01-01
All relevant key techniques involved in photogrammetric vision metrology for fully automatic 3D measurement of large-scale structure are studied. A new kind of coded target consisting of circular retroreflective discs is designed, and corresponding detection and recognition algorithms based on blob detection and clustering are presented. Then a three-stage strategy starting with view clustering is proposed to achieve automatic network orientation. For matching of noncoded targets, the concept of a matching path is proposed, and matches for each noncoded target are found by determining, via a novel voting strategy, the optimal matching path among all possible paths. Experiments on a fixed airship keel were conducted to verify the effectiveness and measuring accuracy of the proposed methods.
Crater size estimates for large-body terrestrial impact
NASA Technical Reports Server (NTRS)
Schmidt, Robert M.; Housen, Kevin R.
1988-01-01
Calculating the effects of impacts leading to global catastrophes requires knowledge of the impact process at very large size scales. This information cannot be obtained directly but must be inferred from subscale physical simulations, numerical simulations, and scaling laws. Schmidt and Holsapple presented scaling laws based upon laboratory-scale impact experiments performed on a centrifuge (Schmidt, 1980; Schmidt and Holsapple, 1980). These experiments were used to develop scaling laws which were among the first to include the gravity dependence associated with increasing event size. At that time, using the results of experiments in dry sand and in water to provide bounds on crater size, they recognized that more precise bounds on large-body impact crater formation could be obtained with additional centrifuge experiments conducted in other geological media. In that previous work, simple power-law formulae were developed to relate final crater diameter to impactor size and velocity. In addition, Schmidt (1980) and Holsapple and Schmidt (1982) recognized that the energy scaling exponent is not a universal constant but depends upon the target media. More recently, Holsapple and Schmidt (1987) included results for non-porous materials and provided a basis for estimating crater formation kinematics and final crater size. A revised set of scaling relationships for all crater parameters of interest is presented. These include results for various target media and include the kinematics of formation. Particular attention is given to possible limits brought about by very large impactors.
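The "simple power-law formulae" relating final crater diameter to impactor size, velocity, and gravity can be illustrated generically. The functional form D = K · L^a · v^b · g^(−c) and every constant below are placeholders for illustration only; they are not the published Schmidt-Holsapple coefficients, which differ by target medium:

```python
# Generic power-law crater-scaling sketch with placeholder exponents.
# L = impactor diameter (m), v = impact speed (m/s), g = gravity (m/s^2).
# K, a, b, c are hypothetical values, NOT fitted Schmidt-Holsapple constants.

def crater_diameter(L, v, g, K=1.0, a=0.78, b=0.44, c=0.22):
    """Final crater diameter under an assumed single-material power law."""
    return K * L**a * v**b * g**(-c)

# Gravity dependence: doubling g shrinks the crater by the factor 2**-c,
# independent of impactor size and speed, in this functional form.
d1 = crater_diameter(100.0, 2.0e4, 9.81)
d2 = crater_diameter(100.0, 2.0e4, 2 * 9.81)
```

This separable form is exactly why centrifuge experiments are informative: raising g in the laboratory probes the same gravity exponent that governs very large natural events.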
Selecting habitat to survive: the impact of road density on survival in a large carnivore.
Basille, Mathieu; Van Moorter, Bram; Herfindal, Ivar; Martin, Jodie; Linnell, John D C; Odden, John; Andersen, Reidar; Gaillard, Jean-Michel
2013-01-01
Habitat selection studies generally assume that animals select habitat and food resources at multiple scales to maximise their fitness. However, animals sometimes prefer habitats of apparently low quality, especially when considering the costs associated with spatially heterogeneous human disturbance. We used spatial variation in human disturbance, and its consequences for lynx survival, a direct fitness component, to test the Hierarchical Habitat Selection hypothesis in a population of Eurasian lynx Lynx lynx in southern Norway. Data from 46 lynx monitored with telemetry indicated that a high proportion of forest strongly reduced the risk of mortality from legal hunting at the home range scale, while increasing road density strongly increased such risk at the finer scale within the home range. We found hierarchical effects of the impact of human disturbance, with a higher road density at a large scale reinforcing its negative impact at a fine scale. Conversely, we demonstrated that lynx shifted their habitat selection to avoid areas with the highest road densities within their home ranges, thus supporting a compensatory mechanism at fine scale enabling lynx to mitigate the impact of large-scale disturbance. Human impact, positively associated with high road accessibility, was thus a stronger driver of lynx space use at a finer scale, with home range characteristics nevertheless constraining habitat selection. Our study demonstrates the truly hierarchical nature of habitat selection, which aims at maximising fitness by selecting against limiting factors at multiple spatial scales, and indicates that scale-specific heterogeneity of the environment is driving individual spatial behaviour, by means of trade-offs across spatial scales.
NASA Astrophysics Data System (ADS)
Tóthmérész, Béla; Mitchley, Jonathan; Jongepierová, Ivana; Baasch, Annett; Fajmon, Karel; Kirmer, Anita; Prach, Karel; Řehounková, Klára; Tischew, Sabine; Twiston-Davies, Grace; Dutoit, Thierry; Buisson, Elise; Jeunatre, Renaud; Valkó, Orsolya; Deák, Balázs; Török, Péter
2017-04-01
To sustain human well-being and quality of life, it is essential to develop and support green infrastructure: a strategically planned network of natural and semi-natural areas with other environmental features, designed and managed to deliver a wide range of ecosystem services. Developing and sustaining green infrastructure in turn requires the conservation and restoration of biodiversity in natural and traditionally managed habitats. Species-rich landscapes in Europe have been maintained over centuries by various kinds of low-intensity use. Recently, they have suffered losses in extent and diversity owing to land degradation through intensification or abandonment. Conservation of landscape-scale biodiversity requires the maintenance of species-rich habitats and the restoration of lost grasslands. We focus on landscape-level restoration studies spanning multiple sites across a wide geographical range (including the Czech Republic, France, Germany, Hungary, and the UK). From a Europe-wide perspective, we address four specific questions: (i) What were the aims and objectives of landscape-scale restoration? (ii) What results have been achieved? (iii) What are the costs of large-scale restoration? (iv) What policy tools are available for the restoration of landscape-scale biodiversity? We conclude that landscape-level restoration offers exciting new opportunities to reconnect long-disrupted ecological processes and to restore landscape connectivity. Generally, these measures can enhance biodiversity at the landscape scale. The development of policy tools to achieve restoration at the landscape scale is essential for achieving the ambitious targets of the Convention on Biological Diversity and the European Biodiversity Strategy for ecosystem restoration.
Generating classes of 3D virtual mandibles for AR-based medical simulation.
Hippalgaonkar, Neha R; Sider, Alexa D; Hamza-Lup, Felix G; Santhanam, Anand P; Jaganathan, Bala; Imielinska, Celina; Rolland, Jannick P
2008-01-01
Simulation and modeling represent promising tools for several application domains, from engineering to forensic science and medicine. Advances in 3D imaging technology have brought paradigms such as augmented reality (AR) and mixed reality into promising simulation tools for the training industry. Motivated by the requirement for superimposing anatomically correct 3D models on a human patient simulator (HPS) and visualizing them in an AR environment, the purpose of this research effort was to develop and validate a method for scaling a source human mandible to a target human mandible within a 2 mm root mean square (RMS) error. Results show that, given the distance between two corresponding landmarks on two different mandibles, a relative scaling factor may be computed. Using this scaling factor, results show that a 3D virtual mandible model can be made morphometrically equivalent to a real target-specific mandible within a 1.30 mm RMS error. The virtual mandible may be further used as a reference target for registering other anatomic models, such as the lungs, on the HPS. Such registration will be made possible by physical constraints between the mandible and the spinal column in the horizontal normal rest position.
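The landmark-based scaling described above reduces to simple arithmetic: a ratio of inter-landmark distances gives the relative scale factor, and an RMS error over corresponding 3D landmarks validates the fit. A minimal sketch of that calculation; the function names and landmark coordinates are illustrative, not taken from the paper:

```python
import math

def scaling_factor(source_dist, target_dist):
    """Relative scale from one inter-landmark distance on each mandible."""
    return target_dist / source_dist

def rms_error(scaled_points, target_points):
    """Root mean square distance between corresponding 3D landmarks (mm)."""
    sq = [sum((s - t) ** 2 for s, t in zip(p, q))
          for p, q in zip(scaled_points, target_points)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical landmark sets (mm): scale the source model, then check RMS.
source = [(0.0, 0.0, 0.0), (40.0, 0.0, 0.0), (20.0, 30.0, 10.0)]
target = [(0.0, 0.0, 0.0), (44.0, 0.0, 0.0), (22.0, 33.0, 11.0)]
k = scaling_factor(40.0, 44.0)  # 44/40 = 1.1
scaled = [(k * x, k * y, k * z) for x, y, z in source]
print(rms_error(scaled, target))
```

In this toy example the target is an exact uniform scaling of the source, so the RMS error is zero; real mandibles would leave a residual, which is what the 1.30 mm figure quantifies.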
A Scaled Framework for CRISPR Editing of Human Pluripotent Stem Cells to Study Psychiatric Disease.
Hazelbaker, Dane Z; Beccard, Amanda; Bara, Anne M; Dabkowski, Nicole; Messana, Angelica; Mazzucato, Patrizia; Lam, Daisy; Manning, Danielle; Eggan, Kevin; Barrett, Lindy E
2017-10-10
Scaling of CRISPR-Cas9 technology in human pluripotent stem cells (hPSCs) represents an important step for modeling complex disease and developing drug screens in human cells. However, variables affecting the scaling efficiency of gene editing in hPSCs remain poorly understood. Here, we report a standardized CRISPR-Cas9 approach, with robust benchmarking at each step, to successfully target and genotype a set of psychiatric disease-implicated genes in hPSCs and provide a resource of edited hPSC lines for six of these genes. We found that transcriptional state and nucleosome positioning around targeted loci were not correlated with editing efficiency. However, editing frequencies varied between different hPSC lines and correlated with genomic stability, underscoring the need for careful cell line selection and unbiased assessments of genomic integrity. Together, our step-by-step quantification and in-depth analyses provide an experimental roadmap for scaling Cas9-mediated editing in hPSCs to study psychiatric disease, with broader applicability for other polygenic diseases. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Kieffer, Michael J.; Lesaux, Nonie K.; Rivera, Mabel; Francis, David J.
2009-01-01
Including English language learners (ELLs) in large-scale assessments raises questions about the validity of inferences based on their scores. Test accommodations for ELLs are intended to reduce the impact of limited English proficiency on the assessment of the target construct, most often mathematics or science proficiency. This meta-analysis…
Distributed and grid computing projects with research focus in human health.
Diomidous, Marianna; Zikos, Dimitrios
2012-01-01
Distributed systems and grid computing systems connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects, each with more than 2,000 Processing Units (PUs), were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not always explicitly stated, common target diseases include HIV, dengue, Duchenne muscular dystrophy, Parkinson's disease, various types of cancer, and influenza; other diseases include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, in order to raise citizens' awareness of participation and create a culture of internet volunteering altruism.
Advances in targeted proteomics and applications to biomedical research
Shi, Tujin; Song, Ehwang; Nie, Song; Rodland, Karin D.; Liu, Tao; Qian, Wei-Jun; Smith, Richard D.
2016-01-01
Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity, herein we review recent advances in method and technology for further enhancing SRM sensitivity (from 2012 to present) and highlight its broad biomedical applications in human bodily fluids, tissue, and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, by monitoring all target product ions, effectively addresses the SRM limitations in specificity and multiplexing; however, compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed. PMID:27302376
Advances in targeted proteomics and applications to biomedical research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Song, Ehwang; Nie, Song
Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity (Shi et al., Proteomics, 12, 1074–1092, 2012), herein we review recent advances in method and technology for further enhancing SRM sensitivity (from 2012 to present) and highlight its broad biomedical applications in human bodily fluids, tissue, and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, by monitoring all target product ions, effectively addresses the SRM limitations in specificity and multiplexing; however, compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed.
Discovering naturally processed antigenic determinants that confer protective T cell immunity
Gilchuk, Pavlo; Spencer, Charles T.; Conant, Stephanie B.; Hill, Timothy; Gray, Jennifer J.; Niu, Xinnan; Zheng, Mu; Erickson, John J.; Boyd, Kelli L.; McAfee, K. Jill; Oseroff, Carla; Hadrup, Sine R.; Bennink, Jack R.; Hildebrand, William; Edwards, Kathryn M.; Crowe, James E.; Williams, John V.; Buus, Søren; Sette, Alessandro; Schumacher, Ton N.M.; Link, Andrew J.; Joyce, Sebastian
2013-01-01
CD8+ T cells (TCD8) confer protective immunity against many infectious diseases, suggesting that microbial TCD8 determinants are promising vaccine targets. Nevertheless, current T cell antigen identification approaches do not discern which epitopes drive protective immunity during active infection, information that is critical for the rational design of TCD8-targeted vaccines. We employed a proteomics-based approach for large-scale discovery of naturally processed determinants derived from a complex pathogen, vaccinia virus (VACV), that are presented by the most frequent representatives of four major HLA class I supertypes. Immunologic characterization revealed that many previously unidentified VACV determinants were recognized by smallpox-vaccinated human peripheral blood cells in a variegated manner. Many such determinants were also recognized by HLA class I–transgenic mouse immune TCD8 and elicited protective TCD8 immunity against lethal intranasal VACV infection. Notably, efficient processing and stable presentation of immune determinants as well as the availability of naive TCD8 precursors were sufficient to drive a multifunctional, protective TCD8 response. Our approach uses fundamental insights into T cell epitope processing and presentation to define targets of protective TCD8 immunity within human pathogens that have complex proteomes, suggesting that this approach has general applicability in vaccine sciences. PMID:23543059
Discovering naturally processed antigenic determinants that confer protective T cell immunity.
Gilchuk, Pavlo; Spencer, Charles T; Conant, Stephanie B; Hill, Timothy; Gray, Jennifer J; Niu, Xinnan; Zheng, Mu; Erickson, John J; Boyd, Kelli L; McAfee, K Jill; Oseroff, Carla; Hadrup, Sine R; Bennink, Jack R; Hildebrand, William; Edwards, Kathryn M; Crowe, James E; Williams, John V; Buus, Søren; Sette, Alessandro; Schumacher, Ton N M; Link, Andrew J; Joyce, Sebastian
2013-05-01
CD8+ T cells (TCD8) confer protective immunity against many infectious diseases, suggesting that microbial TCD8 determinants are promising vaccine targets. Nevertheless, current T cell antigen identification approaches do not discern which epitopes drive protective immunity during active infection, information that is critical for the rational design of TCD8-targeted vaccines. We employed a proteomics-based approach for large-scale discovery of naturally processed determinants derived from a complex pathogen, vaccinia virus (VACV), that are presented by the most frequent representatives of four major HLA class I supertypes. Immunologic characterization revealed that many previously unidentified VACV determinants were recognized by smallpox-vaccinated human peripheral blood cells in a variegated manner. Many such determinants were also recognized by HLA class I-transgenic mouse immune TCD8 and elicited protective TCD8 immunity against lethal intranasal VACV infection. Notably, efficient processing and stable presentation of immune determinants as well as the availability of naive TCD8 precursors were sufficient to drive a multifunctional, protective TCD8 response. Our approach uses fundamental insights into T cell epitope processing and presentation to define targets of protective TCD8 immunity within human pathogens that have complex proteomes, suggesting that this approach has general applicability in vaccine sciences.
From Pleistocene to Holocene: the prehistory of southwest Asia in evolutionary context.
Watkins, Trevor
2017-08-14
In this paper I seek to show how cultural niche construction theory offers the potential to extend the human evolutionary story beyond the Pleistocene, through the Neolithic, towards the kind of very large-scale societies in which we live today. The study of the human past has been compartmentalised, each compartment using different analytical vocabularies, so that their accounts are written in mutually incompatible languages. In recent years social, cognitive and cultural evolutionary theories, building on a growing body of archaeological evidence, have made substantial sense of the social and cultural evolution of the genus Homo. However, specialists in this field of studies have found it difficult to extend their kind of analysis into the Holocene human world. Within southwest Asia the three or four millennia of the Neolithic period at the beginning of the Holocene represents a pivotal point, which saw the transformation of human society in the emergence of the first large-scale, permanent communities, the domestication of plants and animals, and the establishment of effective farming economies. Following the Neolithic, the pace of human social, economic and cultural evolution continued to increase. By 5000 years ago, in parts of southwest Asia and northeast Africa there were very large-scale urban societies, and the first large-scale states (kingdoms). An extension of cultural niche construction theory enables us to extend the evolutionary narrative of the Pleistocene into the Holocene, opening the way to developing a single, long-term, evolutionary account of human history.
Smith, Catherine M; Downs, Sara H; Mitchell, Andy; Hayward, Andrew C; Fry, Hannah; Le Comber, Steven C
2015-01-01
Bovine tuberculosis is a disease of historical importance to human health in the UK that remains a major animal health and economic issue. Control of the disease in cattle is complicated by the presence of a reservoir species, the Eurasian badger. In spite of uncertainty in the degree to which cattle disease results from transmission from badgers, and opposition from environmental groups, culling of badgers has been licensed in two large areas in England. Methods to limit culls to smaller areas that target badgers infected with TB whilst minimising the number of uninfected badgers culled are therefore of considerable interest. Here, we use historical data from a large-scale field trial of badger culling to assess two alternative hypothetical methods of targeting TB-infected badgers based on the distribution of cattle TB incidents: (i) a simple circular 'ring cull'; and (ii) geographic profiling, a novel technique for spatial targeting of infectious disease control that predicts the locations of sources of infection based on the distribution of linked cases. Our results showed that both methods required coverage of very large areas to ensure a substantial proportion of infected badgers were removed, and would result in many uninfected badgers being culled. Geographic profiling, which accounts for clustering of infections in badger and cattle populations, produced a small but non-significant increase in the proportion of setts with TB-infected compared to uninfected badgers included in a cull. It also provided no overall improvement at targeting setts with infected badgers compared to the ring cull. Cattle TB incidents in this study were therefore insufficiently clustered around TB-infected badger setts to design an efficient spatially targeted cull; and this analysis provided no evidence to support a move towards spatially targeted badger culling policies for bovine TB control.
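Geometrically, the 'ring cull' considered above amounts to selecting every badger sett within a fixed radius of any cattle TB incident and then asking what fraction of the selected setts is actually infected. A minimal sketch of that selection; the coordinates and infection labels are hypothetical, not the trial data:

```python
import math

def ring_cull(setts, incidents, radius_km):
    """Select setts within `radius_km` of any cattle TB incident (a 'ring cull')."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [s for s in setts
            if any(dist(s["xy"], inc) <= radius_km for inc in incidents)]

# Hypothetical sett and incident coordinates (km on a local grid).
setts = [{"id": 1, "xy": (0.5, 0.5), "infected": True},
         {"id": 2, "xy": (3.0, 3.0), "infected": False},
         {"id": 3, "xy": (9.0, 9.0), "infected": True}]
incidents = [(0.0, 0.0), (3.5, 3.0)]

culled = ring_cull(setts, incidents, radius_km=1.0)
hit_rate = sum(s["infected"] for s in culled) / len(culled)
print([s["id"] for s in culled], hit_rate)  # setts 1 and 2 fall inside a ring
```

The trade-off the paper quantifies is visible even in this toy case: widening the radius captures more infected setts (sett 3 here) but also sweeps in more uninfected ones.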
NASA Astrophysics Data System (ADS)
Weber, Bruce A.
2005-07-01
We have performed an experiment that compares the performance of human observers with that of a robust algorithm for the detection of targets in difficult, nonurban forward-looking infrared imagery. Our purpose was to benchmark the comparison and document performance differences for future algorithm improvement. The scale-insensitive detection algorithm, used as a benchmark by the Night Vision Electronic Sensors Directorate for algorithm evaluation, employed a combination of contrast-like features to locate targets. Detection receiver operating characteristic curves and observer-confidence analyses were used to compare human and algorithmic responses and to gain insight into differences. The test database contained ground targets, in natural clutter, whose detectability, as judged by human observers, ranged from easy to very difficult. In general, as compared with human observers, the algorithm detected most of the same targets, but correlated confidence with correct detections poorly and produced many more false alarms at any useful level of performance. Though characterizing human performance was not the intent of this study, results suggest that previous observational experience was not a strong predictor of human performance, and that combining individual human observations by majority vote significantly reduced false-alarm rates.
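The majority-vote fusion of observer responses mentioned above can be sketched in a few lines: a candidate location counts as a detection only if more than half of the observers call it, which vetoes lone false alarms. The function name and the toy call matrix below are illustrative, not from the study:

```python
def majority_vote(observer_calls):
    """Fuse per-location binary detections from several observers by majority.

    `observer_calls` is a list of per-observer lists of booleans, one entry
    per candidate location (True = "target present").
    """
    n = len(observer_calls)
    votes = [sum(col) for col in zip(*observer_calls)]  # transpose, then count
    return [v > n / 2 for v in votes]

# Hypothetical calls from 3 observers over 5 candidate locations;
# the lone call at location 4 (a false alarm) is vetoed by the majority.
calls = [
    [True,  False, True, False, True],
    [True,  True,  True, False, False],
    [False, False, True, True,  False],
]
print(majority_vote(calls))  # → [True, False, True, False, False]
```

With an odd number of observers the strict `> n/2` threshold never ties; with an even number it requires a clear majority, which is the more conservative (false-alarm-reducing) choice.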
Towards large-scale, human-based, mesoscopic neurotechnologies.
Chang, Edward F
2015-04-08
Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
The role of large—scale BECCS in the pursuit of the 1.5°C target: an Earth system model perspective
NASA Astrophysics Data System (ADS)
Muri, Helene
2018-04-01
The increasing awareness of the many damaging aspects of climate change has prompted research into ways of reducing and reversing the anthropogenic increase in carbon concentrations in the atmosphere. Most emission scenarios stabilizing climate at low levels, such as the 1.5 °C target outlined by the Paris Agreement, require large-scale deployment of Bio-Energy with Carbon Capture and Storage (BECCS). Here, the potential contribution of large-scale BECCS deployment towards the 1.5 °C global warming target is evaluated using an Earth system model, together with the associated climate responses and carbon cycle feedbacks. The geographical location of the bioenergy feedstock is shown to be key to the success of such measures in the context of temperature targets. Although net negative emissions were reached sooner, by ∼6 years, and at a larger scale, land use change emissions and reductions in forest carbon sinks outweighed these effects in one scenario. Re-cultivating mid-latitudes, on the other hand, was found to be beneficial, contributing towards the 1.5 °C target, though only by ‑0.1 °C and ‑54 Gt C in avoided emissions. Obstacles remain related to competition for land from nature preservation and food security, as well as the technological availability of CCS.
Skin Friction Reduction Through Large-Scale Forcing
NASA Astrophysics Data System (ADS)
Bhatt, Shibani; Artham, Sravan; Gnanamanickam, Ebenezer
2017-11-01
Flow structures in a turbulent boundary layer larger than an integral length scale (δ), referred to as large-scales, interact with the finer scales in a non-linear manner. By targeting these large-scales and exploiting this non-linear interaction, a wall shear stress (WSS) reduction of over 10% has been achieved. The plane wall jet (PWJ), a boundary layer which has highly energetic large-scales that become turbulent independently of the near-wall finer scales, is the chosen model flow field. Its unique configuration allows for independent control of the large-scales through acoustic forcing. Perturbation wavelengths from about 1 δ to 14 δ were considered, with a reduction in WSS for all wavelengths considered. This reduction, over a large subset of the wavelengths, scales with both inner and outer variables, indicating a mixed scaling of the underlying physics, while also showing dependence on the PWJ global properties. A triple decomposition of the velocity fields shows an increase in coherence due to forcing, with a clear organization of the small-scale turbulence with respect to the introduced large-scale. The maximum reduction in WSS occurs when the introduced large-scale acts so as to reduce the turbulent activity in the very-near-wall region. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-16-1-0194 monitored by Dr. Douglas Smith.
Tang, Yunping; Yang, Xiuliang; Hang, Baojian; Li, Jiangtao; Huang, Lei; Huang, Feng; Xu, Zhinan
2016-04-01
Mature collagen is abundant in human bodies and very valuable for a range of industrial and medical applications. The biosynthesis of mature collagen requires post-translational modifications to increase the stability of the collagen triple helix structure. By co-expressing the human-like collagen (HLC) gene with human prolyl 4-hydroxylase (P4H) and D-arabinono-1,4-lactone oxidase (ALO) in Escherichia coli, we have constructed a prokaryotic expression system to produce hydroxylated HLC. Five different media, as well as the induction conditions, were then investigated with regard to the soluble expression of this protein. The results indicated that the highest soluble expression level of target HLC obtained in shaking flasks was 49.55 ± 0.36 mg/L, when recombinant cells were grown in MBL medium and induced with 0.1 mM IPTG at the middle stage of the exponential growth phase. By adopting a glucose feeding strategy, the expression level of target HLC could be improved to 260 mg/L in a 10 L bench-top fermentor. Further, HPLC analyses revealed that more than 10% of the proline residues in purified HLC were successfully hydroxylated. The present work provides a solid basis for the large-scale production of hydroxylated HLC in E. coli.
Large Scale Spectral Line Mapping of Galactic Regions with CCAT-Prime
NASA Astrophysics Data System (ADS)
Simon, Robert
2018-01-01
CCAT-prime is a 6-m submillimeter telescope that is being built on the top of Cerro Chajnantor (5600 m altitude) overlooking the ALMA plateau in the Atacama Desert. Its novel Crossed-Dragone design enables a large field of view without blockage and is thus particularly well suited for large scale surveys in the continuum and spectral lines targeting important questions ranging from star formation in the Milky Way to cosmology. On this poster, we focus on the large scale mapping opportunities in important spectral cooling lines of the interstellar medium opened up by CCAT-prime and the Cologne heterodyne instrument CHAI.
DOE Joint Genome Institute 2008 Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, David
2009-03-12
While initially a virtual institute, the driving force behind the creation of the DOE Joint Genome Institute in Walnut Creek, California in the Fall of 1999 was the Department of Energy's commitment to sequencing the human genome. With the publication in 2004 of a trio of manuscripts describing the finished 'DOE Human Chromosomes', the Institute successfully completed its human genome mission. In the time between the creation of the Department of Energy Joint Genome Institute (DOE JGI) and completion of the Human Genome Project, sequencing and its role in biology spread to fields extending far beyond what could be imagined when the Human Genome Project first began. Accordingly, the targets of the DOE JGI's sequencing activities changed, moving from a single human genome to the genomes of large numbers of microbes, plants, and other organisms, and the community of users of DOE JGI data similarly expanded and diversified. Transitioning into operating as a user facility, the DOE JGI modeled itself after other DOE user facilities, such as synchrotron light sources and supercomputer facilities, empowering the science of large numbers of investigators working in areas of relevance to energy and the environment. The JGI's approach to being a user facility is based on the concept that by focusing state-of-the-art sequencing and analysis capabilities on the best peer-reviewed ideas drawn from a broad community of scientists, the DOE JGI will effectively encourage creative approaches to DOE mission areas and produce important science. This clearly has occurred, only partially reflected in the fact that the DOE JGI has played a major role in more than 45 papers published in just the past three years alone in Nature and Science. The involvement of a large and engaged community of users working on important problems has helped maximize the impact of JGI science. A seismic technological change is presently underway at the JGI. 
The Sanger capillary-based sequencing process that dominated how sequencing was done in the last decade is being replaced by a variety of new processes and sequencing instruments. The JGI, with an increasing number of next-generation sequencers, whose throughput is 100- to 1,000-fold greater than the Sanger capillary-based sequencers, is increasingly focused in new directions on projects of scale and complexity not previously attempted. These new directions for the JGI come, in part, from the 2008 National Research Council report on the goals of the National Plant Genome Initiative as well as the 2007 National Research Council report on the New Science of Metagenomics. Both reports outline a crucial need for systematic large-scale surveys of the plant and microbial components of the biosphere as well as an increasing need for large-scale analysis capabilities to meet the challenge of converting sequence data into knowledge. The JGI is extensively discussed in both reports as vital to progress in these fields of major national interest. JGI's future plan for plants and microbes includes a systematic approach for investigation of these organisms at a scale requiring the special capabilities of the JGI to generate, manage, and analyze the datasets. JGI will generate and provide not only community access to these plant and microbial datasets, but also the tools for analyzing them. These activities will produce essential knowledge that will be needed if we are to be able to respond to the world's energy and environmental challenges. As the JGI Plant and Microbial programs advance, the JGI as a user facility is also evolving. The Institute has been highly successful in bending its technical and analytical skills to help users solve large complex problems of major importance, and that effort will continue unabated. 
The JGI will increasingly move from a central focus on 'one-off' user projects coming from small user communities to much larger scale projects driven by systematic and problem-focused approaches to selection of sequencing targets. Entire communities of scientists working in a particular field, such as feedstock improvement or biomass degradation, will be users of this information. Despite this new emphasis, an investigator-initiated user program will remain. This program in the future will replace small projects that increasingly can be accomplished without the involvement of JGI, with imaginative large-scale 'Grand Challenge' projects of foundational relevance to energy and the environment that require a new scale of sequencing and analysis capabilities. Close interactions with the DOE Bioenergy Research Centers, and with other DOE institutions that may follow, will also play a major role in shaping aspects of how the JGI operates as a user facility. Based on increased availability of high-throughput sequencing, the JGI will increasingly provide to users, in addition to DNA sequencing, an array of both pre- and post-sequencing value-added capabilities to accelerate their science.
Grill, Günther; Khan, Usman; Lehner, Bernhard; Nicell, Jim; Ariwi, Joseph
2016-01-15
Chemicals released into freshwater systems threaten ecological functioning and may put aquatic life and the health of humans at risk. We developed a new contaminant fate model (CFM) that follows simple, well-established methodologies and is unique in its cross-border, seamless hydrological and geospatial framework, including lake routing, a critical component in northern environments. We validated the model using the pharmaceutical Carbamazepine and predicted eco-toxicological risk for 15 pharmaceuticals in the Saint-Lawrence River Basin, Canada. The results indicated negligible to low environmental risk for the majority of tested chemicals, while two pharmaceuticals showed elevated risk in up to 13% of rivers affected by municipal effluents. As an integrated model, our CFM is designed for application at very large scales with the primary goal of detecting high risk zones. In regulatory frameworks, it can help screen existing or new chemicals entering the market regarding their potential impact on human and environmental health. Due to its high geospatial resolution, our CFM can also facilitate the prioritization of actions, such as identifying regions where reducing contamination sources or upgrading treatment plants is most pertinent to achieve targeted pollutant removal or to protect drinking water resources. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
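A core building block of contaminant fate models of this kind is first-order (exponential) decay of a chemical load as it is routed downstream through a river network, with each reach passing its surviving load to the next. The sketch below is a generic illustration under that assumption, not the authors' CFM; the reach layout, loads, travel times, and decay rate are all hypothetical:

```python
import math

def route_downstream(reaches, k_per_day):
    """Accumulate chemical load down a river network with first-order decay.

    `reaches` maps reach id -> (local_load_kg, travel_days, downstream_id or None).
    Relies on dicts preserving insertion order (Python 3.7+): reaches must be
    listed upstream-to-downstream so inflows are complete before each is routed.
    Returns the load (kg) leaving each reach.
    """
    outflow = {}
    inflow = {rid: 0.0 for rid in reaches}
    for rid, (load, t_days, down) in reaches.items():
        surviving = (inflow[rid] + load) * math.exp(-k_per_day * t_days)
        outflow[rid] = surviving
        if down is not None:
            inflow[down] += surviving
    return outflow

# Hypothetical 3-reach network: two headwaters ("A", "B") joining at outlet "C".
net = {
    "A": (10.0, 1.0, "C"),   # 10 kg released, 1 day travel, drains to C
    "B": (5.0,  2.0, "C"),   # 5 kg released, 2 days travel, drains to C
    "C": (0.0,  1.0, None),  # no local release, 1 day travel to basin outlet
}
out = route_downstream(net, k_per_day=0.1)
print(round(out["C"], 2))  # → 11.89
```

Risk screening as described in the abstract would then divide each reach's surviving concentration (load over discharge) by an eco-toxicological threshold to flag high-risk zones.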
Herculano-Houzel, Suzana; Kaas, Jon H.
2011-01-01
Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. 
We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they apply to simians, great apes and modern humans alike), the Colobinae and Pongidae lineages favored marked increases in body size rather than brain size from the common ancestor with the Homo lineage, while the Homo lineage seems to have favored a large brain instead of a large body, possibly due to the metabolic limitations to having both. PMID:21228547
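The linear neuron-scaling argument can be illustrated with a deliberately crude first-order sketch. The anchor values below (~86 billion neurons in a ~1,500 g modern human brain) are approximate published figures, and collapsing the scaling to a single proportionality constant is a simplification of the structure-specific rules the authors actually fit:

```python
# First-order sketch: primate brains gain neurons roughly linearly with brain mass,
# so one proportionality constant anchored on modern humans predicts hominin
# neuron counts from endocranial volume/mass. Values are approximate.
HUMAN_BRAIN_MASS_G = 1500.0
HUMAN_NEURONS = 86e9  # ~86 billion neurons in the modern human brain

def predicted_neurons(brain_mass_g):
    """Linear neuron-count scaling: neurons = k * brain mass."""
    k = HUMAN_NEURONS / HUMAN_BRAIN_MASS_G
    return k * brain_mass_g

# A ~1,400 g Neanderthal-sized brain lands near 80 billion neurons,
# within the modern human range, consistent with the abstract's estimate.
print(f"{predicted_neurons(1400):.3e}")
```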
Target-decoy Based False Discovery Rate Estimation for Large-scale Metabolite Identification.
Wang, Xusheng; Jones, Drew R; Shaw, Timothy I; Cho, Ji-Hoon; Wang, Yuanyuan; Tan, Haiyan; Xie, Boer; Zhou, Suiping; Li, Yuxin; Peng, Junmin
2018-05-23
Metabolite identification is a crucial step in mass spectrometry (MS)-based metabolomics. However, it is still challenging to assess the confidence of assigned metabolites. In this study, we report a novel method for estimating false discovery rate (FDR) of metabolite assignment with a target-decoy strategy, in which the decoys are generated through violating the octet rule of chemistry by adding small odd numbers of hydrogen atoms. The target-decoy strategy was integrated into JUMPm, an automated metabolite identification pipeline for large-scale MS analysis, and was also evaluated with two other metabolomics tools, mzMatch and mzMine 2. The reliability of FDR calculation was examined by false datasets, which were simulated by altering MS1 or MS2 spectra. Finally, we used the JUMPm pipeline coupled with the target-decoy strategy to process unlabeled and stable-isotope labeled metabolomic datasets. The results demonstrate that the target-decoy strategy is a simple and effective method for evaluating the confidence of high-throughput metabolite identification.
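At scoring time, the target-decoy FDR estimate described here reduces to a simple ratio of decoy to target hits above a score threshold. A minimal sketch (the scores are hypothetical, and the decoy-generation chemistry of adding odd numbers of hydrogen atoms is not modeled):

```python
def fdr_at_threshold(matches, threshold):
    """Target-decoy FDR: (#decoy hits) / (#target hits) at or above a score cutoff.

    `matches` is a list of (score, is_decoy) pairs; decoys are assumed to be
    false by construction, so they estimate the false-hit count among targets."""
    targets = sum(1 for score, is_decoy in matches if score >= threshold and not is_decoy)
    decoys = sum(1 for score, is_decoy in matches if score >= threshold and is_decoy)
    return decoys / targets if targets else 0.0

# Hypothetical scored assignments: (score, is_decoy)
matches = [(9.1, False), (8.7, False), (8.2, True), (7.9, False), (7.5, True)]
print(fdr_at_threshold(matches, 8.0))  # 1 decoy / 2 targets = 0.5
```

In practice the threshold is swept until the estimated FDR falls below the desired level (e.g. 1%).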
News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.
Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio
2017-10-24
High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general-purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
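A naïve per-barcode check of the four constraint classes named above might look like the sketch below. The thresholds and the blacklisted subsequence are hypothetical, and the pairwise Hamming-distance scan is exactly the O(library-size) bottleneck a fast generation framework must avoid at the scale of 10^6-10^9 barcodes:

```python
from itertools import groupby

def gc_fraction(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

def max_homopolymer(seq):
    # length of the longest run of a single base
    return max(len(list(run)) for _, run in groupby(seq))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def passes_constraints(seq, accepted, gc_lo=0.4, gc_hi=0.6,
                       max_run=2, min_dist=3, blacklist=("GGTCTC",)):
    """True if seq satisfies GC, homopolymer, blacklist, and distance constraints."""
    if not gc_lo <= gc_fraction(seq) <= gc_hi:
        return False
    if max_homopolymer(seq) > max_run:
        return False
    if any(bad in seq for bad in blacklist):
        return False
    # naive scan over all previously accepted barcodes (the scaling bottleneck)
    return all(hamming(seq, other) >= min_dist for other in accepted)

print(passes_constraints("ACGTACGT", []))   # GC=0.5, longest run 1 -> True
print(passes_constraints("AAAAGCGC", []))   # homopolymer run of 4 -> False
```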
Large-scale production of functional human lysozyme from marker-free transgenic cloned cows.
Lu, Dan; Liu, Shen; Ding, Fangrong; Wang, Haiping; Li, Jing; Li, Ling; Dai, Yunping; Li, Ning
2016-03-10
Human lysozyme is an important natural non-specific immune protein that is highly expressed in breast milk and participates in the immune response of infants against bacterial and viral infections. Considering the medicinal value and market demand for human lysozyme, an animal model for large-scale production of recombinant human lysozyme (rhLZ) is needed. In this study, we generated transgenic cloned cows with the marker-free vector pBAC-hLF-hLZ, which was shown to efficiently express rhLZ in cow milk. Seven transgenic cloned cows, identified by polymerase chain reaction, Southern blot, and western blot analyses, produced rhLZ in milk at concentrations of up to 3149.19 ± 24.80 mg/L. The purified rhLZ had a molecular weight and enzymatic activity similar to those of wild-type human lysozyme and possessed the same C-terminal and N-terminal amino acid sequences. Preliminary results on the milk yield and milk composition of a naturally lactating transgenic cloned cow (0906) were also obtained. These results provide a solid foundation for the large-scale production of rhLZ in the future.
Validation of the RAGE Hydrocode for Impacts into Volatile-Rich Targets
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Asphaug, E.; Coker, R. F.; Wohletz, K. H.; Korycansky, D. G.; Gisler, G. R.
2007-12-01
In preparation for a detailed study of large-scale impacts into the Martian surface, we have validated the RAGE hydrocode (Gittings et al., in press, CSD) against a suite of experiments and statistical models. We present comparisons of hydrocode models to centimeter-scale gas gun impacts (Nakazawa et al. 2002), an underground nuclear test (Perret, 1971), and crater scaling laws (Holsapple 1993, O'Keefe and Ahrens 1993). We have also conducted model convergence and uncertainty analyses, which will be presented. Results to date are encouraging for our current model goals and indicate areas where the hydrocode may be extended in the future. This validation work is focused on questions related to the specific problem of large impacts into volatile-rich targets. The overall goal of this effort is to be able to realistically model large-scale Noachian, and possibly post-Noachian, impacts on Mars, not so much to model the crater morphology as to understand the evolution of target volatiles in the post-impact regime and to explore how large craters might set the stage for post-impact hydrogeologic evolution, both locally (in the crater subsurface) and globally, due to the redistribution of volatiles from the surface and subsurface into the atmosphere. This work is performed under the auspices of IGPP and the DOE at LANL under contracts W-7405-ENG-36 and DE-AC52-06NA25396. Effort by DK and EA is sponsored by NASA's Mars Fundamental Research Program.
Investigations of internal noise levels for different target sizes, contrasts, and noise structures
NASA Astrophysics Data System (ADS)
Han, Minah; Choi, Shinkook; Baek, Jongduk
2014-03-01
To describe internal noise levels for different target sizes, contrasts, and noise structures, Gaussian targets with four different sizes (i.e., standard deviations of 2, 4, 6, and 8) and three different noise structures (i.e., white, low-pass, and high-pass) were generated. The generated noise images were scaled to have a standard deviation of 0.15. For each noise type, target contrasts were adjusted to yield the same detectability for the NPW observer, and the detectability of the CHO was calculated accordingly. For the human observer study, three trained observers performed 2AFC detection tasks, and the proportion of correct responses, Pc, was calculated for each task. By adding an appropriate internal noise level to the numerical observers (i.e., NPW and CHO), the detectability of the human observers was matched to that of the numerical observers. Even though target contrasts were adjusted to equalize detectability for the NPW observer, the detectability of the human observers decreased as target size increased. The internal noise level varied with target size, contrast, and noise structure, demonstrating that different internal noise levels should be used in numerical observers to predict human detection performance.
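The internal-noise matching procedure can be sketched as follows: an NPW (non-prewhitening matched filter) observer computes a template dot product per 2AFC interval, and an additive zero-mean internal noise term on the decision variable lowers its proportion correct toward the human level. All parameter values below are illustrative, not those of the study, and a 1-D profile stands in for the 2-D images:

```python
import numpy as np

def npw_2afc_pc(contrast, target_sd=4.0, n_pix=33, ext_sd=0.15,
                int_sd=0.0, n_trials=20000, seed=0):
    """Proportion correct of an NPW observer in a 2AFC detection task.

    Each trial shows one Gaussian-target-plus-noise interval and one noise-only
    interval; the observer picks the interval with the larger template response
    after internal noise is added to each decision variable."""
    rng = np.random.default_rng(seed)
    x = np.arange(n_pix) - n_pix // 2
    target = contrast * np.exp(-x**2 / (2 * target_sd**2))  # 1-D Gaussian target
    correct = 0
    for _ in range(n_trials):
        r_sig = target @ (target + rng.normal(0, ext_sd, n_pix)) + rng.normal(0, int_sd)
        r_noise = target @ rng.normal(0, ext_sd, n_pix) + rng.normal(0, int_sd)
        correct += r_sig > r_noise
    return correct / n_trials
```

Raising `int_sd` degrades Pc; the fitting step in the study amounts to searching for the internal noise level at which the model's Pc matches the measured human Pc.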
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
NASA Astrophysics Data System (ADS)
Torrisi, L.
2018-02-01
A large-scale study of ion acceleration in laser-generated plasma, extended to intensities from 10^10 W/cm^2 up to 10^19 W/cm^2, is presented. Aluminium thick and thin foils were irradiated in high vacuum using different infrared lasers and pulse durations from the ns to the fs scale. The plasma was monitored mainly using SiC detectors employed in time-of-flight configuration. Protons and aluminium ions, at different energies and yields, were measured as a function of the laser intensity. The discontinuity region between particle acceleration from the backward plasma (BPA) in thick targets and from the forward plasma in thin foils under the target normal sheath acceleration (TNSA) regime was investigated.
Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey
ERIC Educational Resources Information Center
Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin
2015-01-01
"Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…
Hart, Thomas; Dider, Shihab; Han, Weiwei; Xu, Hua; Zhao, Zhongming; Xie, Lei
2016-01-01
Metformin, a drug prescribed to treat type-2 diabetes, exhibits anti-cancer effects in a portion of patients, but the direct molecular and genetic interactions leading to this pleiotropic effect have not yet been fully explored. To repurpose metformin as a precision anti-cancer therapy, we have developed a novel structural systems pharmacology approach to elucidate metformin's molecular basis and genetic biomarkers of action. We integrated structural proteome-scale drug target identification with network biology analysis by combining structural genomic, functional genomic, and interactomic data. Through searching the human structural proteome, we identified twenty putative metformin binding targets and their interaction models. We experimentally verified the interactions between metformin and our top-ranked kinase targets. Notably, kinases, particularly SGK1 and EGFR, were identified as key molecular targets of metformin. Subsequently, we linked these putative binding targets to genes that do not directly bind to metformin but whose expressions are altered by metformin through protein-protein interactions, and identified network biomarkers of the phenotypic response to metformin. The molecular targets and the key nodes in genetic networks are largely consistent with the existing experimental evidence. Their interactions can be affected by the observed cancer mutations. This study will shed new light on repurposing metformin for safe, effective, personalized therapies. PMID:26841718
Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin
2016-04-19
Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay on the Q-Exactive was compared with that of MS1-based quantification, as well as with an MRM assay on a QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification and also showed greater flexibility in postacquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software, which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.
Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro
2013-05-21
We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. This strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but often subjective and ad hoc. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data from any instrument or any experimental condition.
2016-09-26
Intelligent Automation Incorporated: Enhancements for a Dynamic Data Warehousing and Mining System for Large-Scale Human Social Cultural Behavioral (HSCB) Data (contract N00014-16-P-3014). Representative Media Gallery View: we apply Scraawl's NER algorithm to the text associated with YouTube posts, which classifies the named entities into
Arango, Daniel; Morohashi, Kengo; Yilmaz, Alper; Kuramochi, Kouji; Parihar, Arti; Brahimaj, Bledi; Grotewold, Erich; Doseff, Andrea I.
2013-01-01
Flavonoids constitute the largest class of dietary phytochemicals, adding essential health value to our diet, and are emerging as key nutraceuticals. Cellular targets for dietary phytochemicals remain largely unknown, posing significant challenges for the regulation of dietary supplements and the understanding of how nutraceuticals provide health value. Here, we describe the identification of human cellular targets of apigenin, a flavonoid abundantly present in fruits and vegetables, using an innovative high-throughput approach that combines phage display with second generation sequencing. The 160 identified high-confidence candidate apigenin targets are significantly enriched in three main functional categories: GTPase activation, membrane transport, and mRNA metabolism/alternative splicing. This last category includes the heterogeneous nuclear ribonucleoprotein A2 (hnRNPA2), a factor involved in splicing regulation, mRNA stability, and mRNA transport. Apigenin binds to the C-terminal glycine-rich domain of hnRNPA2, preventing hnRNPA2 from forming homodimers, and therefore, it perturbs the alternative splicing of several human hnRNPA2 targets. Our results provide a framework to understand how dietary phytochemicals exert their actions by binding to many functionally diverse cellular targets. In turn, some of them may modulate the activity of a large number of downstream genes, which is exemplified here by the effects of apigenin on the alternative splicing activity of hnRNPA2. Hence, in contrast to small-molecule pharmaceuticals designed for defined target specificity, dietary phytochemicals affect a large number of cellular targets with varied affinities that, combined, result in their recognized health benefits. PMID:23697369
Kasam, Vinod; Salzemann, Jean; Botha, Marli; Dacosta, Ana; Degliesposti, Gianluca; Isea, Raul; Kim, Doman; Maass, Astrid; Kenyon, Colin; Rastelli, Giulio; Hofmann-Apitius, Martin; Breton, Vincent
2009-05-01
Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years, and the discovery of new drugs is more than ever needed. Out of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Recent years have witnessed the emergence of grids, which are highly distributed computing infrastructures particularly well suited to embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and resulted in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006, focusing on one well-known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. In silico drug design, especially virtual high-throughput screening (vHTS), is a widely accepted technology for lead identification and lead optimization. This approach therefore builds upon progress made in computational chemistry, to achieve more accurate in silico docking, and in information technology, to design and operate large-scale grid infrastructures. On the computational side, a sustained infrastructure has been developed: large-scale docking, different strategies for result analysis, on-the-fly storage of results in MySQL databases, and molecular dynamics refinement with MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising, and in vitro assays are underway for all targets against which screening was performed.
The current paper describes this rational drug discovery activity at large scale, specifically molecular docking with the FlexX software on computational grids, to find hits against three different targets (PfGST, PfDHFR, and PvDHFR, in wild-type and mutant forms) implicated in malaria. A grid-enabled virtual screening approach is proposed to produce focused compound libraries for other biological targets relevant to fighting the infectious diseases of the developing world.
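The embarrassingly parallel structure of such a screen is easy to sketch in miniature: the compound library is split across workers, each scores its share independently, and the pooled scores are ranked. The scoring function below is a deterministic stand-in, not FlexX, the ligand identifiers are hypothetical, and a real deployment would dispatch chunks to grid nodes rather than local threads:

```python
from concurrent.futures import ThreadPoolExecutor

def dock_score(ligand_id):
    """Stand-in for one docking run (e.g. a FlexX call); lower = better pose.
    A deterministic toy score so the sketch is reproducible."""
    return sum(map(ord, ligand_id)) % 100 / 10.0

def virtual_screen(library, top_n=3, workers=4):
    """Score every ligand in parallel, then return the top_n best-scoring hits."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(dock_score, library))
    return sorted(zip(library, scores), key=lambda pair: pair[1])[:top_n]

# Hypothetical compound identifiers standing in for a real library
hits = virtual_screen([f"CMPD{i:04d}" for i in range(100)], top_n=2)
print(hits)
```

On a grid, each worker would be a remote job over a library chunk, with results written to a database (the paper stores them in MySQL) before the ranking step.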
Gu, Xun; Wang, Yufeng; Gu, Jianying
2002-06-01
The classical (two-round) hypothesis of vertebrate genome duplication proposes two successive whole-genome duplication(s) (polyploidizations) predating the origin of fishes, a view now being seriously challenged. As the debate largely concerns the relative merits of the 'big-bang mode' theory (large-scale duplication) and the 'continuous mode' theory (constant creation by small-scale duplications), we tested whether a significant proportion of paralogous genes in the contemporary human genome was indeed generated in the early stage of vertebrate evolution. After an extensive search of major databases, we dated 1,739 gene duplication events from the phylogenetic analysis of 749 vertebrate gene families. We found a pattern characterized by two waves (I, II) and an ancient component. Wave I represents a recent gene family expansion by tandem or segmental duplications, whereas wave II, a rapid paralogous gene increase in the early stage of vertebrate evolution, supports the idea of genome duplication(s) (the big-bang mode). Further analysis indicated that large- and small-scale gene duplications both make a significant contribution during the early stage of vertebrate evolution to build the current hierarchy of the human proteome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo
Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.
Enhanced clinical-scale manufacturing of TCR transduced T-cells using closed culture system modules.
Jin, Jianjian; Gkitsas, Nikolaos; Fellowes, Vicki S; Ren, Jiaqiang; Feldman, Steven A; Hinrichs, Christian S; Stroncek, David F; Highfill, Steven L
2018-01-24
Genetic engineering of T-cells to express specific T cell receptors (TCR) has emerged as a novel strategy to treat various malignancies. More widespread utilization of these types of therapies has been somewhat constrained by the lack of closed culture processes capable of expanding sufficient numbers of T-cells for clinical application. Here, we evaluate a process for robust clinical-grade manufacturing of TCR gene-engineered T-cells. TCRs that target human papillomavirus E6 and E7 were independently tested. A 21-day process was divided into a transduction phase (7 days) and a rapid expansion phase (14 days). This process was evaluated using two healthy donor samples and four samples obtained from patients with epithelial cancers. The process resulted in an ~2000-fold increase in viable nucleated cells and high transduction efficiencies (64-92%). At the end of culture, functional assays demonstrated that these cells were potent and specific in their ability to kill tumor cells bearing the target antigen and to secrete large quantities of interferon and tumor necrosis factor. Both phases of culture were contained within closed or semi-closed modules, which include automated density gradient separation and cell culture bags for the first phase and closed GREX culture devices and wash/concentrate systems for the second phase. Large-scale manufacturing using modular systems and semi-automated devices resulted in highly functional clinical-grade TCR transduced T-cells. This process is now in use in actively accruing clinical trials at the NIH Clinical Center and can be utilized at other cell therapy manufacturing sites that wish to scale up and optimize their processing using closed systems.
Large-scale protein/antibody patterning with limiting unspecific adsorption
NASA Astrophysics Data System (ADS)
Fedorenko, Viktoriia; Bechelany, Mikhael; Janot, Jean-Marc; Smyntyna, Valentyn; Balme, Sebastien
2017-10-01
A simple synthetic route based on nanosphere lithography has been developed in order to design a large-scale nanoarray for specific control of protein anchoring. This technique, based on two-dimensional (2D) colloidal crystals composed of polystyrene (PS) spheres, allows the easy and inexpensive fabrication of large arrays (up to several centimeters). A silicon wafer coated with a thin adhesion layer of chromium (15 nm) and a layer of gold (50 nm) is used as a substrate. PS spheres are deposited on the gold surface using the floating-transferring technique. The PS spheres were then functionalized with PEG-biotin, and the defects were covered with a self-assembled monolayer (SAM) of PEG to prevent unspecific adsorption. Using epifluorescence microscopy, we show that after immersion of the sample in a solution of the target protein (avidin or anti-avidin), the protein is located specifically on the polystyrene spheres. These results are thus relevant to devices based on large-scale PS-sphere nanoarrays and can be used for the detection of target proteins or simply to pattern a surface with specific proteins.
Stochastic characterization of small-scale algorithms for human sensory processing
NASA Astrophysics Data System (ADS)
Neri, Peter
2010-12-01
Human sensory processing can be viewed as a functional H mapping a stimulus vector s into a decisional variable r. We currently have no direct access to r; rather, the human makes a decision based on r in order to drive subsequent behavior. It is this (typically binary) decision that we can measure. For example, there may be two external stimuli s[0] and s[1], mapped onto r[0] and r[1] by the sensory apparatus H; the human chooses the stimulus associated with the largest r. This kind of decisional transduction poses a major challenge for an accurate characterization of H. In this article, we explore a specific approach based on a behavioral variant of reverse correlation techniques, where the input s contains a target signal corrupted by a controlled noisy perturbation. The presence of the target signal poses an additional challenge because it distorts the otherwise unbiased nature of the noise source. We consider issues arising from both the decisional transducer and the target signal, their impact on system identification, and ways to handle them effectively for system characterizations that extend to second-order functional approximations with associated small-scale cascade models.
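A minimal simulation of the behavioral reverse-correlation idea (a hidden linear observer H, a weak target signal corrupting the noise, and only the binary decision observable) shows that averaging the noise fields by response recovers a template proportional to H. All parameters below are illustrative, and the simple yes/no criterion stands in for the richer decision rules discussed in the article:

```python
import numpy as np

def classification_image(n_trials=5000, n_dim=64, seed=1):
    """Estimate a linear observer's template from binary decisions alone."""
    rng = np.random.default_rng(seed)
    template = np.sin(np.linspace(0, np.pi, n_dim))  # the observer's hidden functional H
    target = 0.3 * template                          # weak target signal in every stimulus
    criterion = template @ target / 2                # decision criterion on the response r
    kernel = np.zeros(n_dim)
    for _ in range(n_trials):
        noise = rng.normal(0.0, 1.0, n_dim)          # controlled noisy perturbation
        r = template @ (target + noise)              # decisional variable (unobservable)
        kernel += noise if r > criterion else -noise # only the binary decision is used
    return kernel / n_trials, template

kernel, template = classification_image()
```

The recovered `kernel` correlates strongly with the hidden `template`; the distortion the abstract warns about shows up here as a response-rate asymmetry induced by the target term.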
High-throughput discovery of rare human nucleotide polymorphisms by Ecotilling
Till, Bradley J.; Zerr, Troy; Bowers, Elisabeth; Greene, Elizabeth A.; Comai, Luca; Henikoff, Steven
2006-01-01
Human individuals differ from one another at only ∼0.1% of nucleotide positions, but these single nucleotide differences account for most heritable phenotypic variation. Large-scale efforts to discover and genotype human variation have been limited to common polymorphisms. However, these efforts overlook rare nucleotide changes that may contribute to phenotypic diversity and genetic disorders, including cancer. Thus, there is an increasing need for high-throughput methods to robustly detect rare nucleotide differences. Toward this end, we have adapted the mismatch discovery method known as Ecotilling for the discovery of human single nucleotide polymorphisms. To increase throughput and reduce costs, we developed a universal primer strategy and implemented algorithms for automated band detection. Ecotilling was validated by screening 90 human DNA samples for nucleotide changes in 5 gene targets and by comparing results to public resequencing data. To increase throughput for discovery of rare alleles, we pooled samples 8-fold and found Ecotilling to be efficient relative to resequencing, with a false negative rate of 5% and a false discovery rate of 4%. We identified 28 new rare alleles, including some that are predicted to damage protein function. The detection of rare damaging mutations has implications for models of human disease. PMID:16893952
Sanderson, E.W.; Redford, Kent; Weber, Bill; Aune, K.; Baldes, Dick; Berger, J.; Carter, Dave; Curtin, C.; Derr, James N.; Dobrott, S.J.; Fearn, Eva; Fleener, Craig; Forrest, Steven C.; Gerlach, Craig; Gates, C. Cormack; Gross, J.E.; Gogan, P.; Grassel, Shaun M.; Hilty, Jodi A.; Jensen, Marv; Kunkel, Kyran; Lammers, Duane; List, R.; Minkowski, Karen; Olson, Tom; Pague, Chris; Robertson, Paul B.; Stephenson, Bob
2008-01-01
Many wide-ranging mammal species have experienced significant declines over the last 200 years; restoring these species will require long-term, large-scale recovery efforts. We highlight 5 attributes of a recent range-wide vision-setting exercise for ecological recovery of the North American bison (Bison bison) that are broadly applicable to other species and restoration targets. The result of the exercise, the “Vermejo Statement” on bison restoration, is explicitly (1) large scale, (2) long term, (3) inclusive, (4) fulfilling of different values, and (5) ambitious. It reads, in part, “Over the next century, the ecological recovery of the North American bison will occur when multiple large herds move freely across extensive landscapes within all major habitats of their historic range, interacting in ecologically significant ways with the fullest possible set of other native species, and inspiring, sustaining and connecting human cultures.” We refined the vision into a scorecard that illustrates how individual bison herds can contribute to the vision. We also developed a set of maps and analyzed the current and potential future distributions of bison on the basis of expert assessment. Although more than 500,000 bison exist in North America today, we estimated they occupy <1% of their historical range and in no place express the full range of ecological and social values of previous times. By formulating an inclusive, affirmative, and specific vision through consultation with a wide range of stakeholders, we hope to provide a foundation for conservation of bison, and other wide-ranging species, over the next 100 years.
NASA Astrophysics Data System (ADS)
Realpe-Gómez, John; Andrighetto, Giulia; Nardin, Luis Gustavo; Montoya, Javier Antonio
2018-04-01
Cooperation is central to the success of human societies as it is crucial for overcoming some of the most pressing social challenges of our time; still, how human cooperation is achieved and may persist is a main puzzle in the social and biological sciences. Recently, scholars have recognized the importance of social norms as solutions to major local and large-scale collective action problems, from the management of water resources to the reduction of smoking in public places to the change in fertility practices. Yet a well-founded model of the effect of social norms on human cooperation is still lacking. Using statistical-physics techniques and integrating findings from cognitive and behavioral sciences, we present an analytically tractable model in which individuals base their decisions to cooperate both on the economic rewards they obtain and on the degree to which their action complies with social norms. Results from this parsimonious model are in agreement with observations in recent large-scale experiments with humans. We also find the phase diagram of the model and show that the experimental human group is poised near a critical point, a regime where recent work suggests living systems respond to changing external conditions in an efficient and coordinated manner.
Söderholm, Sandra; Kainov, Denis E; Öhman, Tiina; Denisova, Oxana V; Schepens, Bert; Kulesskiy, Evgeny; Imanishi, Susumu Y; Corthals, Garry; Hintsanen, Petteri; Aittokallio, Tero; Saelens, Xavier; Matikainen, Sampsa; Nyman, Tuula A
2016-10-01
Influenza A viruses cause infections in the human respiratory tract and give rise to annual seasonal outbreaks as well as, more rarely, dreaded pandemics. Influenza A viruses quickly become resistant to the virus-directed antiviral treatments, which are the current main treatment options. A promising alternative approach is to target host cell factors that are exploited by influenza viruses. To this end, we characterized the phosphoproteome of influenza A virus infected primary human macrophages to elucidate the intracellular signaling pathways and critical host factors activated upon influenza infection. We identified 1675 phosphoproteins, 4004 phosphopeptides and 4146 nonredundant phosphosites. The phosphorylation of 1113 proteins (66%) was regulated upon infection, highlighting the importance of such global phosphoproteomic profiling in primary cells. Notably, 285 of the identified phosphorylation sites have not been previously described in publicly available phosphorylation databases, despite many published large-scale phosphoproteome studies using human and mouse cell lines. Systematic bioinformatics analysis of the phosphoproteome data indicated that the phosphorylation of proteins involved in the ubiquitin/proteasome pathway (such as TRIM22 and TRIM25) and antiviral responses (such as MAVS) changed in infected macrophages. Proteins known to play roles in small GTPase-, mitogen-activated protein kinase-, and cyclin-dependent kinase-signaling were also regulated by phosphorylation upon infection. In particular, the influenza infection had a major influence on the phosphorylation profiles of a large number of cyclin-dependent kinase substrates. Functional studies using cyclin-dependent kinase inhibitors showed that cyclin-dependent kinase activity is required for efficient viral replication and for activation of the host antiviral responses. In addition, we show that cyclin-dependent kinase inhibitors protect IAV-infected mice from death. 
In conclusion, we provide the first comprehensive phosphoproteome characterization of influenza A virus infection in primary human macrophages, and provide evidence that cyclin-dependent kinases represent potential therapeutic targets for more effective treatment of influenza infections. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Genome-scale CRISPR-Cas9 knockout screening in human cells.
Shalem, Ophir; Sanjana, Neville E; Hartenian, Ella; Shi, Xi; Scott, David A; Mikkelson, Tarjei; Heckl, Dirk; Ebert, Benjamin L; Root, David E; Doench, John G; Zhang, Feng
2014-01-03
The simplicity of programming the CRISPR (clustered regularly interspaced short palindromic repeats)-associated nuclease Cas9 to modify specific genomic loci suggests a new way to interrogate gene function on a genome-wide scale. We show that lentiviral delivery of a genome-scale CRISPR-Cas9 knockout (GeCKO) library targeting 18,080 genes with 64,751 unique guide sequences enables both negative and positive selection screening in human cells. First, we used the GeCKO library to identify genes essential for cell viability in cancer and pluripotent stem cells. Next, in a melanoma model, we screened for genes whose loss is involved in resistance to vemurafenib, a therapeutic RAF inhibitor. Our highest-ranking candidates include previously validated genes NF1 and MED12, as well as novel hits NF2, CUL3, TADA2B, and TADA1. We observe a high level of consistency between independent guide RNAs targeting the same gene and a high rate of hit confirmation, demonstrating the promise of genome-scale screening with Cas9.
The analysis of HIV/AIDS drug-resistant on networks
NASA Astrophysics Data System (ADS)
Liu, Maoxing
2014-01-01
In this paper, we present a Human Immunodeficiency Virus (HIV)/Acquired Immune Deficiency Syndrome (AIDS) drug-resistance model using an ordinary differential equation (ODE) model on scale-free networks. We derive the epidemic threshold, which is zero on an infinite scale-free network. We also prove the stability of the disease-free equilibrium (DFE) and the persistence of HIV/AIDS infection. The effects of two immunization schemes, a proportional scheme and targeted vaccination, are studied and compared. We find that the targeted strategy compares favorably with proportional condom use and has a prominent effect in controlling the spread of HIV/AIDS on scale-free networks.
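The vanishing threshold on infinite scale-free networks follows from the standard degree-based mean-field result that the epidemic threshold is &lt;k&gt;/&lt;k²&gt;, which goes to zero as the degree cutoff grows for exponents γ ≤ 3. The sketch below illustrates this numerically for an assumed truncated power-law degree distribution; the exponent and cutoffs are illustrative, not taken from the paper.

```python
# Mean-field epidemic threshold on a scale-free network (illustrative sketch).
# For an uncorrelated network the threshold is lambda_c = <k>/<k^2>; as the
# degree cutoff kmax grows (gamma <= 3), <k^2> diverges and the threshold
# vanishes -- the effect the abstract describes for infinite networks.

def threshold(gamma, kmin, kmax):
    ks = range(kmin, kmax + 1)
    weights = [k ** -gamma for k in ks]          # unnormalized P(k) ~ k^-gamma
    z = sum(weights)
    k1 = sum(k * w for k, w in zip(ks, weights)) / z       # <k>
    k2 = sum(k * k * w for k, w in zip(ks, weights)) / z   # <k^2>
    return k1 / k2

for kmax in (10**2, 10**3, 10**4):
    print(kmax, round(threshold(2.5, 3, kmax), 4))   # threshold shrinks with kmax

# Targeted immunization of the highest-degree nodes effectively caps the
# degree distribution, restoring a finite threshold even in large networks:
print(round(threshold(2.5, 3, 20), 4))   # degrees capped at an assumed kc = 20
```

This is why the abstract finds targeted vaccination far more effective than a proportional scheme: removing hubs directly shrinks &lt;k²&gt;.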
2014-01-01
Background Abnormal states in human liver metabolism are major causes of human liver diseases ranging from hepatitis to hepatic tumor. The accumulation of relevant data makes it feasible to derive a large-scale human liver metabolic network (HLMN) and to discover important biological principles or drug targets based on network analysis. Some studies have shown that interesting biological phenomena and drug targets can be discovered by applying structural controllability analysis (a recently popularized concept in network science) to biological networks. Exploring the connections between structural controllability theory and the HLMN can uncover valuable information on human liver metabolism from a fresh perspective. Results We applied structural controllability analysis to the HLMN and detected driver metabolites. The driver metabolites tend to have a strong ability to influence the states of other metabolites and a weak susceptibility to being influenced by the states of others. In addition, the metabolites were classified into three classes: critical, high-frequency and low-frequency driver metabolites. Among the identified 36 critical driver metabolites, 27 metabolites were found to be essential; the high-frequency driver metabolites tend to participate in different metabolic pathways, which are important in regulating the whole metabolic system. Moreover, we explored some other possible connections between structural controllability theory and the HLMN, and found that transport reactions and the environment play important roles in human liver metabolism. Conclusion There are interesting connections between structural controllability theory and human liver metabolism: driver metabolites have essential biological functions; the crucial role of extracellular metabolites and transport reactions in controlling the HLMN highlights the importance of the environment in the health of human liver metabolism. PMID:24885538
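Structural controllability analysis of the kind described above typically rests on the minimum-inputs theorem: the number of driver nodes in a directed network is N_D = max(N − |M*|, 1), where M* is a maximum matching between out-copies and in-copies of the nodes. A minimal sketch on an invented 5-node toy network (not the HLMN):

```python
# Sketch of driver-node detection via the minimum-inputs theorem:
# N_D = max(N - |M*|, 1), with M* a maximum bipartite matching between
# out-copies and in-copies of the directed network's nodes.

def max_matching(n, edges):
    """Maximum bipartite matching (out-copy u -> in-copy v) via augmenting paths."""
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
    match = [-1] * n               # match[v] = out-copy currently matched to in-copy v

    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match[v] == -1 or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return sum(augment(u, set()) for u in range(n))

def n_drivers(n, edges):
    return max(n - max_matching(n, edges), 1)

# Toy network: a path 0->1->2 plus a branch 1->3, 1->4. Node 1 can "drive"
# only one successor through a matching, so extra driver nodes are needed.
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]
print(n_drivers(5, edges))   # -> 3
```

In the metabolic setting, the unmatched nodes correspond to the driver metabolites the study classifies as critical, high-frequency, or low-frequency.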
Liquidation sales: Land speculation and landscape change
NASA Astrophysics Data System (ADS)
Lazarus, E.
2012-12-01
Large-scale land-use transitions can occur with astonishing speed, and landscape stability can change with equal suddenness: for example, the catastrophic dustbowl that paralyzed the Midwestern US in the early 1930s came barely 40 years after the derby for homestead land in Oklahoma in 1889. Some human-landscape systems, like the large prehistoric settlements in the Brazilian Amazon, persisted for centuries without environmental collapse. Others quickly exhausted all of the environmental resources available, as occurred with phosphate mining on the Pacific Island of Nauru. Although abrupt shifts from resource plenty to resource scarcity are theoretically interesting for their complexity, the very real consequences of modern social and environmental boom-bust dynamics can catalyze humanitarian crises. Drawing on historical examples and investigative reporting of current events, I explore the hypothesis that land speculation drives rapid transitions in physical landscapes at large spatial scales. "Land grabs" is one of four core environmental justice and equality issues Oxfam International is targeting in its GROW campaign, citing evidence that foreign investors are buying up vast tracts of land in developing countries, and as a consequence exacerbating food scarcity and marginalization of poor families. Al Jazeera has reported extensively on land-rights disputes in Honduras and investment deals involving foreign ownership of large areas of agricultural land in New Zealand, India, Africa, and South America. Overlapping coverage has also appeared in the New York Times, the Washington Post, the BBC News, the Guardian, and other outlets. Although land itself is only one kind of natural resource, land rights typically determine access to other natural resources (e.g. water, timber, minerals, fossil fuels). Consideration of land speculation therefore includes speculative bubbles in natural-resource markets more broadly. 
There are categorical commonalities in agricultural change and deforestation around the world. Although the details differ at local scales, even disparate cases of land use and landscape changes may express similar patterns and structures. Records of sediment flux in salt marshes and fluvial deposits indicate rates of past landscape responses to human activities; the 1930s dustbowl event left a sedimentary signature in western North American lakes. Petrochemicals and fertilizers from agricultural runoff are causing hypoxic dead zones in coastal waters to expand. In the Brazilian Amazon, regional-scale changes in weather and climate have been linked to deforestation, and deforestation has been linked to patterns of boom-bust development. But even when rampant land acquisition for agriculture or housing has been identified as problematic, the attendant environmental consequences are not necessarily obvious. The nonlinear attenuation of cause and effect is a function of the hierarchy of scales that typify these complex human-landscape systems: the emergence of long-term, large-scale environmental dynamics lags behind the short-term, localized dynamics of a resource bubble. Insight into how these coupled systems behave may reveal the scales at which government, institutional, or self-organized social intervention may be most effective, and presents an opportunity to integrate evolving spheres of research from the behavioural sciences and Earth-surface processes.
Historical legacies, information and contemporary water science and management
Bain, Daniel J.; Arrigo, Jennifer A.S.; Green, Mark B.; Pellerin, Brian A.; Vörösmarty, Charles J.
2011-01-01
Hydrologic science has largely built its understanding of the hydrologic cycle using contemporary data sources (i.e., last 100 years). However, as we try to meet water demand over the next 100 years at scales from local to global, we need to expand our scope and embrace other data that address human activities and the alteration of hydrologic systems. For example, the accumulation of human impacts on water systems requires exploration of incompletely documented eras. When examining these historical periods, basic questions relevant to modern systems arise: (1) How is better information incorporated into water management strategies? (2) Does any point in the past (e.g., colonial/pre-European conditions in North America) provide a suitable restoration target? and (3) How can understanding legacies improve our ability to plan for future conditions? Beginning to answer these questions indicates the vital need to incorporate disparate data and less accepted methods to meet looming water management challenges.
Electronics Shielding and Reliability Design Tools
NASA Technical Reports Server (NTRS)
Wilson, John W.; O'Neill, P. M.; Zang, Thomas A., Jr.; Pandolf, John E.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.
2006-01-01
It is well known that electronics placement in large-scale human-rated systems provides opportunity to optimize electronics shielding through materials choice and geometric arrangement. For example, several hundred single event upsets (SEUs) occur within the Shuttle avionic computers during a typical mission. An order of magnitude larger SEU rate would occur without careful placement in the Shuttle design. These results used basic physics models (linear energy transfer (LET), track structure, Auger recombination) combined with limited SEU cross section measurements allowing accurate evaluation of target fragment contributions to Shuttle avionics memory upsets. Electronics shielding design on human-rated systems provides opportunity to minimize radiation impact on critical and non-critical electronic systems. Implementation of shielding design tools requires adequate methods for evaluation of design layouts, guiding qualification testing, and an adequate follow-up on final design evaluation including results from a systems/device testing program tailored to meet design requirements.
Effects of Pre-Existing Target Structure on the Formation of Large Craters
NASA Technical Reports Server (NTRS)
Barnouin-Jha, O. S.; Cintala, M. J.; Crawford, D. A.
2003-01-01
The shapes of large-scale craters and the mechanics responsible for melt generation are influenced by broad and small-scale structures present in a target prior to impact. For example, well-developed systems of fractures often create craters that appear square in outline, good examples being Meteor Crater, AZ and the square craters of 433 Eros. Pre-broken target material also affects melt generation. Kieffer has shown how the shock wave generated in Coconino sandstone at Meteor crater created reverberations which, in combination with the natural target heterogeneity present, created peaks and troughs in pressure and compressed density as individual grains collided to produce a range of shock mineralogies and melts within neighboring samples. In this study, we further explore how pre-existing target structure influences various aspects of the cratering process. We combine experimental and numerical techniques to explore the connection between the scales of the impact generated shock wave and the pre-existing target structure. We focus on the propagation of shock waves in coarse, granular media, emphasizing its consequences on excavation, crater growth, ejecta production, cratering efficiency, melt generation, and crater shape. As a baseline, we present a first series of results for idealized targets where the particles are all identical in size and possess the same shock impedance. We will also present a few results, whereby we increase the complexities of the target properties by varying the grain size, strength, impedance and frictional properties. In addition, we investigate the origin and implications of reverberations that are created by the presence of physical and chemical heterogeneity in a target.
The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey
NASA Astrophysics Data System (ADS)
Squires, Gordon K.; Lubin, L. M.; Gal, R. R.
2007-05-01
We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to successfully identify the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.
Pizzitutti, Francesco; Pan, William; Feingold, Beth; Zaitchik, Ben; Álvarez, Carlos A; Mena, Carlos F
2018-01-01
Though malaria control initiatives have markedly reduced malaria prevalence in recent decades, global eradication is far from actuality. Recent studies show that environmental and social heterogeneities in low-transmission settings have an increased weight in shaping malaria micro-epidemiology. New integrated and more localized control strategies should be developed and tested. Here we present a set of agent-based models designed to study the influence of local-scale human movements on local-scale malaria transmission in a typical Amazon environment, where malaria transmission is low and strongly connected with seasonal riverine flooding. The agent-based simulations show that the overall malaria incidence is essentially not influenced by local-scale human movements. In contrast, the locations of malaria high-risk spatial hotspots depend heavily on human movements, because simulated malaria hotspots are mainly centered on farms, where laborers work during the day. The agent-based models are then used to test the effectiveness of two different malaria control strategies, both designed to reduce local-scale malaria incidence by targeting hotspots. In the first control scenario, people who enter at least once during the simulation into hotspots identified from the actual sites where individuals were infected are treated against mosquito bites. The second scenario treats people entering hotspots computed under the assumption that each infected individual's infection site is the household where that individual lives. Simulations show that both scenarios perform better in controlling malaria than a randomized treatment, although targeting household hotspots shows slightly better performance.
Fitting a Point Cloud to a 3d Polyhedral Surface
NASA Astrophysics Data System (ADS)
Popov, E. V.; Rotkov, S. I.
2017-05-01
The ability to measure parameters of large-scale objects in a contactless fashion has tremendous potential in a number of industrial applications. However, this problem usually involves the ambiguous task of comparing two data sets specified in two different coordinate systems. This paper studies the fitting of a set of unorganized points to a polyhedral surface. The developed approach uses Principal Component Analysis (PCA) and the Stretched grid method (SGM) to substitute several linear steps for a non-linear problem solution. The squared distance (SD) is the general criterion used to control the convergence of a set of points to a target surface. The described numerical experiment concerns the remote measurement of a large-scale aerial in the form of a frame with a parabolic shape. The experiment shows that the fitting process of a point cloud to a target surface converges in several linear steps. The method is applicable to the contactless remote measurement of the geometry of large-scale objects.
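The PCA step in such a pipeline rotates the measured cloud into its principal-axis frame so that the two data sets share an initial coordinate system before the iterative fitting begins. Below is a minimal 2-D sketch of that pre-alignment (the paper works in 3-D, and the sample points are invented):

```python
# Sketch of PCA pre-alignment of a point cloud, in 2-D for brevity:
# center the points, find the principal axis from the covariance matrix,
# and rotate so that axis lies along x.
import math

def pca_align(points):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    cxx = sum(x * x for x, _ in centered) / n      # covariance entries
    cyy = sum(y * y for _, y in centered) / n
    cxy = sum(x * y for x, y in centered) / n
    theta = 0.5 * math.atan2(2 * cxy, cxx - cyy)   # principal-axis angle
    c, s = math.cos(theta), math.sin(theta)
    # Rotate by -theta so the dominant variance direction maps onto x.
    return [(c * x + s * y, -s * x + c * y) for x, y in centered]

# Points along the line y = x: after alignment they lie along the x-axis.
aligned = pca_align([(i, i) for i in range(5)])
print(all(abs(y) < 1e-9 for _, y in aligned))   # -> True
```

After this linear step, a distance-minimizing method such as SGM can refine the fit against the squared-distance criterion.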
Lee, Mikyung; Huang, Ruili; Tong, Weida
2016-01-01
Nuclear receptors (NRs) are ligand-activated transcriptional regulators that play vital roles in key biological processes such as growth, differentiation, metabolism, reproduction, and morphogenesis. Disruption of NRs can result in adverse health effects such as NR-mediated endocrine disruption. A comprehensive understanding of the core transcriptional targets regulated by NRs helps to elucidate their key biological processes in both toxicological and therapeutic aspects. In this study, we applied a probabilistic graphical model to identify the transcriptional targets of NRs and the biological processes they govern. The Tox21 program profiled a collection of approximately 10,000 environmental chemicals and drugs against a panel of human NRs in a quantitative high-throughput screening format for their NR disruption potential. The Japanese Toxicogenomics Project, one of the most comprehensive efforts in the field of toxicogenomics, generated large-scale gene expression profiles on the effects of 131 compounds (in its first phase of study) at various doses and durations, and their combinations. We applied the author-topic model to these 2 toxicological datasets, which consist of 11 NRs run in either agonist and/or antagonist mode (18 assays total) and 203 in vitro human gene expression profiles connected by 52 shared drugs. As a result, a set of clusters (topics), each consisting of a set of NRs and their associated target genes, was determined. Various transcriptional targets of the NRs were identified by assays run in either agonist or antagonist mode. Our results were validated by functional analysis and compared with TRANSFAC data. In summary, our approach resulted in effective identification of associated/affected NRs and their target genes, providing biologically meaningful hypotheses embedded in their relationships. PMID:26643261
Allometry indicates giant eyes of giant squid are not exceptional.
Schmitz, Lars; Motani, Ryosuke; Oufiero, Christopher E; Martin, Christopher H; McGee, Matthew D; Gamarra, Ashlee R; Lee, Johanna J; Wainwright, Peter C
2013-02-18
The eyes of giant and colossal squid are among the largest eyes in the history of life. It was recently proposed that sperm whale predation is the main driver of eye size evolution in giant squid, on the basis of an optical model that suggested optimal performance in detecting large luminous visual targets such as whales in the deep sea. However, it is poorly understood how the eye size of giant and colossal squid compares to that of other aquatic organisms when scaling effects are considered. We performed a large-scale comparative study that included 87 squid species and 237 species of acanthomorph fish. While squid have larger eyes than most acanthomorphs, a comparison of relative eye size among squid suggests that giant and colossal squid do not have unusually large eyes. After revising constants used in a previous model we found that large eyes perform equally well in detecting point targets and large luminous targets in the deep sea. The eyes of giant and colossal squid do not appear exceptionally large when allometric effects are considered. It is probable that the giant eyes of giant squid result from a phylogenetically conserved developmental pattern manifested in very large animals. Whatever the cause of large eyes, they appear to have several advantages for vision in the reduced light of the deep mesopelagic zone.
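Allometric comparisons of this kind reduce to fitting log(eye size) = a + b·log(body size) and asking whether a species falls unusually far from the fitted line. A minimal ordinary-least-squares sketch on invented data, where eye size scales exactly as body_length**0.66 (the exponent is an assumption for illustration, not a value from the study):

```python
# Sketch of estimating an allometric scaling exponent: the slope of an
# OLS fit in log-log space recovers the exponent of a power law y = c*x^b.
import math

def loglog_slope(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

body = [10, 50, 100, 500, 2000]       # hypothetical body lengths
eye = [b ** 0.66 for b in body]       # exact power law with exponent 0.66
print(round(loglog_slope(body, eye), 3))   # -> 0.66
```

A species' residual from such a fit, rather than its absolute eye size, is what tells you whether its eyes are "exceptional" once scaling is accounted for.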
Extending SME to Handle Large-Scale Cognitive Modeling.
Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre
2017-07-01
Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n^2 log(n)); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.
Diebel, M.W.; Maxted, J.T.; Robertson, Dale M.; Han, S.; Vander Zanden, M. J.
2009-01-01
Riparian buffers have the potential to improve stream water quality in agricultural landscapes. This potential may vary in response to landscape characteristics such as soils, topography, land use, and human activities, including legacies of historical land management. We built a predictive model to estimate the sediment and phosphorus load reduction that should be achievable following the implementation of riparian buffers; then we estimated load reduction potential for a set of 1598 watersheds (average 54 km²) in Wisconsin. Our results indicate that land cover is generally the most important driver of constituent loads in Wisconsin streams, but its influence varies among pollutants and according to the scale at which it is measured. Physiographic (drainage density) variation also influenced sediment and phosphorus loads. The effect of historical land use on present-day channel erosion and variation in soil texture are the most important sources of phosphorus and sediment that riparian buffers cannot attenuate. However, in most watersheds, a large proportion (approximately 70%) of these pollutants can be eliminated from streams with buffers. Cumulative frequency distributions of load reduction potential indicate that targeting pollution reduction in the highest 10% of Wisconsin watersheds would reduce total phosphorus and sediment loads in the entire state by approximately 20%. These results support our approach of geographically targeting nonpoint source pollution reduction at multiple scales, including the watershed scale. © 2008 Springer Science+Business Media, LLC.
Antibody Engineering for Pursuing a Healthier Future
Saeed, Abdullah F. U. H.; Wang, Rongzhi; Ling, Sumei; Wang, Shihua
2017-01-01
Since the development of antibody-production techniques, a number of immunoglobulins have been developed on a large scale using conventional methods. Hybridoma technology opened a new horizon in the production of antibodies against target antigens of infectious pathogens, malignant diseases including autoimmune disorders, and numerous potent toxins. However, these clinical humanized or chimeric murine antibodies have several limitations and complexities. Therefore, to overcome these difficulties, recent advances in genetic engineering techniques and phage display technique have allowed the production of highly specific recombinant antibodies. These engineered antibodies have been constructed in the hunt for novel therapeutic drugs equipped with enhanced immunoprotective abilities, such as engaging immune effector functions, effective development of fusion proteins, efficient tumor and tissue penetration, and high-affinity antibodies directed against conserved targets. Advanced antibody engineering techniques have extensive applications in the fields of immunology, biotechnology, diagnostics, and therapeutic medicines. However, there is limited knowledge regarding dynamic antibody development approaches. Therefore, this review extends beyond our understanding of conventional polyclonal and monoclonal antibodies. Furthermore, recent advances in antibody engineering techniques together with antibody fragments, display technologies, immunomodulation, and broad applications of antibodies are discussed to enhance innovative antibody production in pursuit of a healthier future for humans. PMID:28400756
Neural Mechanisms Behind Identification of Leptokurtic Noise and Adaptive Behavioral Response
d'Acremont, Mathieu; Bossaerts, Peter
2016-01-01
Large-scale human interaction through, for example, financial markets causes ceaseless random changes in outcome variability, producing frequent and salient outliers that render the outcome distribution more peaked than the Gaussian distribution, and with longer tails. Here, we study how humans cope with this evolutionarily novel leptokurtic noise, focusing on the neurobiological mechanisms that allow the brain (1) to recognize the outliers as noise and (2) to regulate the control necessary for an adaptive response. We used functional magnetic resonance imaging while participants tracked a target whose movements were affected by leptokurtic noise. After initial overreaction and insufficient subsequent correction, participants improved performance significantly. Yet, persistently long reaction times pointed to a continued need for vigilance and control. We ran a contrasting treatment where outliers reflected permanent moves of the target, as in traditional mean-shift paradigms. Importantly, outliers were equally frequent and salient. There, control was superior and reaction time was faster. We present a novel reinforcement learning model that fits observed choices better than the Bayes-optimal model. Only the anterior insula discriminated between the 2 types of outliers. In both treatments, outliers initially activated an extensive bottom-up attention and belief network, followed by sustained engagement of the fronto-parietal control network. PMID:26850528
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toral-Barza, Lourdes; Zhang Weiguo; Lamison, Craig
The mammalian target of rapamycin (mTOR/TOR) is implicated in cancer and other human disorders and is thus an important target for therapeutic intervention. To study human TOR in vitro, we have produced in large scale both the full-length TOR (289 kDa) and a truncated TOR (132 kDa) from HEK293 cells. Both enzymes demonstrated robust and specific catalytic activity towards the physiological substrate proteins, p70 S6 ribosomal protein kinase 1 (p70S6K1) and eIF4E binding protein 1 (4EBP1), as measured by phospho-specific antibodies in Western blotting. We developed a high-capacity dissociation-enhanced lanthanide fluorescence immunoassay (DELFIA) for analysis of kinetic parameters. The Michaelis constant (Km) values of TOR for ATP and the His6-S6K substrate were shown to be 50 and 0.8 μM, respectively. Dose-response and inhibition mechanisms of several known inhibitors, the rapamycin-FKBP12 complex, wortmannin, and LY294002, were also studied in DELFIA. Our data indicate that TOR exhibits kinetic features shared by traditional serine/threonine kinases and demonstrate the feasibility of a TOR enzyme screen in searching for new inhibitors.
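The Michaelis constants reported above (50 μM for ATP, 0.8 μM for the His6-S6K substrate) plug directly into the standard Michaelis-Menten rate law. A minimal sketch, assuming an arbitrary Vmax of 1.0 since the abstract does not report one:

```python
def michaelis_menten(s, km, vmax=1.0):
    """Reaction velocity v = Vmax * [S] / (Km + [S]).

    km and s in the same concentration units (here uM); vmax is an
    assumed placeholder, not a value from the study.
    """
    return vmax * s / (km + s)

# By definition, velocity is half of Vmax when [S] equals Km:
v_atp = michaelis_menten(50.0, km=50.0)   # [ATP] = Km(ATP) = 50 uM
print(v_atp)  # 0.5
```

The same relation explains why the enzyme is far more easily saturated by the His6-S6K substrate (Km = 0.8 μM) than by ATP at equal concentrations.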
NASA Astrophysics Data System (ADS)
Giardiello, Marco; Liptrott, Neill J.; McDonald, Tom O.; Moss, Darren; Siccardi, Marco; Martin, Phil; Smith, Darren; Gurjar, Rohan; Rannard, Steve P.; Owen, Andrew
2016-10-01
Considerable scope exists to vary the physical and chemical properties of nanoparticles, with subsequent impact on biological interactions; however, no accelerated process to access large nanoparticle material space is currently available, hampering the development of new nanomedicines. In particular, no clinically available nanotherapies exist for HIV populations and conventional paediatric HIV medicines are poorly available; one current paediatric formulation utilizes high ethanol concentrations to solubilize lopinavir, a poorly soluble antiretroviral. Here we apply accelerated nanomedicine discovery to generate a potential aqueous paediatric HIV nanotherapy, with clinical translation and regulatory approval for human evaluation. Our rapid small-scale screening approach yields large libraries of solid drug nanoparticles (160 individual components) targeting oral dose. Screening uses 1 mg of drug compound per library member and iterative pharmacological and chemical evaluation establishes potential candidates for progression through to clinical manufacture. The wide applicability of our strategy has implications for multiple therapy development programmes.
Functional Genomic Landscape of Human Breast Cancer Drivers, Vulnerabilities, and Resistance
Marcotte, Richard; Sayad, Azin; Brown, Kevin R.; Sanchez-Garcia, Felix; Reimand, Jüri; Haider, Maliha; Virtanen, Carl; Bradner, James E.; Bader, Gary D.; Mills, Gordon B.; Pe’er, Dana; Moffat, Jason; Neel, Benjamin G.
2016-01-01
Large-scale genomic studies have identified multiple somatic aberrations in breast cancer, including copy number alterations and point mutations. Still, identifying causal variants and emergent vulnerabilities that arise as a consequence of genetic alterations remains a major challenge. We performed whole-genome shRNA "dropout screens" on 77 breast cancer cell lines. Using a hierarchical linear regression algorithm to score our screen results and integrate them with accompanying detailed genetic and proteomic information, we identify vulnerabilities in breast cancer, including candidate "drivers," and reveal general functional genomic properties of cancer cells. Comparisons of gene essentiality with drug sensitivity data suggest potential resistance mechanisms, effects of existing anti-cancer drugs, and opportunities for combination therapy. Finally, we demonstrate the utility of this large dataset by identifying BRD4 as a potential target in luminal breast cancer, and PIK3CA mutations as a resistance determinant for BET inhibitors. PMID:26771497
de Rocquigny, H; Ficheux, D; Gabus, C; Fournié-Zaluski, M C; Darlix, J L; Roques, B P
1991-10-31
The nucleocapsid protein (NC) of the human immunodeficiency virus type 1 plays a crucial role in the formation of infectious viral particles and therefore should be a major target for the development of antiviral agents. This requires an investigation of NC protein structure and of its interactions with both primer tRNA(Lys,3) and genomic RNA. Nucleocapsid protein NCp7, which results from the maturation of NCp15, contains two zinc fingers flanked by sequences rich in basic and proline residues. Here we report the first synthesis of large quantities of NCp7 able to activate HIV-1 RNA dimerization and replication primer tRNA(Lys,3) annealing to the initiation site of reverse transcription. In addition UV spectroscopic analyses performed to characterize the Co2+ binding properties of each zinc finger suggest that the two fingers probably interact in NCp7.
A Computational Approach to Finding Novel Targets for Existing Drugs
Li, Yvonne Y.; An, Jianghong; Jones, Steven J. M.
2011-01-01
Repositioning existing drugs for new therapeutic uses is an efficient approach to drug discovery. We have developed a computational drug repositioning pipeline to perform large-scale molecular docking of small molecule drugs against protein drug targets, in order to map the drug-target interaction space and find novel interactions. Our method emphasizes removing false positive interaction predictions using criteria from known interaction docking, consensus scoring, and specificity. In all, our database contains 252 human protein drug targets that we classify as reliable-for-docking as well as 4621 approved and experimental small molecule drugs from DrugBank. These were cross-docked, then filtered through stringent scoring criteria to select top drug-target interactions. In particular, we used MAPK14 and the kinase inhibitor BIM-8 as examples where our stringent thresholds enriched the predicted drug-target interactions with known interactions up to 20 times compared to standard score thresholds. We validated nilotinib as a potent MAPK14 inhibitor in vitro (IC50 40 nM), suggesting a potential use for this drug in treating inflammatory diseases. The published literature indicated experimental evidence for 31 of the top predicted interactions, highlighting the promising nature of our approach. Novel interactions discovered may lead to the drug being repositioned as a therapeutic treatment for its off-target's associated disease, added insight into the drug's mechanism of action, and added insight into the drug's side effects. PMID:21909252
Horlbeck, Max A; Gilbert, Luke A; Villalta, Jacqueline E; Adamson, Britt; Pak, Ryan A; Chen, Yuwen; Fields, Alexander P; Park, Chong Yon; Corn, Jacob E; Kampmann, Martin; Weissman, Jonathan S
2016-01-01
We recently found that nucleosomes directly block access of CRISPR/Cas9 to DNA (Horlbeck et al., 2016). Here, we build on this observation with a comprehensive algorithm that incorporates chromatin, position, and sequence features to accurately predict highly effective single guide RNAs (sgRNAs) for targeting nuclease-dead Cas9-mediated transcriptional repression (CRISPRi) and activation (CRISPRa). We use this algorithm to design next-generation genome-scale CRISPRi and CRISPRa libraries targeting human and mouse genomes. A CRISPRi screen for essential genes in K562 cells demonstrates that the large majority of sgRNAs are highly active. We also find CRISPRi does not exhibit any detectable non-specific toxicity recently observed with CRISPR nuclease approaches. Precision-recall analysis shows that we detect over 90% of essential genes with minimal false positives using a compact 5 sgRNA/gene library. Our results establish CRISPRi and CRISPRa as premier tools for loss- or gain-of-function studies and provide a general strategy for identifying Cas9 target sites. DOI: http://dx.doi.org/10.7554/eLife.19760.001 PMID:27661255
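The precision-recall claim above (over 90% of essential genes detected with minimal false positives) reduces to two standard ratios over the screen's gene calls. A minimal illustration on made-up gene sets, not the K562 screen data:

```python
def precision_recall(predicted, truth):
    """Precision = TP / |predicted|; recall = TP / |truth|."""
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

essential = {"RPL3", "POLR2A", "PCNA", "MYC"}   # hypothetical true essentials
called = {"RPL3", "POLR2A", "PCNA", "BRD4"}     # hypothetical screen hits
p, r = precision_recall(called, essential)
print(p, r)  # 0.75 0.75
```

Sweeping the screen's score threshold and recomputing these two numbers at each cut traces out the precision-recall curve the abstract refers to.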
USDA-ARS?s Scientific Manuscript database
Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air-quality in such small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, ...
Cross-indexing of binary SIFT codes for large-scale image search.
Liu, Zhen; Li, Houqiang; Zhang, Liyan; Zhou, Wengang; Tian, Qi
2014-05-01
In recent years, there has been growing interest in mapping visual features into compact binary codes for applications on large-scale image collections. Encoding high-dimensional data as compact binary codes reduces the memory cost for storage. Besides, it benefits the computational efficiency since similarity can be efficiently measured by Hamming distance. In this paper, we propose a novel flexible scale invariant feature transform (SIFT) binarization (FSB) algorithm for large-scale image search. The FSB algorithm explores the magnitude patterns of the SIFT descriptor. It is unsupervised and the generated binary codes are demonstrated to be distance-preserving. Besides, we propose a new searching strategy to find target features based on the cross-indexing in the binary SIFT space and original SIFT space. We evaluate our approach on two publicly released data sets. The experiments on a large-scale partial duplicate image retrieval system demonstrate the effectiveness and efficiency of the proposed algorithm.
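The efficiency argument above rests on the fact that Hamming distance between binary codes is just the popcount of an XOR. A generic sketch of that similarity step, not the paper's FSB algorithm itself:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary feature codes.

    XOR leaves a 1 exactly where the codes disagree; counting those
    ones gives the Hamming distance in a handful of cheap operations.
    """
    return bin(a ^ b).count("1")

code_query = 0b10110100   # toy 8-bit binarized descriptor
code_target = 0b10011100  # candidate code from the index
print(hamming(code_query, code_target))  # 2
```

Real binarized SIFT codes are longer (e.g. 256 bits), but Python integers are arbitrary precision, so the same function applies unchanged.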
River Networks and Human Activities: Global Fractal Analysis Using Nightlight Data
NASA Astrophysics Data System (ADS)
McCurley, K.; Fang, Y.; Ceola, S.; Paik, K.; McGrath, G. S.; Montanari, A.; Rao, P. S.; Jawitz, J. W.
2016-12-01
River networks hold an important historical role in affecting human population distribution. In this study, we link the geomorphological structure of river networks to the pattern of human activities at a global scale. We use nightlights as a valuable proxy for the presence of human settlements and economic activity, and we employ HydroSHEDS as the main data source on river networks. We test the hypotheses that, analogous to Horton's laws, human activities (magnitude of nightlights) also show a scaling relationship with stream order, and that the intensity of human activities decreases as the distance from the basin outlet increases. Our results demonstrate that the distribution of human activities shows a fractal structure, with power-law scaling between human activities and stream order. This relationship is robust among global river basins. Human activities are more concentrated in larger order basins, but show large variation in equivalent order basins, with higher population density emergent in the basins connected with high-order rivers. For all global river basins longer than 400 km, the average intensity of human activities decreases as the distance to the outlets increases, albeit with signatures of large cities at varied distances. The power spectrum of the human width (area) function is found to exhibit power-law scaling, with a scaling exponent that indicates enrichment of low-frequency variation. The universal fractal structure of human activities may reflect an optimum arrangement for humans in river basins to better utilize the water resources, ecological assets, and geographic advantages. The generalized patterns of human activities could be applied to better understand hydrologic and biogeochemical responses in river basins, and to advance catchment management.
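The power-law scaling claimed above is typically tested by fitting a straight line in log-log space; the slope is the scaling exponent. A sketch with made-up nightlight magnitudes, not the HydroSHEDS/nightlight data of the study:

```python
import math

orders = [1, 2, 3, 4, 5]                   # stream orders
lights = [2.0, 7.8, 18.1, 31.5, 49.0]      # hypothetical nightlight magnitudes

# A power law y = c * x^k becomes linear after taking logs:
# log(y) = log(c) + k * log(x), so the least-squares slope is k.
xs = [math.log(o) for o in orders]
ys = [math.log(v) for v in lights]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))
```

For these illustrative values the fitted exponent comes out close to 2, i.e. nightlight magnitude grows roughly as the square of stream order in this toy example.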
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; ...
2016-12-29
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids), and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. Lastly, this SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure.
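The reproducibility figure quoted in this abstract (CV ≤ 3%) is a coefficient of variation: standard deviation divided by mean. A minimal check on made-up replicate peak intensities, purely to show the arithmetic:

```python
import statistics

# Hypothetical replicate peak intensities for one analyte (arbitrary units).
replicates = [1.02e5, 1.00e5, 0.99e5, 1.01e5]

# Coefficient of variation: population standard deviation / mean.
cv = statistics.pstdev(replicates) / statistics.mean(replicates)
print(f"CV = {cv:.1%}")  # CV = 1.1%
```

A platform-level CV ≤ 3% means this ratio stays at or below 0.03 across repeated measurements of the same sample.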
LITTLE FISH, BIG DATA: ZEBRAFISH AS A MODEL FOR CARDIOVASCULAR AND METABOLIC DISEASE.
Gut, Philipp; Reischauer, Sven; Stainier, Didier Y R; Arnaout, Rima
2017-07-01
The burden of cardiovascular and metabolic diseases worldwide is staggering. The emergence of systems approaches in biology promises new therapies, faster and cheaper diagnostics, and personalized medicine. However, a profound understanding of pathogenic mechanisms at the cellular and molecular levels remains a fundamental requirement for discovery and therapeutics. Animal models of human disease are cornerstones of drug discovery as they allow identification of novel pharmacological targets by linking gene function with pathogenesis. The zebrafish model has been used for decades to study development and pathophysiology. More than ever, the specific strengths of the zebrafish model make it a prime partner in an age of discovery transformed by big-data approaches to genomics and disease. Zebrafish share a largely conserved physiology and anatomy with mammals. They allow a wide range of genetic manipulations, including the latest genome engineering approaches. They can be bred and studied with remarkable speed, enabling a range of large-scale phenotypic screens. Finally, zebrafish demonstrate an impressive regenerative capacity scientists hope to unlock in humans. Here, we provide a comprehensive guide on applications of zebrafish to investigate cardiovascular and metabolic diseases. We delineate advantages and limitations of zebrafish models of human disease and summarize their most significant contributions to understanding disease progression to date. Copyright © 2017 the American Physiological Society.
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; Zink, Erika M; Kim, Young-Mo; Burnum-Johnson, Kristin E; Orton, Daniel J; Apffel, Alex; Ibrahim, Yehia M; Monroe, Matthew E; Moore, Ronald J; Smith, Jordan N; Ma, Jian; Renslow, Ryan S; Thomas, Dennis G; Blackwell, Anne E; Swinford, Glenn; Sausen, John; Kurulugama, Ruwan T; Eno, Nathan; Darland, Ed; Stafford, George; Fjeldsted, John; Metz, Thomas O; Teeguarden, Justin G; Smith, Richard D; Baker, Erin S
2016-12-01
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids) and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. This SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure.
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; Zink, Erika M.; Kim, Young-Mo; Burnum-Johnson, Kristin E.; Orton, Daniel J.; Apffel, Alex; Ibrahim, Yehia M.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Jordan N.; Ma, Jian; Renslow, Ryan S.; Thomas, Dennis G.; Blackwell, Anne E.; Swinford, Glenn; Sausen, John; Kurulugama, Ruwan T.; Eno, Nathan; Darland, Ed; Stafford, George; Fjeldsted, John; Metz, Thomas O.; Teeguarden, Justin G.; Smith, Richard D.; Baker, Erin S.
2017-01-01
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids) and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. This SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure. PMID:29276770
diCenzo, George C; Finan, Turlough M
2018-01-01
The rate at which all genes within a bacterial genome can be identified far exceeds the ability to characterize these genes. To assist in associating genes with cellular functions, a large-scale bacterial genome deletion approach can be employed to rapidly screen tens to thousands of genes for desired phenotypes. Here, we provide a detailed protocol for the generation of deletions of large segments of bacterial genomes that relies on the activity of a site-specific recombinase. In this procedure, two recombinase recognition target sequences are introduced into known positions of a bacterial genome through single cross-over plasmid integration. Subsequent expression of the site-specific recombinase mediates recombination between the two target sequences, resulting in the excision of the intervening region and its loss from the genome. We further illustrate how this deletion system can be readily adapted to function as a large-scale in vivo cloning procedure, in which the region excised from the genome is captured as a replicative plasmid. We next provide a procedure for the metabolic analysis of bacterial large-scale genome deletion mutants using the Biolog Phenotype MicroArray™ system. Finally, a pipeline is described, and a sample Matlab script is provided, for the integration of the obtained data with a draft metabolic reconstruction for the refinement of the reactions and gene-protein-reaction relationships in a metabolic reconstruction.
Designing large-scale conservation corridors for pattern and process.
Rouget, Mathieu; Cowling, Richard M; Lombard, Amanda T; Knight, Andrew T; Kerley, Graham I H
2006-04-01
A major challenge for conservation assessments is to identify priority areas that incorporate biological patterns and processes. Because large-scale processes are mostly oriented along environmental gradients, we propose to accommodate them by designing regional-scale corridors to capture these gradients. Based on systematic conservation planning principles such as representation and persistence, we identified large tracts of untransformed land (i.e., conservation corridors) for conservation that would achieve biodiversity targets for pattern and process in the Subtropical Thicket Biome of South Africa. We combined least-cost path analysis with a target-driven algorithm to identify the best option for capturing key environmental gradients while considering biodiversity targets and conservation opportunities and constraints. We identified seven conservation corridors on the basis of subtropical thicket representation, habitat transformation and degradation, wildlife suitability, irreplaceability of vegetation types, protected area networks, and future land-use pressures. These conservation corridors covered 21.1% of the planning region (ranging from 600 to 5200 km2) and successfully achieved targets for biological processes and to a lesser extent for vegetation types. The corridors we identified are intended to promote the persistence of ecological processes (gradients and fixed processes) and fulfill half of the biodiversity pattern target. We compared the conservation corridors with a simplified corridor design consisting of a fixed-width buffer along major rivers. Conservation corridors outperformed river buffers in seven out of eight criteria. Our corridor design can provide a tool for quantifying trade-offs between various criteria (biodiversity pattern and process, implementation constraints and opportunities). A land-use management model was developed to facilitate implementation of conservation actions within these corridors.
AutoBD: Automated Bi-Level Description for Scalable Fine-Grained Visual Categorization.
Yao, Hantao; Zhang, Shiliang; Yan, Chenggang; Zhang, Yongdong; Li, Jintao; Tian, Qi
Compared with traditional image classification, fine-grained visual categorization is a more challenging task because it aims to classify objects belonging to the same species, e.g., hundreds of birds or cars. In the past several years, researchers have made many achievements on this topic. However, most of them are heavily dependent on artificial annotations, e.g., bounding boxes, part annotations, and so on. The requirement of artificial annotations largely hinders scalability and application. Motivated to release such dependence, this paper proposes a robust and discriminative visual description named Automated Bi-level Description (AutoBD). "Bi-level" denotes two complementary part-level and object-level visual descriptions, respectively. AutoBD is "automated" because it only requires the image-level labels of training images and does not need any annotations for testing images. Compared with part annotations labeled by humans, image-level labels can be easily acquired, which makes AutoBD suitable for large-scale visual categorization. Specifically, the part-level description is extracted by identifying the local region that saliently represents the visual distinctiveness. The object-level description is extracted from object bounding boxes generated with a co-localization algorithm. Although using only image-level labels, AutoBD outperforms recent studies on two public benchmarks, i.e., classification accuracy reaches 81.6% on CUB-200-2011 and 88.9% on Car-196, respectively. On the large-scale Birdsnap data set, AutoBD achieves an accuracy of 68%, which is, to the best of our knowledge, currently the best performance.
DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS
The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...
Space and time scales in human-landscape systems.
Kondolf, G Mathias; Podolak, Kristen
2014-01-01
Exploring spatial and temporal scales provides a way to understand human alteration of landscape processes and human responses to these processes. We address three topics relevant to human-landscape systems: (1) scales of human impacts on geomorphic processes, (2) spatial and temporal scales in river restoration, and (3) time scales of natural disasters and behavioral and institutional responses. Studies showing dramatic recent change in sediment yields from uplands to the ocean via rivers illustrate the increasingly vast spatial extent and quick rate of human landscape change in the last two millennia, but especially in the second half of the twentieth century. Recent river restoration efforts are typically small in spatial and temporal scale compared to the historical human changes to ecosystem processes, but the cumulative effectiveness of multiple small restoration projects in achieving large ecosystem goals has yet to be demonstrated. The mismatch between infrequent natural disasters and individual risk perception, media coverage, and institutional response to natural disasters results in un-preparedness and unsustainable land use and building practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teeguarden, Justin G.; Mikheev, Vladimir B.; Minard, Kevin R.
Testing the rapidly growing number of nanomaterials requires large-scale use of in vitro systems under the presumption that these systems are sufficiently predictive or descriptive of responses in in vivo systems for effective use in hazard ranking. We hypothesized that improved relationships between in vitro and in vivo models of experimental toxicology for nanomaterials would result from placing response data in vitro and in vivo on the same dose scale: the amount of material associated with cells (target cell dose). Methods: Balb/c mice were exposed nose-only to an aerosol of 12.8 nm (68.6 nm CMD, 19.9 mg/m3, 4 hours) superparamagnetic iron oxide (SPIO) particles; target cell doses were calculated, and biomarkers of response anchored with histological evidence were identified by global transcriptomics. Representative murine epithelial and macrophage cell types were exposed in vitro to the same material in liquid suspension for four hours, and levels of the nanoparticle-regulated cytokine transcripts identified in vivo were quantified as a function of measured nanoparticle cellular dose. Results: Target tissue doses of 0.009-0.4 μg SPIO/cm2 lung led to an inflammatory response in the alveolar region characterized by interstitial inflammation and macrophage infiltration. In vitro, higher target tissue doses of ~1.2-4 μg SPIO/cm2 of cells were required to induce transcriptional regulation of markers of inflammation, CXCL2 and CCL3, in C10 lung epithelial cells. Estimated in vivo macrophage SPIO nanoparticle doses ranged from 1-100 pg/cell, and induction of inflammatory markers was observed in vitro in macrophages at doses of 8-35 pg/cell. Conclusions: Application of target tissue dosimetry revealed good correspondence between target cell doses triggering inflammatory processes in vitro and in vivo in the alveolar macrophage population, but not in the epithelial cells of the alveolar region.
These findings demonstrate the potential for target tissue dosimetry to enable more quantitative comparison of in vitro and in vivo systems and to advance their use for hazard assessment and extrapolation to humans. The mildly inflammogenic cellular doses experienced by mice were similar to those calculated for humans exposed to the same material at the existing permissible exposure limit of 10 mg/m3 iron oxide (as Fe).
Mari, Lorenzo; Bonaventura, Luca; Storto, Andrea; Melià, Paco; Gatto, Marino; Masina, Simona
2017-01-01
Protecting key hotspots of marine biodiversity is essential to maintain ecosystem services at large spatial scales. Protected areas serve not only as sources of propagules colonizing other habitats, but also as receptors, thus acting as protected nurseries. To quantify the geographical extent and the temporal persistence of ecological benefits resulting from protection, we investigate larval connectivity within a remote archipelago, characterized by a strong spatial gradient of human impact from pristine to heavily exploited: the Northern Line Islands (NLIs), including part of the Pacific Remote Islands Marine National Monument (PRI-MNM). Larvae are described as passive Lagrangian particles transported by oceanic currents obtained from an oceanographic reanalysis. We compare different simulation schemes and compute connectivity measures (larval exchange probabilities and minimum/average larval dispersal distances from target islands). To explore the role of PRI-MNM in protecting marine organisms with pelagic larval stages, we run millions of individual-based simulations for various Pelagic Larval Durations (PLDs), in all release seasons, and over a two-decade time horizon (1991–2010). We find that connectivity in the NLIs is spatially asymmetric and displays significant intra- and inter-annual variations. The islands belonging to PRI-MNM act more as sinks than sources of larvae, and connectivity is higher during the winter-spring period. In multi-annual analyses, yearly averaged southward connectivity significantly and negatively correlates with climatological anomalies (El Niño). This points out a possible system fragility and susceptibility to global warming. Quantitative assessments of large-scale, long-term marine connectivity patterns help understand region-specific, ecologically relevant interactions between islands. 
This is fundamental for devising scientifically-based protection strategies, which must be space- and time-varying to cope with the challenges posed by the concurrent pressures of human exploitation and global climate change. PMID:28809937
Mari, Lorenzo; Bonaventura, Luca; Storto, Andrea; Melià, Paco; Gatto, Marino; Masina, Simona; Casagrandi, Renato
2017-01-01
Protecting key hotspots of marine biodiversity is essential to maintain ecosystem services at large spatial scales. Protected areas serve not only as sources of propagules colonizing other habitats, but also as receptors, thus acting as protected nurseries. To quantify the geographical extent and the temporal persistence of ecological benefits resulting from protection, we investigate larval connectivity within a remote archipelago, characterized by a strong spatial gradient of human impact from pristine to heavily exploited: the Northern Line Islands (NLIs), including part of the Pacific Remote Islands Marine National Monument (PRI-MNM). Larvae are described as passive Lagrangian particles transported by oceanic currents obtained from an oceanographic reanalysis. We compare different simulation schemes and compute connectivity measures (larval exchange probabilities and minimum/average larval dispersal distances from target islands). To explore the role of PRI-MNM in protecting marine organisms with pelagic larval stages, we run millions of individual-based simulations for various Pelagic Larval Durations (PLDs), in all release seasons, and over a two-decade time horizon (1991-2010). We find that connectivity in the NLIs is spatially asymmetric and displays significant intra- and inter-annual variations. The islands belonging to PRI-MNM act more as sinks than sources of larvae, and connectivity is higher during the winter-spring period. In multi-annual analyses, yearly averaged southward connectivity significantly and negatively correlates with climatological anomalies (El Niño). This points out a possible system fragility and susceptibility to global warming. Quantitative assessments of large-scale, long-term marine connectivity patterns help understand region-specific, ecologically relevant interactions between islands. 
This is fundamental for devising scientifically-based protection strategies, which must be space- and time-varying to cope with the challenges posed by the concurrent pressures of human exploitation and global climate change.
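The passive Lagrangian transport underlying the connectivity simulations above can be sketched as simple forward-Euler advection of particles in a prescribed current field. This is a toy illustration only: the velocity field, time step, and release points below are assumptions, whereas the study used reanalysis currents and its own simulation schemes.

```python
import numpy as np

def advect(positions, velocity, dt, n_steps):
    """Forward-Euler advection of passive Lagrangian particles.

    positions: (n, 2) array of particle (x, y) coordinates
    velocity:  function mapping an (n, 2) position array to (n, 2) velocities
    Returns the full trajectory, shape (n_steps + 1, n, 2).
    """
    traj = [positions.copy()]
    for _ in range(n_steps):
        positions = positions + dt * velocity(positions)
        traj.append(positions.copy())
    return np.array(traj)

# Illustrative steady rotational current around the origin (not a real ocean state).
def rotation(p):
    return np.column_stack([-p[:, 1], p[:, 0]])

# Two hypothetical release points near a "source island".
release = np.array([[1.0, 0.0], [0.5, 0.5]])
traj = advect(release, rotation, dt=0.01, n_steps=100)

# Larval dispersal distance of each particle from its release point,
# one of the connectivity measures the study computes.
dispersal = np.linalg.norm(traj[-1] - traj[0], axis=1)
```

Exchange probabilities would then be estimated by counting what fraction of particles released at one island end their PLD within the settlement radius of another.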
Application of Large-Scale Aptamer-Based Proteomic Profiling to Planned Myocardial Infarctions.
Jacob, Jaison; Ngo, Debby; Finkel, Nancy; Pitts, Rebecca; Gleim, Scott; Benson, Mark D; Keyes, Michelle J; Farrell, Laurie A; Morgan, Thomas; Jennings, Lori L; Gerszten, Robert E
2018-03-20
Emerging proteomic technologies using novel affinity-based reagents allow for efficient multiplexing with high-sample throughput. To identify early biomarkers of myocardial injury, we recently applied an aptamer-based proteomic profiling platform that measures 1129 proteins to samples from patients undergoing septal alcohol ablation for hypertrophic cardiomyopathy, a human model of planned myocardial injury. Here, we examined the scalability of this approach using a markedly expanded platform to study a far broader range of human proteins in the context of myocardial injury. We applied a highly multiplexed, expanded proteomic technique that uses single-stranded DNA aptamers to assay 4783 human proteins (4137 distinct human gene targets) to derivation and validation cohorts of planned myocardial injury, individuals with spontaneous myocardial infarction, and at-risk controls. We found 376 target proteins that significantly changed in the blood after planned myocardial injury in a derivation cohort (n=20; P <1.05E-05, 1-way repeated measures analysis of variance, Bonferroni threshold). Two hundred forty-seven of these proteins were validated in an independent planned myocardial injury cohort (n=15; P <1.33E-04, 1-way repeated measures analysis of variance); >90% were directionally consistent and reached nominal significance in the validation cohort. Among the validated proteins that were increased within 1 hour after planned myocardial injury, 29 were also elevated in patients with spontaneous myocardial infarction (n=63; P <6.17E-04). Many of the novel markers identified in our study are intracellular proteins not previously identified in the peripheral circulation or have functional roles relevant to myocardial injury. 
For example, the cardiac LIM protein, cysteine- and glycine-rich protein 3, is thought to mediate cardiac mechanotransduction and stress responses, whereas the mitochondrial ATP synthase F0 subunit component is a vasoactive peptide on its release from cells. Last, we performed aptamer-affinity enrichment coupled with mass spectrometry to technically verify aptamer specificity for a subset of the new biomarkers. Our results demonstrate the feasibility of large-scale aptamer multiplexing at a level that has not previously been reported and with sample throughput that greatly exceeds other existing proteomic methods. The expanded aptamer-based proteomic platform provides a unique opportunity for biomarker and pathway discovery after myocardial injury. © 2017 American Heart Association, Inc.
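The significance thresholds quoted in the abstract are standard Bonferroni corrections, i.e. a family-wise alpha of 0.05 divided by the number of tests. A quick check reproduces the derivation-cohort cutoff (4783 aptamer measurements) and the validation-cohort cutoff (376 candidate proteins carried forward); the alpha value is the conventional assumption, not stated explicitly in the abstract.

```python
alpha = 0.05  # conventional family-wise error rate (assumed)

# Derivation cohort: one test per aptamer target.
derivation_threshold = alpha / 4783
print(f"{derivation_threshold:.2e}")  # 1.05e-05, matching P < 1.05E-05

# Validation cohort: one test per candidate protein from the derivation step.
validation_threshold = alpha / 376
print(f"{validation_threshold:.2e}")  # 1.33e-04, matching P < 1.33E-04
```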
A large-scale evaluation of computational protein function prediction
Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo
2013-01-01
Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state-of-the-art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today’s best protein function prediction algorithms significantly outperformed widely-used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650
Decentralized asset management for collaborative sensing
NASA Astrophysics Data System (ADS)
Malhotra, Raj P.; Pribilski, Michael J.; Toole, Patrick A.; Agate, Craig
2017-05-01
There has been increased impetus to leverage Small Unmanned Aerial Systems (SUAS) for collaborative sensing applications in which many platforms work together to provide critical situation awareness in dynamic environments. Such applications require critical sensor observations to be made at the right place and time to facilitate the detection, tracking, and classification of ground-based objects. This further requires rapid response to real-world events and the balancing of multiple, competing mission objectives. In this context, human operators become overwhelmed with management of many platforms. Further, current automated planning paradigms tend to be centralized and don't scale up well to many collaborating platforms. We introduce a decentralized approach based upon information theory and distributed fusion which enables us to scale up to large numbers of collaborating SUAS platforms. This is exercised against a military application involving the autonomous detection, tracking, and classification of critical mobile targets. We further show that, based upon Monte Carlo simulation results, our decentralized approach outperforms more static management strategies employed by human operators and achieves similar results to a centralized approach while being scalable and robust to degradation of communication. Finally, we describe the limitations of our approach and future directions for our research.
Large-scale weakly supervised object localization via latent category learning.
Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve
2015-04-01
Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) in large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy by evaluating each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
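The latent semantic analysis step that LCL builds on can be illustrated with a truncated SVD of a small visual-word-by-image count matrix. The matrix below is made-up toy data and the interpretation of rows and columns is an assumption for illustration; the paper applies LSA to its own semantic object representation.

```python
import numpy as np

# Toy count matrix: rows are visual words, columns are images.
X = np.array([
    [4, 3, 0, 0],   # words frequent in (hypothetical) "aeroplane" images
    [3, 4, 1, 0],
    [0, 1, 4, 3],   # words frequent in (hypothetical) "sky background" images
    [0, 0, 3, 4],
], dtype=float)

# Truncated SVD keeping k latent categories (objects, parts, or backgrounds).
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Each image is embedded as a k-dimensional mixture over latent categories;
# a category selection strategy would then score these for discrimination.
image_embedding = (np.diag(s_k) @ Vt_k).T   # shape: (n_images, k)

# The rank-k reconstruction error is the energy in the discarded singular values.
X_k = U_k @ np.diag(s_k) @ Vt_k
err = np.linalg.norm(X - X_k)
```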
Screening for Viral Pathogens in African Simian Bushmeat Seized at A French Airport.
Temmam, Sarah; Davoust, Bernard; Chaber, Anne-Lise; Lignereux, Yves; Michelle, Caroline; Monteil-Bouchard, Sonia; Raoult, Didier; Desnues, Christelle
2017-08-01
Illegal bushmeat traffic is an important threat to biodiversity conservation of several endangered species and may contribute to the emergence and spread of infectious diseases in humans. The hunting, manipulation and consumption of wildlife-based products, especially those of primate origin, may be a threat to human health; however, few studies have investigated the role of bushmeat trade and consumption as a potential source of human infections to date. In this study, we report the screening of viral pathogens in African simian game seized by French customs at Toulouse Blagnac Airport. Epifluorescence microscopy revealed the presence of virus-like particles in the samples, and further metagenomic sequencing of the DNA and RNA viromes confirmed the presence of sequences related to the Siphoviridae, Myoviridae and Podoviridae bacteriophage families; some of them infecting bacterial hosts that could be potentially pathogenic for humans. To increase the sensitivity of detection, twelve pan-generic PCRs targeting several viral zoonoses were performed, but no positive signal was detected. A large-scale inventory of bacteria, viruses and parasites is urgently needed to globally assess the risk for human health of the trade, manipulation and consumption of wildlife-related bushmeat. © 2016 Blackwell Verlag GmbH.
Scaling effects in a non-linear electromagnetic energy harvester for wearable sensors
NASA Astrophysics Data System (ADS)
Geisler, M.; Boisseau, S.; Perez, M.; Ait-Ali, I.; Perraud, S.
2016-11-01
In the field of inertial energy harvesters targeting human mechanical energy, ergonomic constraints require finding the best compromise between size reduction and electrical performance. In this paper, we study the properties of a non-linear electromagnetic generator at different scales, by performing simulations based on an experimentally validated model and real human acceleration recordings. The results show that the output power of the structure is roughly proportional to its scaling factor raised to the power of five, which indicates that this system is more relevant at lengths over a few centimetres.
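The reported fifth-power scaling has a strong practical consequence: halving the harvester's linear dimensions cuts output power by roughly a factor of 32, which is why the authors conclude the design suits lengths over a few centimetres. A one-line check of the stated proportionality, with an assumed (hypothetical) reference power:

```python
def scaled_power(p_ref, scale_factor, exponent=5):
    """Output power under the reported P ∝ s^5 scaling (illustrative only)."""
    return p_ref * scale_factor ** exponent

# Assuming a 1 mW reference device, halving all dimensions:
p_half = scaled_power(1.0e-3, 0.5)  # 1/32 mW, i.e. about 31 µW
```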
Exploring Entrainment Patterns of Human Emotion in Social Media
Luo, Chuan; Zhang, Zhu
2016-01-01
Emotion entrainment, which is generally defined as the synchronous convergence of human emotions, performs many important social functions. However, what the specific mechanisms of emotion entrainment are beyond in-person interactions, and how human emotions evolve under different entrainment patterns in large-scale social communities, are still unknown. In this paper, we aim to examine the massive emotion entrainment patterns and understand the underlying mechanisms in the context of social media. As modeling emotion dynamics on a large scale is often challenging, we elaborate a pragmatic framework to characterize and quantify the entrainment phenomenon. By applying this framework on the datasets from two large-scale social media platforms, we find that the emotions of online users entrain through social networks. We further uncover that online users often form their relations via dual entrainment, while maintaining them through single entrainment. Remarkably, the emotions of online users are more convergent in nonreciprocal entrainment. Building on these findings, we develop an entrainment augmented model for emotion prediction. Experimental results suggest that entrainment patterns inform emotion proximity in dyads, and encoding their associations promotes emotion prediction. This work can further help us to understand the underlying dynamic process of large-scale online interactions and make more reasonable decisions regarding emergency situations, epidemic diseases, and political campaigns in cyberspace. PMID:26953692
Exploring Entrainment Patterns of Human Emotion in Social Media.
He, Saike; Zheng, Xiaolong; Zeng, Daniel; Luo, Chuan; Zhang, Zhu
2016-01-01
Emotion entrainment, which is generally defined as the synchronous convergence of human emotions, performs many important social functions. However, what the specific mechanisms of emotion entrainment are beyond in-person interactions, and how human emotions evolve under different entrainment patterns in large-scale social communities, are still unknown. In this paper, we aim to examine the massive emotion entrainment patterns and understand the underlying mechanisms in the context of social media. As modeling emotion dynamics on a large scale is often challenging, we elaborate a pragmatic framework to characterize and quantify the entrainment phenomenon. By applying this framework on the datasets from two large-scale social media platforms, we find that the emotions of online users entrain through social networks. We further uncover that online users often form their relations via dual entrainment, while maintaining them through single entrainment. Remarkably, the emotions of online users are more convergent in nonreciprocal entrainment. Building on these findings, we develop an entrainment augmented model for emotion prediction. Experimental results suggest that entrainment patterns inform emotion proximity in dyads, and encoding their associations promotes emotion prediction. This work can further help us to understand the underlying dynamic process of large-scale online interactions and make more reasonable decisions regarding emergency situations, epidemic diseases, and political campaigns in cyberspace.
3D Data Acquisition Platform for Human Activity Understanding
2016-03-02
address fundamental research problems of representation and invariant description of 3D data, human motion modeling and applications of human activity analysis, and computational optimization of large-scale 3D data.
Quantifying design trade-offs of beryllium targets on NIF
NASA Astrophysics Data System (ADS)
Yi, S. A.; Zylstra, A. B.; Kline, J. L.; Loomis, E. N.; Kyrala, G. A.; Shah, R. C.; Perry, T. S.; Kanzleiter, R. J.; Batha, S. H.; MacLaren, S. A.; Ralph, J. E.; Masse, L. P.; Salmonson, J. D.; Tipton, R. E.; Callahan, D. A.; Hurricane, O. A.
2017-10-01
An important determinant of target performance is implosion kinetic energy, which scales with the capsule size. The maximum achievable performance for a given laser is thus related to the largest capsule that can be imploded symmetrically, constrained by drive uniformity. A limiting factor for symmetric radiation drive is the ratio of hohlraum to capsule radii, or case-to-capsule ratio (CCR). For a fixed laser energy, a larger hohlraum allows for driving bigger capsules symmetrically at the cost of reduced peak radiation temperature (Tr). Beryllium ablators may thus allow for unique target design trade-offs due to their higher ablation efficiency at lower Tr. By utilizing larger hohlraum sizes than most modern NIF designs, beryllium capsules thus have the potential to operate in unique regions of the target design parameter space. We present design simulations of beryllium targets with a large CCR = 4.3 3.7 . These are scaled surrogates of large hohlraum low Tr beryllium targets, with the goal of quantifying symmetry tunability as a function of CCR. This work performed under the auspices of the U.S. DOE by LANL under contract DE-AC52- 06NA25396, and by LLNL under Contract DE-AC52-07NA27344.
Hiller, Ekkehard; Istel, Fabian; Tscherner, Michael; Brunke, Sascha; Ames, Lauren; Firon, Arnaud; Green, Brian; Cabral, Vitor; Marcet-Houben, Marina; Jacobsen, Ilse D.; Quintin, Jessica; Seider, Katja; Frohner, Ingrid; Glaser, Walter; Jungwirth, Helmut; Bachellier-Bassi, Sophie; Chauvel, Murielle; Zeidler, Ute; Ferrandon, Dominique; Gabaldón, Toni; Hube, Bernhard; d'Enfert, Christophe; Rupp, Steffen; Cormack, Brendan; Haynes, Ken; Kuchler, Karl
2014-01-01
The opportunistic fungal pathogen Candida glabrata is a frequent cause of candidiasis, causing infections ranging from superficial to life-threatening disseminated disease. The inherent tolerance of C. glabrata to azole drugs makes this pathogen a serious clinical threat. To identify novel genes implicated in antifungal drug tolerance, we have constructed a large-scale C. glabrata deletion library consisting of 619 unique, individually bar-coded mutant strains, each lacking one specific gene, altogether representing almost 12% of the genome. Functional analysis of this library in a series of phenotypic and fitness assays identified numerous genes required for growth of C. glabrata under normal or specific stress conditions, as well as a number of novel genes involved in tolerance to clinically important antifungal drugs such as azoles and echinocandins. We identified 38 deletion strains displaying strongly increased susceptibility to caspofungin, 28 of which encode proteins that have not previously been linked to echinocandin tolerance. Our results demonstrate the potential of the C. glabrata mutant collection as a valuable resource in functional genomics studies of this important fungal pathogen of humans, and to facilitate the identification of putative novel antifungal drug targets and virulence genes. PMID:24945925
Next Generation Antibody Therapeutics Using Bispecific Antibody Technology.
Igawa, Tomoyuki
2017-01-01
Nearly fifty monoclonal antibodies have been approved to date, and the market for monoclonal antibodies is expected to continue to grow. Since global competition in the field of antibody therapeutics is intense, we need to establish novel antibody engineering technologies to provide true benefit for patients, with differentiated product values. Bispecific antibodies are among the next generation of antibody therapeutics that can bind to two different target antigens by the two arms of the immunoglobulin G (IgG) molecule, and are thus believed to be applicable to various therapeutic needs. Until recently, large-scale manufacturing of human IgG bispecific antibodies was impossible. We have established a technology, named asymmetric re-engineering technology (ART)-Ig, to enable large-scale manufacturing of bispecific antibodies. Three examples of next generation antibody therapeutics using ART-Ig technology are described. Recent updates on bispecific antibodies against factor IXa and factor X for the treatment of hemophilia A, bispecific antibodies against a tumor specific antigen and T cell surface marker CD3 for cancer immunotherapy, and bispecific antibodies against two different epitopes of soluble antigen with pH-dependent binding property for the elimination of soluble antigen from plasma are also described.
The salience network causally influences default mode network activity during moral reasoning
Wilson, Stephen M.; D’Esposito, Mark; Kayser, Andrew S.; Grossman, Scott N.; Poorzand, Pardis; Seeley, William W.; Miller, Bruce L.; Rankin, Katherine P.
2013-01-01
Large-scale brain networks are integral to the coordination of human behaviour, and their anatomy provides insights into the clinical presentation and progression of neurodegenerative illnesses such as Alzheimer’s disease, which targets the default mode network, and behavioural variant frontotemporal dementia, which targets a more anterior salience network. Although the default mode network is recruited when healthy subjects deliberate about ‘personal’ moral dilemmas, patients with Alzheimer’s disease give normal responses to these dilemmas whereas patients with behavioural variant frontotemporal dementia give abnormal responses to these dilemmas. We hypothesized that this apparent discrepancy between activation- and patient-based studies of moral reasoning might reflect a modulatory role for the salience network in regulating default mode network activation. Using functional magnetic resonance imaging to characterize network activity of patients with behavioural variant frontotemporal dementia and healthy control subjects, we present four converging lines of evidence supporting a causal influence from the salience network to the default mode network during moral reasoning. First, as previously reported, the default mode network is recruited when healthy subjects deliberate about ‘personal’ moral dilemmas, but patients with behavioural variant frontotemporal dementia producing atrophy in the salience network give abnormally utilitarian responses to these dilemmas. Second, patients with behavioural variant frontotemporal dementia have reduced recruitment of the default mode network compared with healthy control subjects when deliberating about these dilemmas. Third, a Granger causality analysis of functional neuroimaging data from healthy control subjects demonstrates directed functional connectivity from nodes of the salience network to nodes of the default mode network during moral reasoning. 
Fourth, this Granger causal influence is diminished in patients with behavioural variant frontotemporal dementia. These findings are consistent with a broader model in which the salience network modulates the activity of other large-scale networks, and suggest a revision to a previously proposed ‘dual-process’ account of moral reasoning. These findings also characterize network interactions underlying abnormal moral reasoning in frontotemporal dementia, which may serve as a model for the aberrant judgement and interpersonal behaviour observed in this disease and in other disorders of social function. More broadly, these findings link recent work on the dynamic interrelationships between large-scale brain networks to observable impairments in dementia syndromes, which may shed light on how diseases that target one network also alter the function of interrelated networks. PMID:23576128
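Granger causality, the directed-connectivity measure used above, asks whether the past of one time series improves prediction of another beyond that series' own history. A minimal bivariate sketch follows, with synthetic data; it is illustrative only, as the study applied the analysis to fMRI time series from salience- and default-mode-network nodes with its own preprocessing, and the function name and lag order are assumptions.

```python
import numpy as np

def granger_f(x, y, lag=2):
    """F-statistic for 'x Granger-causes y' at a given lag order.

    Compares a restricted AR model of y on its own lags against a full
    model that adds lags of x, using ordinary least squares.
    """
    n = len(y)
    Y = y[lag:]
    own = np.column_stack([y[lag - j - 1:n - j - 1] for j in range(lag)])
    cross = np.column_stack([x[lag - j - 1:n - j - 1] for j in range(lag)])
    ones = np.ones((n - lag, 1))
    X_r = np.hstack([ones, own])           # restricted: y's own lags only
    X_f = np.hstack([ones, own, cross])    # full: plus lags of x
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(X_r), rss(X_f)
    df_num, df_den = lag, n - lag - X_f.shape[1]
    return ((rss_r - rss_f) / df_num) / (rss_f / df_den)

# Synthetic example: x drives y with a one-step delay, not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

f_xy = granger_f(x, y)   # large: past x sharply improves prediction of y
f_yx = granger_f(y, x)   # near 1: past y adds nothing for predicting x
```

In practice the F-statistic is compared against an F(lag, df_den) distribution, and the asymmetry between f_xy and f_yx indicates the direction of influence.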
Unravelling connections between river flow and large-scale climate: experiences from Europe
NASA Astrophysics Data System (ADS)
Hannah, D. M.; Kingston, D. G.; Lavers, D.; Stagge, J. H.; Tallaksen, L. M.
2016-12-01
The United Nations has identified better knowledge of large-scale water cycle processes as essential for socio-economic development and global water-food-energy security. In this context, and given the ever-growing concerns about climate change/variability and human impacts on hydrology, there is an urgent research need: (a) to quantify space-time variability in regional river flow, and (b) to improve hydroclimatological understanding of climate-flow connections as a basis for identifying current and future water-related issues. In this paper, we draw together studies undertaken at the pan-European scale: (1) to evaluate current methods for assessing space-time dynamics for different streamflow metrics (annual regimes, low flows and high flows) and for linking flow variability to atmospheric drivers (circulation indices, air-masses, gridded climate fields and vapour flux); and (2) to propose a plan for future research connecting streamflow and the atmospheric conditions in Europe and elsewhere. We believe this research makes a useful, unique contribution to the literature through a systematic inter-comparison of different streamflow metrics and atmospheric descriptors. In our findings, we highlight the need to consider appropriate atmospheric descriptors (dependent on the target flow metric and region of interest) and to develop analytical techniques that best characterise connections in the ocean-atmosphere-land surface process chain. We also stress the need to consider not only atmospheric interactions, but also the role of river basin-scale terrestrial hydrological processes in modifying the climate signal response of river flows.
Ocimum basilicum miRNOME revisited: A cross kingdom approach.
Patel, Maulikkumar; Patel, Shanaya; Mangukia, Naman; Patel, Saumya; Mankad, Archana; Pandya, Himanshu; Rawal, Rakesh
2018-05-15
O. basilicum is a medicinally important herb with an important role in human health. However, its mechanism of action is largely unknown. The present study aims to understand the mechanism of regulation of key human target genes that could plausibly be modulated by O. basilicum miRNAs in a cross-kingdom manner using a computational and systems biology approach. O. basilicum miRNA sequences were retrieved and their corresponding human target genes were identified using psRNATarget and interaction analysis of hub nodes. Six O. basilicum-derived miRNAs were found to modulate 26 human target genes which were associated with PI3K-AKT and MAPK signaling pathways, with PTPN11, EIF2S2, NOS1, IRS1 and USO1 as the top 5 hub nodes. O. basilicum miRNAs not only regulate key human target genes of significance in various diseases but also pave the path for future studies that might explore the potential of miRNA-mediated cross-kingdom regulation, prevention and treatment of various human diseases including cancer. Copyright © 2018 Elsevier Inc. All rights reserved.
Semantic Differential Scale Method Can Reveal Multi-Dimensional Aspects of Mind Perception.
Takahashi, Hideyuki; Ban, Midori; Asada, Minoru
2016-01-01
As humans, we tend to perceive minds in both living and non-living entities, such as robots. From a questionnaire developed in a previous mind perception study, the authors found that perceived minds could be located on two dimensions, "experience" and "agency." This questionnaire allowed the assessment of how we perceive minds of various entities from a multi-dimensional point of view. In this questionnaire, subjects had to evaluate explicit mental capacities of target characters (e.g., capacity to feel hunger). However, we sometimes perceive minds in non-living entities, even though we cannot attribute these evidently biological capacities to the entity. In this study, we performed a large-scale web survey to assess mind perception by using the semantic differential scale method. We revealed that two mind dimensions, "emotion" and "intelligence," respectively corresponded to the two mind dimensions (experience and agency) proposed in a previous mind perception study. We did this without having to ask about specific mental capacities. We believe that the semantic differential scale is a useful method to assess the dimensions of mind perception, especially for non-living entities to which biological capacities are hard to attribute.
Culture rather than genes provides greater scope for the evolution of large-scale human prosociality
Bell, Adrian V.; Richerson, Peter J.; McElreath, Richard
2009-01-01
Whether competition among large groups played an important role in human social evolution is dependent on how variation, whether cultural or genetic, is maintained between groups. Comparisons between genetic and cultural differentiation between neighboring groups show how natural selection on large groups is more plausible on cultural rather than genetic variation. PMID:19822753
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high throughput wide format video also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms are available to assist the analyst and increase human effectiveness.
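A tiled image pyramid of the kind KOLAM's caches are built on can be sketched generically: each level halves the resolution, and any viewport maps to a small set of tile indices at the appropriate level. This is a minimal illustration under assumed conventions (256-pixel tiles, level 0 at full resolution), not KOLAM's actual data structure.

```python
import math

TILE = 256  # tile edge length in pixels (assumed)

def pyramid_levels(width, height):
    """Number of levels needed to reduce the image to a single tile."""
    return max(1, math.ceil(math.log2(max(width, height) / TILE)) + 1)

def tiles_for_viewport(x0, y0, x1, y1, level):
    """Tile indices covering a pixel-space viewport at a given level.

    Level 0 is full resolution; each successive level halves both dimensions,
    so a tile at level L covers TILE * 2**L full-resolution pixels per side.
    """
    span = TILE * 2 ** level
    tx0, ty0 = x0 // span, y0 // span
    tx1, ty1 = x1 // span, y1 // span
    return [(tx, ty) for ty in range(ty0, ty1 + 1)
                     for tx in range(tx0, tx1 + 1)]

levels = pyramid_levels(100_000, 100_000)        # gigapixel-class WAMI frame
tiles = tiles_for_viewport(0, 0, 1023, 1023, 0)  # 4x4 tiles at full resolution
```

Only tiles intersecting the current viewport (plus a prefetch margin) need to reside in the spatial cache; the temporal cache extends the same idea across video frames.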
Successful contracting of prevention services: fighting malnutrition in Senegal and Madagascar.
Marek, T; Diallo, I; Ndiaye, B; Rakotosalama, J
1999-12-01
There are very few documented large-scale successes in nutrition in Africa, and virtually no consideration of contracting for preventive services. This paper describes two successful large-scale community nutrition projects in Africa as examples of what can be done in prevention using the contracting approach in rural as well as urban areas. The two case-studies are the Secaline project in Madagascar, and the Community Nutrition Project in Senegal. The article explains what is meant by 'success' in the context of these two projects, how these results were achieved, and how certain bottlenecks were avoided. Both projects are very similar in the type of service they provide, and in combining private administration with public finance. The article illustrates that contracting out is a feasible option to be seriously considered for organizing certain prevention programmes on a large scale. There are strong indications from these projects of success in terms of reducing malnutrition, replicability and scale, and community involvement. When choosing that option, a government can tap available private local human resources through contracting out, rather than delivering those services by the public sector. However, as was done in both projects studied, consideration needs to be given to using a contract management unit for execution and monitoring, which costs 13-17% of the total project's budget. Rigorous assessments of the cost-effectiveness of contracted services are not available, but improved health outcomes, targeting of the poor, and basic cost data suggest that the programmes may well be relatively cost-effective. Although the contracting approach is not presented as the panacea to solve the malnutrition problem faced by Africa, it can certainly provide an alternative in many countries to increase coverage and quality of services.
Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M
2013-06-01
Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fast Open-World Person Re-Identification.
Zhu, Xiatian; Wu, Botong; Huang, Dongcheng; Zheng, Wei-Shi
2018-05-01
Existing person re-identification (re-id) methods typically assume that: 1) any probe person is guaranteed to appear in the gallery target population during deployment (i.e., closed-world) and 2) the probe set contains only a limited number of people (i.e., small search scale). Both assumptions are artificial and breached in real-world applications, since the probe population in target people search can be extremely vast in practice due to the ambiguity of the probe search space boundary. Therefore, it is unrealistic to assume that every probe person is a target person, and large-scale search over person images is inherently demanded. In this paper, we introduce a new person re-id search setting, called large scale open-world (LSOW) re-id, characterized by a huge probe image set and an open person population in search, and thus closer to practical deployment. Under LSOW, the under-studied problem of person re-id efficiency is essential in addition to the commonly studied re-id accuracy. We therefore develop a novel fast person re-id method, called Cross-view Identity Correlation and vErification (X-ICE) hashing, for joint learning of cross-view identity representation binarisation and discrimination in a unified manner. Extensive comparative experiments on three large-scale benchmarks have been conducted to validate the superiority and advantages of the proposed X-ICE method over a wide range of state-of-the-art hashing models, person re-id methods, and their combinations.
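The abstract does not give X-ICE's internals, but the reason hashing helps at LSOW scale can be sketched generically: once identity representations are binarised, ranking a gallery reduces to XOR-and-popcount Hamming comparisons, each O(1) per code. The 64-bit codes and gallery below are synthetic, not learned.

```python
import random

def hamming(a, b):
    """Hamming distance between two binary codes stored as ints."""
    return bin(a ^ b).count("1")

def search(probe_code, gallery_codes, k=5):
    """Rank gallery entries by Hamming distance to the probe.
    XOR + popcount per comparison is what lets hashing-based
    re-id scale to very large galleries."""
    ranked = sorted(range(len(gallery_codes)),
                    key=lambda i: hamming(probe_code, gallery_codes[i]))
    return ranked[:k]

random.seed(0)
gallery = [random.getrandbits(64) for _ in range(10000)]
probe = gallery[42] ^ 0b111   # a near-duplicate of gallery entry 42 (3 bits flipped)
top = search(probe, gallery)  # entry 42 ranks first
```

In a real system the binary codes would come from the learned hashing model; only the ranking step is shown here.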
Epidemiology of forest malaria in central Vietnam: a large scale cross-sectional survey.
Erhart, Annette; Ngo, Duc Thang; Phan, Van Ky; Ta, Thi Tinh; Van Overmeir, Chantal; Speybroeck, Niko; Obsomer, Valerie; Le, Xuan Hung; Le, Khanh Thuan; Coosemans, Marc; D'alessandro, Umberto
2005-12-08
In Vietnam, a large proportion of all malaria cases and deaths occurs in the central mountainous and forested part of the country. Indeed, forest malaria, despite intensive control activities, is still a major problem which raises several questions about its dynamics. A large-scale malaria morbidity survey to measure malaria endemicity and identify important risk factors was carried out in 43 villages situated in a forested area of Ninh Thuan province, south central Vietnam. Four thousand three hundred and six randomly selected individuals, aged 10-60 years, participated in the survey. Rag Lays (86%), traditionally living in the forest and practising "slash and burn" cultivation, represented the most common ethnic group. The overall parasite rate was 13.3% (range [0-42.3]), while Plasmodium falciparum seroprevalence was 25.5% (range [2.1-75.6]). Mapping of these two variables showed a patchy distribution, suggesting that risk factors other than remoteness and forest proximity modulated the human-vector interactions. This was confirmed by the results of the multivariate-adjusted analysis, showing that forest work was a significant risk factor for malaria infection, further increased by staying in the forest overnight (OR = 2.86; 95%CI [1.62; 5.07]). Rag Lays had a higher risk of malaria infection, which was inversely related to education level and socio-economic status. Women were less at risk than men (OR = 0.71; 95%CI [0.59; 0.86]), a possible consequence of different behaviour. This study confirms that malaria endemicity is still relatively high in this area and that the dynamics of transmission is constantly modulated by the behaviour of both humans and vectors. A well-targeted intervention reducing the "vector/forest worker" interaction, based on long-lasting insecticidal material, could be appropriate in this environment.
NASA Astrophysics Data System (ADS)
Hazen, E. L.
2016-02-01
Highly migratory species regularly traverse human-imposed boundaries including exclusive economic zones and marine protected areas, thus are difficult to manage using traditional spatial approaches. Blue whales (Balaenoptera musculus) are seasonal visitors to the California Current System that target a single prey resource, krill (Euphausia pacifica, Thysanoessa spinifera), and migrate large distances to find and exploit ephemeral prey patches. Successful management of blue whales requires improved understanding of how fine-scale foraging ecology translates to population abundances. Specifically, sub-lethal factors such as anthropogenic noise and climate change, and lethal factors such as ship strikes may be limiting recovery and can be difficult to account for in current management strategies. Here we use an extensive dataset of fine-scale accelerometers (55) and broad-scale satellite tags (104) deployed on Northeast Pacific blue whales to examine the energetics of foraging, overlap with human risk, and projections of future habitat with climate change. We quantify the importance of dense prey patches (> 100 krill per cubic meter) for blue whale energetics and fitness. Distribution models can be used in concert with industry and regional offices to produce dynamic rules to reduce vessel interactions. We propose telemetry data are ripe for use in establishing dynamic management approaches that account for daily to seasonal management areas to minimize anthropogenic risks, and are also adaptable to long-term climate-driven changes in habitat.
Genetics of Resistant Hypertension: the Missing Heritability and Opportunities.
Teixeira, Samantha K; Pereira, Alexandre C; Krieger, Jose E
2018-05-19
Blood pressure regulation in humans has long been known to be a genetically determined trait. The identification of causal genetic modulators for this trait has been unfulfilling, to say the least. Despite the recent advances of genome-wide genetic studies, loci associated with hypertension or blood pressure still explain a very low percentage of the overall variation of blood pressure in the general population. This has precluded the translation of discoveries in the genetics of human hypertension to clinical use. Here, we propose the combined use of resistant hypertension as a trait for mapping genetic determinants in humans and the integration of new large-scale technologies to approach in model systems the multidimensional nature of the problem. New large-scale efforts in the genetic and genomic arenas are paving the way for an increased and granular understanding of genetic determinants of hypertension. New technologies for whole genome sequencing and large-scale forward genetic screens can help prioritize genes and gene pathways for downstream characterization and large-scale population studies, and guided pharmacological design can be used to drive discoveries to translational application through better risk stratification and new therapeutic approaches. Although significant challenges remain in the mapping and identification of genetic determinants of hypertension, new large-scale technological approaches have been proposed to surpass some of the shortcomings that have limited progress in the area for the last three decades. The incorporation of these technologies into hypertension research may significantly help in the understanding of inter-individual blood pressure variation and the deployment of new phenotyping and treatment approaches for the condition.
Wiener, J M; Ehbauer, N N; Mallot, H A
2009-09-01
For large numbers of targets, path planning is a complex and computationally expensive task. Humans, however, usually solve such tasks quickly and efficiently. We present experiments studying human path planning performance and the cognitive processes and heuristics involved. Twenty-five places were arranged on a regular grid in a large room. Participants were repeatedly asked to solve traveling salesman problems (TSP), i.e., to find the shortest closed loop connecting a start location with multiple target locations. In Experiment 1, we tested whether humans employed the nearest neighbor (NN) strategy when solving the TSP. Results showed that subjects outperform the NN-strategy, suggesting that it is not sufficient to explain human route planning behavior. As a second possible strategy we tested a hierarchical planning heuristic in Experiment 2, demonstrating that participants first plan a coarse route on the region level that is refined during navigation. To test for the relevance of spatial working memory (SWM) and spatial long-term memory (LTM) for planning performance and the planning heuristics applied, we varied the memory demands between conditions in Experiment 2. In one condition the target locations were directly marked, such that no memory was required; a second condition required participants to memorize the target locations during path planning (SWM); in a third condition, additionally, the locations of targets had to be retrieved from LTM (SWM and LTM). Results showed that navigation performance decreased with increasing memory demands while the dependence on the hierarchical planning heuristic increased.
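The nearest neighbor strategy tested in Experiment 1 is easy to state as code: always walk to the closest unvisited target, then return to the start. The 5x5 grid below only loosely mirrors the 25-place layout; coordinates and the start location are illustrative, not the experiment's actual room geometry.

```python
import math

def nearest_neighbor_tour(start, targets):
    """Greedy TSP heuristic: repeatedly visit the closest unvisited
    target, then return to the start to close the loop."""
    tour = [start]
    remaining = list(targets)
    current = start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        tour.append(nxt)
        current = nxt
    tour.append(start)  # close the loop
    return tour

def tour_length(tour):
    return sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))

# illustrative 5x5 grid of places
places = [(x, y) for x in range(5) for y in range(5)]
start = (0, 0)
tour = nearest_neighbor_tour(start, [p for p in places if p != start])
```

Comparing `tour_length(tour)` against participants' actual path lengths is the kind of test the experiment ran; subjects beating this greedy baseline is what rules NN out as the sole explanation.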
Test Information Targeting Strategies for Adaptive Multistage Testing Designs.
ERIC Educational Resources Information Center
Luecht, Richard M.; Burgin, William
Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Fillmore, Thomas L.; Gao, Yuqian
2013-10-01
Long-gradient separations coupled to tandem MS were recently demonstrated to provide deep proteome coverage for global proteomics; however, such long-gradient separations have not been explored for targeted proteomics. Herein, we investigate the potential performance of long-gradient separations coupled with selected reaction monitoring (LG-SRM) for targeted protein quantification. Direct comparison of LG-SRM (5 h gradient) and conventional LC-SRM (45 min gradient) showed that the long-gradient separations significantly reduced background interference levels and provided an 8- to 100-fold improvement in LOQ for target proteins in human female serum. Based on at least one surrogate peptide per protein, an LOQ of 10 ng/mL was achieved for the two spiked proteins in non-depleted human serum. The LG-SRM detection of seven out of eight endogenous plasma proteins expressed at ng/mL or sub-ng/mL levels in clinical patient sera was also demonstrated. A correlation coefficient of >0.99 was observed between the results of LG-SRM and ELISA measurements for prostate-specific antigen (PSA) in selected patient sera. Further enhancement of LG-SRM sensitivity was achieved by applying front-end IgY14 immunoaffinity depletion. Besides improved sensitivity, LG-SRM offers at least 3 times higher multiplexing capacity than conventional LC-SRM, due to a ~3-fold increase in average peak widths for a 300-min gradient compared to a 45-min gradient. Therefore, LG-SRM holds great potential for bridging the gap between global and targeted proteomics due to its advantages in both sensitivity and multiplexing capacity.
Antisense oligonucleotide–mediated MDM4 exon 6 skipping impairs tumor growth
Dewaele, Michael; Tabaglio, Tommaso; Willekens, Karen; Bezzi, Marco; Teo, Shun Xie; Low, Diana H.P.; Koh, Cheryl M.; Rambow, Florian; Fiers, Mark; Rogiers, Aljosja; Radaelli, Enrico; Al-Haddawi, Muthafar; Tan, Soo Yong; Hermans, Els; Amant, Frederic; Yan, Hualong; Lakshmanan, Manikandan; Koumar, Ratnacaram Chandrahas; Lim, Soon Thye; Derheimer, Frederick A.; Campbell, Robert M.; Bonday, Zahid; Tergaonkar, Vinay; Shackleton, Mark; Blattner, Christine; Marine, Jean-Christophe; Guccione, Ernesto
2015-01-01
MDM4 is a promising target for cancer therapy, as it is undetectable in most normal adult tissues but often upregulated in cancer cells to dampen p53 tumor-suppressor function. The mechanisms that underlie MDM4 upregulation in cancer cells are largely unknown. Here, we have shown that this key oncogenic event mainly depends on a specific alternative splicing switch. We determined that while a nonsense-mediated, decay-targeted isoform of MDM4 (MDM4-S) is produced in normal adult tissues as a result of exon 6 skipping, enhanced exon 6 inclusion leads to expression of full-length MDM4 in a large number of human cancers. Although this alternative splicing event is likely regulated by multiple splicing factors, we identified the SRSF3 oncoprotein as a key enhancer of exon 6 inclusion. In multiple human melanoma cell lines and in melanoma patient–derived xenograft (PDX) mouse models, antisense oligonucleotide–mediated (ASO-mediated) skipping of exon 6 decreased MDM4 abundance, inhibited melanoma growth, and enhanced sensitivity to MAPK-targeting therapeutics. Additionally, ASO-based MDM4 targeting reduced diffuse large B cell lymphoma PDX growth. As full-length MDM4 is enhanced in multiple human tumors, our data indicate that this strategy is applicable to a wide range of tumor types. We conclude that enhanced MDM4 exon 6 inclusion is a common oncogenic event and has potential as a clinically compatible therapeutic target. PMID:26595814
Wu, Ruidong; Long, Yongcheng; Malanson, George P; Garber, Paul A; Zhang, Shuang; Li, Diqiang; Zhao, Peng; Wang, Longzhu; Duo, Hairui
2014-01-01
By addressing several key features overlooked in previous studies, i.e. human disturbance, integration of ecosystem- and species-level conservation features, and principles of complementarity and representativeness, we present the first national-scale systematic conservation planning for China to determine the optimized spatial priorities for biodiversity conservation. We compiled a spatial database on the distributions of ecosystem- and species-level conservation features, and modeled a human disturbance index (HDI) by aggregating information using several socioeconomic proxies. We ran Marxan with two scenarios (HDI-ignored and HDI-considered) to investigate the effects of human disturbance, and explored the geographic patterns of the optimized spatial conservation priorities. Compared to when HDI was ignored, the HDI-considered scenario resulted in (1) a marked reduction (∼9%) in the total HDI score and a slight increase (∼7%) in the total area of the portfolio of priority units, (2) a significant increase (∼43%) in the total irreplaceable area and (3) more irreplaceable units being identified in almost all environmental zones and highly-disturbed provinces. Thus the inclusion of human disturbance is essential for cost-effective priority-setting. Attention should be targeted to the areas that are characterized as moderately-disturbed, <2,000 m in altitude, and/or intermediately- to extremely-rugged in terrain to identify potentially important regions for implementing cost-effective conservation. We delineated 23 primary large-scale priority areas that are significant for conserving China's biodiversity, but those isolated priority units in disturbed regions are in more urgent need of conservation actions so as to prevent immediate and severe biodiversity loss. 
This study presents a spatially optimized national-scale portfolio of conservation priorities, effectively representing the overall biodiversity of China while minimizing conflicts with economic development. Our results offer critical insights for current conservation and strategic land-use planning in China. The approach is transferable and easy to implement by end-users, and applicable for national- and local-scale systematic conservation prioritization practices.
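The abstract aggregates socioeconomic proxies into a human disturbance index (HDI) without giving the formula. A common minimal approach, shown here purely as an assumed sketch rather than the study's actual method, is to min-max normalize each proxy and take a weighted sum. The proxy names, values, and weights below are invented for illustration.

```python
def normalize(values):
    """Min-max rescale a proxy to [0, 1] across planning units."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def human_disturbance_index(proxies, weights):
    """Aggregate several socioeconomic proxies (e.g. population
    density, road density, economic output) into one per-unit
    score: normalize each proxy, then take a weighted sum."""
    normalized = [normalize(p) for p in proxies]
    n_units = len(proxies[0])
    return [sum(w * norm[i] for w, norm in zip(weights, normalized))
            for i in range(n_units)]

# three hypothetical proxies over four planning units
pop_density = [10, 200, 50, 5]
road_density = [1.0, 8.0, 2.0, 0.5]
gdp = [3, 40, 12, 1]
hdi = human_disturbance_index([pop_density, road_density, gdp],
                              weights=[0.4, 0.3, 0.3])
```

With weights summing to 1, each unit's score lands in [0, 1]; such a layer can then be fed to Marxan as a cost surface, which is how the HDI-considered scenario penalizes disturbed units.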
Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector
NASA Astrophysics Data System (ADS)
Kumar, P.; Mishra, T.; Banerjee, R.
2017-12-01
India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent emission intensity reduction (with respect to 2005 levels) along with large-scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted at the Paris agreement. But large-scale integration of renewable energy is a complex process which faces a number of problems, such as capital intensiveness, matching intermittent generation with load given limited storage capacity, and maintaining reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. The base case scenario (no RE addition), along with an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass) and a low RE scenario (50 GW solar, 30 GW wind), have been created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs and investment decisions involved in achieving mitigation targets. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
The global land rush and climate change
NASA Astrophysics Data System (ADS)
Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo
2015-08-01
Climate change poses a serious global challenge in the face of rapidly increasing human demand for energy and food. A recent phenomenon in which climate change may play an important role is the acquisition of large tracts of land in the developing world by governments and corporations. In the target countries, where land is relatively inexpensive, the potential to increase crop yields is generally high and property rights are often poorly defined. By acquiring land, investors can realize large profits and countries can substantially alter the land and water resources under their control, thereby changing their outlook for meeting future demand. While the drivers, actors, and impacts involved with land deals have received substantial attention in the literature, we propose that climate change plays an important yet underappreciated role, both through its direct effects on agricultural production and through its influence on mitigative or adaptive policy decisions. Drawing from various literature sources as well as a new global database on reported land deals, we trace the evolution of the global land rush and highlight prominent examples in which the role of climate change is evident. We find that climate change—both historical and anticipated—interacts substantially with drivers of land acquisitions, having important implications for the resilience of communities in targeted areas. As a result of this synthesis, we ultimately contend that considerations of climate change should be integrated into future policy decisions relating to large-scale land acquisitions.
Discovery, innovation and the cyclical nature of the pharmaceutical business.
Schmid, Esther F; Smith, Dennis A
2002-05-15
Unlike many recent articles, which paint the future of the pharmaceutical industry in gloomy colours, this article provides an optimistic outlook. It explores the foundations on which the pharmaceutical industry has based its outstanding successes. Case studies of important drug classes underpin the arguments made and provide the basis for the authors' argument that recent technological breakthroughs and the unravelling of the human genome will provide a new wave of high quality targets (substrate) on which the industry can build. The article suggests that in a conducive environment that understands the benefits that pharmaceuticals provide to healthcare, those players who can base their innovation on a sufficient scale and from a large capital base will reshape the industry.
Construction and Application of a Refined Hospital Management Chain.
Lihua, Yi
2016-01-01
Large-scale development was quite common in the later period of hospital industrialization in China. Today, Chinese hospital management faces such problems as service inefficiency, high human resource costs, and a low rate of capital use. This study analyzes the refined management chain of Wuxi No.2 People's Hospital. This consists of six gears, namely "organizational structure, clinical practice, outpatient service, medical technology, nursing care and logistics." The gears are based on "flat management system targets, chief of medical staff, centralized outpatient service, intensified medical examinations, vertical nursing management and socialized logistics." The core concepts of refined hospital management are optimizing flow processes, reducing waste, improving efficiency, saving costs, and, most important, taking good care of patients. Keywords: Hospital, Refined, Management chain
Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL
NASA Astrophysics Data System (ADS)
Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy
2016-04-01
Terrestrial and marine ecosystems provide essential goods and services to human societies. In the last decades, however, anthropogenic pressures caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project which focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and the emerging approaches in Macrosystem Ecology and Earth Critical Zone studies, addressing long-term and large-scale environmental and ecological challenges. The project started its activities in 2015, by defining a set of storylines which make it possible to tackle some of the most crucial issues in the assessment of present conditions and the estimate of the future state of selected ecosystem services. In this contribution, we focus on some of the main storylines of the project and discuss the general approach, focusing on the interplay of data and models and on the estimate of projection uncertainties.
NASA Astrophysics Data System (ADS)
Rasera, L. G.; Mariethoz, G.; Lane, S. N.
2017-12-01
Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface, but at coarser spatial and temporal resolutions. Although useful for large-scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small-scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
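The analogue-borrowing idea in the abstract above can be illustrated with a toy sketch. This is not the authors' algorithm: the nearest-value analogue search, the five-candidate pool, and the mean-recentring conditioning are all invented for illustration (true multiple-point geostatistics matches multi-cell patterns, not single values).

```python
import numpy as np

def downscale_dem(lr_target, hr_train, factor, rng=None):
    """Toy analogue-based downscaling: for each coarse cell of the target
    LR-DEM, sample one of the most similar coarse cells of the training DEM
    and paste its fine-scale patch, re-centred so block means honour the target."""
    if rng is None:
        rng = np.random.default_rng()
    H, W = hr_train.shape
    # Coarsen the HR training DEM by block-averaging to build analogue pairs.
    lr_train = hr_train.reshape(H // factor, factor, W // factor, factor).mean(axis=(1, 3))
    flat = lr_train.ravel()
    out = np.empty((lr_target.shape[0] * factor, lr_target.shape[1] * factor))
    for i in range(lr_target.shape[0]):
        for j in range(lr_target.shape[1]):
            # Pick randomly among the 5 closest coarse-elevation analogues.
            k = int(rng.choice(np.argsort(np.abs(flat - lr_target[i, j]))[:5]))
            ti, tj = divmod(k, lr_train.shape[1])
            patch = hr_train[ti * factor:(ti + 1) * factor, tj * factor:(tj + 1) * factor]
            # Re-centre the patch so its block mean equals the target coarse value.
            out[i * factor:(i + 1) * factor, j * factor:(j + 1) * factor] = (
                patch - patch.mean() + lr_target[i, j])
    return out
```

Because each pasted patch is re-centred on the target coarse value, the block averages of every simulated HR-DEM honour the target LR-DEM exactly, mimicking the conditioning requirement, while the random analogue choice yields an ensemble of equiprobable realizations.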
siRNA screen identifies QPCT as a druggable target for Huntington's disease.
Jimenez-Sanchez, Maria; Lam, Wun; Hannus, Michael; Sönnichsen, Birte; Imarisio, Sara; Fleming, Angeleen; Tarditi, Alessia; Menzies, Fiona; Dami, Teresa Ed; Xu, Catherine; Gonzalez-Couto, Eduardo; Lazzeroni, Giulia; Heitz, Freddy; Diamanti, Daniela; Massai, Luisa; Satagopam, Venkata P; Marconi, Guido; Caramelli, Chiara; Nencini, Arianna; Andreini, Matteo; Sardone, Gian Luca; Caradonna, Nicola P; Porcari, Valentina; Scali, Carla; Schneider, Reinhard; Pollio, Giuseppe; O'Kane, Cahir J; Caricasole, Andrea; Rubinsztein, David C
2015-05-01
Huntington's disease (HD) is a currently incurable neurodegenerative condition caused by an abnormally expanded polyglutamine tract in huntingtin (HTT). We identified new modifiers of mutant HTT toxicity by performing a large-scale 'druggable genome' siRNA screen in human cultured cells, followed by hit validation in Drosophila. We focused on glutaminyl cyclase (QPCT), which had one of the strongest effects on mutant HTT-induced toxicity and aggregation in the cell-based siRNA screen and also rescued these phenotypes in Drosophila. We found that QPCT inhibition induced the levels of the molecular chaperone αB-crystallin and reduced the aggregation of diverse proteins. We generated new QPCT inhibitors using in silico methods followed by in vitro screening, which rescued the HD-related phenotypes in cell, Drosophila and zebrafish HD models. Our data reveal a new HD druggable target affecting mutant HTT aggregation and provide proof of principle for a discovery pipeline from druggable genome screen to drug development.
Pesticide-Induced Stress in Arthropod Pests for Optimized Integrated Pest Management Programs.
Guedes, R N C; Smagghe, G; Stark, J D; Desneux, N
2016-01-01
More than six decades after the onset of wide-scale commercial use of synthetic pesticides and more than fifty years after Rachel Carson's Silent Spring, pesticides, particularly insecticides, arguably remain the most influential pest management tool around the globe. Nevertheless, pesticide use is still a controversial issue and is at the regulatory forefront in most countries. The older generation of insecticide groups has been largely replaced by a plethora of novel molecules that exhibit improved human and environmental safety profiles. However, the use of such compounds is guided by their short-term efficacy; the indirect and subtler effects on their target species, namely arthropod pests, have been neglected. Curiously, comprehensive risk assessments have increasingly explored effects on nontarget species, contrasting with the limited attention given to the target arthropod pest species themselves. The present review mitigates this shortcoming by hierarchically exploring, within an ecotoxicology framework applied to integrated pest management, the myriad effects of insecticide use on arthropod pest species.
ERIC Educational Resources Information Center
Sturges, Diana; Maurer, Trent W.; Cole, Oladipo
2009-01-01
This study investigated the effectiveness of role play in a large undergraduate science class. The targeted population consisted of 298 students enrolled in 2 sections of an undergraduate Human Anatomy and Physiology course taught by the same instructor. The section engaged in the role-play activity served as the study group, whereas the section…
Functional Topography of Human Auditory Cortex
Rauschecker, Josef P.
2016-01-01
Functional and anatomical studies have clearly demonstrated that auditory cortex is populated by multiple subfields. However, functional characterization of those fields has been largely the domain of animal electrophysiology, limiting the extent to which human and animal research can inform each other. In this study, we used high-resolution functional magnetic resonance imaging to characterize human auditory cortical subfields using a variety of low-level acoustic features in the spectral and temporal domains. Specifically, we show that topographic gradients of frequency preference, or tonotopy, extend along two axes in human auditory cortex, thus reconciling historical accounts of a tonotopic axis oriented medial to lateral along Heschl's gyrus and more recent findings emphasizing tonotopic organization along the anterior–posterior axis. Contradictory findings regarding topographic organization according to temporal modulation rate in acoustic stimuli, or “periodotopy,” are also addressed. Although isolated subregions show a preference for high rates of amplitude-modulated white noise (AMWN) in our data, large-scale “periodotopic” organization was not found. Organization by AM rate was correlated with dominant pitch percepts in AMWN in many regions. In short, our data expose early auditory cortex chiefly as a frequency analyzer, and spectral frequency, as imposed by the sensory receptor surface in the cochlea, seems to be the dominant feature governing large-scale topographic organization across human auditory cortex. SIGNIFICANCE STATEMENT In this study, we examine the nature of topographic organization in human auditory cortex with fMRI. Topographic organization by spectral frequency (tonotopy) extended in two directions: medial to lateral, consistent with early neuroimaging studies, and anterior to posterior, consistent with more recent reports. 
Large-scale organization by rates of temporal modulation (periodotopy) was correlated with confounding spectral content of amplitude-modulated white-noise stimuli. Together, our results suggest that the organization of human auditory cortex is driven primarily by its response to spectral acoustic features, and large-scale periodotopy spanning across multiple regions is not supported. This fundamental information regarding the functional organization of early auditory cortex will inform our growing understanding of speech perception and the processing of other complex sounds. PMID:26818527
NASA Astrophysics Data System (ADS)
Zhao, Y.; Zhang, L.; Ma, W.; Zhang, P.; Zhao, T.
2018-04-01
The First National Geographical Conditions Survey is a foundational task for dynamically mastering the basic state of nature, ecology, and human activities on the earth's surface, and it is a brand-new mapping and geographic information engineering project. To ensure comprehensive, true, and accurate survey results and to achieve the quality management targets of a 100% pass rate and a yield above 80%, quality control and result inspection must be carried out for the survey on a national scale. To ensure that achievement quality meets these target requirements, this paper develops a key "five-in-one" quality control technology comprising a quality control system for the national geographical conditions survey, a quality inspection technology system, a quality evaluation system, a quality inspection information management system, and nationally linked quality control institutions. The method addresses the survey's large scale, wide coverage, many undertaking units, many management levels, evolving technology, numerous production processes, and obvious regional differences, combined with its novel forms of achievement, complicated dependencies, many special reference data sources, and large data volumes. Fully considering domestic and foreign research results and production practice experience, together with technological development and production needs, the work stipulates the inspection methods and technical requirements for each stage of the quality inspection of geographical conditions survey results, extends traditional inspection and acceptance technology, and solves key technical problems urgently needed in the first national geographical conditions survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dutta, Tanushree
Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air quality on such a small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.
Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology.
Chen, Shuo; Luo, Chenggao; Deng, Bin; Wang, Hongqiang; Cheng, Yongqiang; Zhuang, Zhaowen
2018-01-19
As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, staring imaging by producing spatiotemporally independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single-input multiple-output (SIMO) technology, which can sharply reduce the coding and sampling times. The coded aperture applied in the proposed TCAI architecture loads either a purposive or a random phase modulation factor. In the transmitting process, the purposive phase modulation factor drives the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase modulation factor is adopted to modulate the terahertz wave to be spatiotemporally independent for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and then are synthesized together to obtain the complete high-resolution image. For each imaging cell, the multi-resolution imaging method helps to reduce the computational burden of a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging with much less time for 3D targets and has great potential in applications such as security screening, nondestructive detection, and medical diagnosis.
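The role of the random phase codes, which make the reference signals spatiotemporally independent so that scene reconstruction becomes a well-posed linear inverse problem, can be sketched as follows. The matrix model, the dimensions, and the least-squares solver are illustrative assumptions, not the paper's reconstruction method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 128   # 64 imaging cells, 128 coded measurements (assumed sizes)
# Random phase codes model spatiotemporally independent reference signals,
# giving a well-conditioned complex measurement matrix.
A = np.exp(1j * 2 * np.pi * rng.random((m, n)))
x_true = np.zeros(n)
x_true[[5, 20, 41]] = [1.0, 0.6, 0.8]   # sparse scene reflectivity
y = A @ x_true                           # noiseless coded measurements
# Least-squares inversion of the reference-signal matrix recovers the scene.
x_hat = np.linalg.lstsq(A, y, rcond=None)[0].real
```

With more independent coded measurements than unknowns, the noiseless system is overdetermined and the least-squares estimate recovers the scene; the paper's cell-by-cell decomposition addresses the cost of inverting much larger reference-signal matrices.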
Punishment sustains large-scale cooperation in prestate warfare
Mathew, Sarah; Boyd, Robert
2011-01-01
Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors and participants are not kin or day-to-day interactants. Warriors incur substantial risk of death and produce collective benefits. Cowardice and desertions occur, and are punished by community-imposed sanctions, including collective corporal punishment and fines. Furthermore, Turkana norms governing warfare benefit the ethnolinguistic group, a population of a half-million people, at the expense of smaller social groupings. These results challenge current views that punishment is unimportant in small-scale societies and that human cooperation evolved in small groups of kin and familiar individuals. Instead, these results suggest that cooperation at the larger scale of ethnolinguistic units enforced by third-party sanctions could have a deep evolutionary history in the human species. PMID:21670285
Kuipers, Jeroen; Kalicharan, Ruby D; Wolters, Anouk H G; van Ham, Tjakko J; Giepmans, Ben N G
2016-05-25
Large-scale 2D electron microscopy (EM), or nanotomy, is the tissue-wide application of nanoscale resolution electron microscopy. We and others previously applied large-scale EM to human skin, pancreatic islets, tissue culture and whole zebrafish larvae(1-7). Here we describe a universally applicable method for tissue-scale scanning EM for unbiased detection of sub-cellular and molecular features. Nanotomy was applied to investigate the healthy and the neurodegenerative zebrafish brain. Our method is based on standardized EM sample preparation protocols: fixation with glutaraldehyde and osmium, followed by epoxy-resin embedding, ultrathin sectioning, mounting of ultrathin sections on one-hole grids, and post-staining with uranyl and lead. Large-scale 2D EM mosaic images are acquired using a scanning EM connected to an external large-area scan generator using scanning transmission EM (STEM). Large-scale EM images are typically ~5-50 gigapixels in size and are best viewed using zoomable HTML files, which can be opened in any web browser, similar to online geographical HTML maps. This method can be applied to (human) tissue, cross sections of whole animals, and tissue culture(1-5). Here, zebrafish brains were analyzed in a non-invasive neuronal ablation model. We visualize within a single dataset tissue, cellular and subcellular changes that can be quantified in various cell types, including neurons and microglia, the brain's macrophages. In addition, nanotomy facilitates the correlation of EM with light microscopy (CLEM)(8) on the same tissue: large surface areas previously imaged using fluorescence microscopy can subsequently be subjected to large-area EM, resulting in the nano-anatomy (nanotomy) of tissues. In all, nanotomy allows unbiased detection of features at the EM level in a tissue-wide, quantifiable manner.
Kuipers, Jeroen; Kalicharan, Ruby D.; Wolters, Anouk H. G.
2016-01-01
Large-scale 2D electron microscopy (EM), or nanotomy, is the tissue-wide application of nanoscale resolution electron microscopy. We and others previously applied large-scale EM to human skin, pancreatic islets, tissue culture and whole zebrafish larvae(1-7). Here we describe a universally applicable method for tissue-scale scanning EM for unbiased detection of sub-cellular and molecular features. Nanotomy was applied to investigate the healthy and the neurodegenerative zebrafish brain. Our method is based on standardized EM sample preparation protocols: fixation with glutaraldehyde and osmium, followed by epoxy-resin embedding, ultrathin sectioning, mounting of ultrathin sections on one-hole grids, and post-staining with uranyl and lead. Large-scale 2D EM mosaic images are acquired using a scanning EM connected to an external large-area scan generator using scanning transmission EM (STEM). Large-scale EM images are typically ~5-50 gigapixels in size and are best viewed using zoomable HTML files, which can be opened in any web browser, similar to online geographical HTML maps. This method can be applied to (human) tissue, cross sections of whole animals, and tissue culture(1-5). Here, zebrafish brains were analyzed in a non-invasive neuronal ablation model. We visualize within a single dataset tissue, cellular and subcellular changes that can be quantified in various cell types, including neurons and microglia, the brain's macrophages. In addition, nanotomy facilitates the correlation of EM with light microscopy (CLEM)(8) on the same tissue: large surface areas previously imaged using fluorescence microscopy can subsequently be subjected to large-area EM, resulting in the nano-anatomy (nanotomy) of tissues. In all, nanotomy allows unbiased detection of features at the EM level in a tissue-wide, quantifiable manner. PMID:27285162
Zhou, Yanrong; Lin, Yanli; Wu, Xiaojie; Xiong, Fuyin; Lv, Yuemeng; Zheng, Tao; Huang, Peitang; Chen, Hongxing
2012-02-01
Transgene expression in mammary gland bioreactors aimed at producing recombinant proteins requires optimized expression vector construction. Previously we presented a hybrid gene locus strategy, originally tested with human lactoferrin (hLF) as the target transgene, which achieved an extremely high expression level of rhLF of 29.8 g/l in mouse milk. Here, to demonstrate the broad applicability of this strategy, another 38.4-kb mWAP-htPA hybrid gene locus was constructed, in which the 3-kb genomic coding sequence in the 24-kb mouse whey acidic protein (mWAP) gene locus was replaced by the 17.4-kb genomic coding sequence of human tissue plasminogen activator (htPA), exactly from the start codon to the stop codon. Five corresponding transgenic mouse lines were generated, and the highest expression level of rhtPA in the milk reached 3.3 g/l. Our strategy may provide a universal way for the large-scale production of pharmaceutical proteins in the mammary gland of transgenic animals.
How to Tackle Natural Focal Infections: From Risk Assessment to Vaccination Strategies.
Busani, Luca; Platonov, Alexander E; Ergonul, Onder; Rezza, Giovanni
2017-01-01
Natural focal diseases are caused by biological agents associated with specific landscapes. The natural focus of such diseases is defined as any natural ecosystem containing the pathogen's population as an essential component. In such a context, the agent circulates independently of human presence, and humans may become accidentally infected through contact with vectors or reservoirs. Some viruses (i.e., tick-borne encephalitis and Congo-Crimean hemorrhagic fever virus) are paradigmatic examples of natural focal diseases. When environmental changes, increases in reservoir/vector populations, demographic pressure, and/or changes in human behavior occur, increased risk of exposure to the pathogen may lead to clusters of cases or even to larger outbreaks. Intervention is often not highly cost-effective, thus only a few examples of large-scale or even targeted vaccination campaigns are reported in the international literature. To develop intervention models, risk assessment through disease mapping is an essential component of the response against these neglected threats and is key to the design of prevention strategies, especially when effective vaccines against the disease are available.
Barrett, Lisa Feldman; Satpute, Ajay
2013-01-01
Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202
Characterizing the cancer genome in lung adenocarcinoma
Weir, Barbara A.; Woo, Michele S.; Getz, Gad; Perner, Sven; Ding, Li; Beroukhim, Rameen; Lin, William M.; Province, Michael A.; Kraja, Aldi; Johnson, Laura A.; Shah, Kinjal; Sato, Mitsuo; Thomas, Roman K.; Barletta, Justine A.; Borecki, Ingrid B.; Broderick, Stephen; Chang, Andrew C.; Chiang, Derek Y.; Chirieac, Lucian R.; Cho, Jeonghee; Fujii, Yoshitaka; Gazdar, Adi F.; Giordano, Thomas; Greulich, Heidi; Hanna, Megan; Johnson, Bruce E.; Kris, Mark G.; Lash, Alex; Lin, Ling; Lindeman, Neal; Mardis, Elaine R.; McPherson, John D.; Minna, John D.; Morgan, Margaret B.; Nadel, Mark; Orringer, Mark B.; Osborne, John R.; Ozenberger, Brad; Ramos, Alex H.; Robinson, James; Roth, Jack A.; Rusch, Valerie; Sasaki, Hidefumi; Shepherd, Frances; Sougnez, Carrie; Spitz, Margaret R.; Tsao, Ming-Sound; Twomey, David; Verhaak, Roel G. W.; Weinstock, George M.; Wheeler, David A.; Winckler, Wendy; Yoshizawa, Akihiko; Yu, Soyoung; Zakowski, Maureen F.; Zhang, Qunyuan; Beer, David G.; Wistuba, Ignacio I.; Watson, Mark A.; Garraway, Levi A.; Ladanyi, Marc; Travis, William D.; Pao, William; Rubin, Mark A.; Gabriel, Stacey B.; Gibbs, Richard A.; Varmus, Harold E.; Wilson, Richard K.; Lander, Eric S.; Meyerson, Matthew
2008-01-01
Somatic alterations in cellular DNA underlie almost all human cancers(1). The prospect of targeted therapies(2) and the development of high-resolution, genome-wide approaches(3-8) are now spurring systematic efforts to characterize cancer genomes. Here we report a large-scale project to characterize copy-number alterations in primary lung adenocarcinomas. By analysis of a large collection of tumors (n = 371) using dense single nucleotide polymorphism arrays, we identify a total of 57 significantly recurrent events. We find that 26 of 39 autosomal chromosome arms show consistent large-scale copy-number gain or loss, of which only a handful have been linked to a specific gene. We also identify 31 recurrent focal events, including 24 amplifications and 7 homozygous deletions. Only six of these focal events are currently associated with known mutations in lung carcinomas. The most common event, amplification of chromosome 14q13.3, is found in ~12% of samples. On the basis of genomic and functional analyses, we identify NKX2-1 (NK2 homeobox 1, also called TITF1), which lies in the minimal 14q13.3 amplification interval and encodes a lineage-specific transcription factor, as a novel candidate proto-oncogene involved in a significant fraction of lung adenocarcinomas. More generally, our results indicate that many of the genes that are involved in lung adenocarcinoma remain to be discovered. PMID:17982442
Reconstruction of genome-scale human metabolic models using omics data.
Ryu, Jae Yong; Kim, Hyun Uk; Lee, Sang Yup
2015-08-01
The impact of genome-scale human metabolic models on human systems biology and medical sciences is becoming greater, thanks to increasing volumes of model building platforms and publicly available omics data. The genome-scale human metabolic models started with Recon 1 in 2007, and have since been used to describe metabolic phenotypes of healthy and diseased human tissues and cells, and to predict therapeutic targets. Here we review recent trends in genome-scale human metabolic modeling, including various generic and tissue/cell type-specific human metabolic models developed to date, and methods, databases and platforms used to construct them. For generic human metabolic models, we pay attention to Recon 2 and HMR 2.0 with emphasis on data sources used to construct them. Draft and high-quality tissue/cell type-specific human metabolic models have been generated using these generic human metabolic models. Integration of tissue/cell type-specific omics data with the generic human metabolic models is the key step, and we discuss omics data and their integration methods to achieve this task. The initial version of the tissue/cell type-specific human metabolic models can further be computationally refined through gap filling, reaction directionality assignment and the subcellular localization of metabolic reactions. We review relevant tools for this model refinement procedure as well. Finally, we suggest the direction of further studies on reconstructing an improved human metabolic model.
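Models of the kind reviewed above are commonly interrogated by flux balance analysis (FBA), a linear program over the stoichiometric matrix. A minimal sketch on an invented three-reaction toy network (not one of the reviewed human models) shows the idea:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake (-> A), conversion (A -> B), growth (B ->).
# Rows of S are metabolites A and B; columns are the three reactions.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])
bounds = [(0, 10), (0, None), (0, None)]   # uptake flux capped at 10
c = np.array([0.0, 0.0, -1.0])             # maximize growth (linprog minimizes)
# Steady state: S v = 0, i.e., no net accumulation of internal metabolites.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
growth_flux = res.x[2]
```

Here mass balance forces all three fluxes to be equal, so the optimal growth flux equals the uptake bound; in genome-scale models the same linear program runs over thousands of reactions, and gap filling or directionality assignment changes S and the bounds.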
Toxicogenomics and Human Disease Risk Assessment.
Complete sequencing of human and other genomes, availability of large-scale gene expression arrays with ever-increasing numbers of genes displayed, and steady improvements in protein expression technology can hav...
2017-01-01
Phase relations between specific scales in a turbulent boundary layer are studied here by highlighting the associated nonlinear scale interactions in the flow. This is achieved through an experimental technique that allows for targeted forcing of the flow through the use of a dynamic wall perturbation. Two distinct large-scale modes with well-defined spatial and temporal wavenumbers were simultaneously forced in the boundary layer, and the resulting nonlinear response from their direct interactions was isolated from the turbulence signal for the study. This approach advances the traditional studies of large- and small-scale interactions in wall turbulence by focusing on the direct interactions between scales with triadic wavenumber consistency. The results are discussed in the context of modelling high Reynolds number wall turbulence. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167576
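The triadic wavenumber consistency exploited above can be demonstrated with a synthetic signal: a quadratic nonlinearity, used here as a stand-in for the true nonlinear term in the flow, transfers energy from two forced modes exactly to their sum and difference frequencies (the frequencies and signal construction are illustrative assumptions):

```python
import numpy as np

fs = 1000.0
t = np.arange(10000) / fs          # 10 s record
f1, f2 = 7.0, 12.0                 # two forced large-scale modes
u = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
# Squaring mimics a quadratic nonlinear interaction: it creates energy at
# the triad frequencies f2 - f1 and f1 + f2, plus the harmonics 2*f1 and
# 2*f2 and a mean (DC) component.
spec = np.abs(np.fft.rfft(u ** 2))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
triad_peaks = sorted(np.round(freqs[spec > 0.2 * spec.max()]).astype(int))
```

The spectrum of the squared signal contains peaks only at DC, the harmonics 2f1 and 2f2, and the triad frequencies f2 - f1 and f1 + f2; isolating the response at exactly these combination frequencies is the frequency-domain signature of the direct triadic interactions targeted by the forcing.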
Effects on aquatic and human health due to large scale bioenergy crop expansion.
Love, Bradley J; Einheuser, Matthew D; Nejadhashemi, A Pouyan
2011-08-01
In this study, the environmental impacts of large-scale bioenergy crops were evaluated using the Soil and Water Assessment Tool (SWAT). Daily pesticide concentration data for a study area consisting of four large watersheds located in Michigan (totaling 53,358 km²) were estimated over a six-year period (2000-2005). Model outputs for atrazine, bromoxynil, glyphosate, metolachlor, pendimethalin, sethoxydim, trifluralin, and 2,4-D were used to predict the possible long-term implications that large-scale bioenergy crop expansion may have for the bluegill (Lepomis macrochirus) and humans. Threshold toxicity levels were obtained for the bluegill and for human consumption for all pesticides being evaluated through an extensive literature review. Model output was compared to each toxicity level for the suggested exposure time (96 hours for bluegill and 24 hours for humans). The results suggest that traditional intensive row crops such as canola, corn and sorghum may negatively impact aquatic life and, in most cases, affect safe drinking water availability. The continuous corn rotation, the rotation most representative of current agricultural practices in a starch-based ethanol economy, delivers the highest concentrations of glyphosate to the stream. In addition, continuous canola contributed a concentration of 1.11 ppm of trifluralin, a highly toxic herbicide, which is 8.7 times the 96-hour ecotoxicity threshold for bluegills and 21 times the safe drinking water level. Also during the period of study, continuous corn resulted in the impairment of 541,152 km of stream. However, there is promise with second-generation lignocellulosic bioenergy crops such as switchgrass, which resulted in a 171,667 km reduction in the total stream length exceeding the human threshold criteria, as compared to the base scenario.
Results of this study may be useful in determining the suitability of bioenergy crop rotations and aid in decision making regarding the adaptation of large-scale bioenergy cropping systems. Published by Elsevier B.V.
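The threshold comparison described in the abstract reduces to concentration-to-threshold ratios, where a ratio above 1 flags an exceedance. In this sketch the helper function is hypothetical, and the trifluralin threshold is back-calculated from the reported 8.7-fold exceedance purely for illustration, not taken from the study:

```python
def exceedance_ratios(concentrations_ppm, thresholds_ppm):
    """Return concentration/threshold ratios for pesticides with a known
    threshold; a ratio > 1 flags an exceedance of the toxicity level."""
    return {p: concentrations_ppm[p] / thresholds_ppm[p]
            for p in concentrations_ppm if p in thresholds_ppm}

# Illustrative 96-hour bluegill threshold (back-calculated, not a study value).
bluegill_96h_ppm = {"trifluralin": 0.1276}
# Reported SWAT-simulated concentration under continuous canola.
simulated_ppm = {"trifluralin": 1.11}
ratios = exceedance_ratios(simulated_ppm, bluegill_96h_ppm)
```

The same comparison, repeated per pesticide, per rotation, and per exposure window (96-hour aquatic, 24-hour drinking water), yields the exceedance statistics summarized in the abstract.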
Risk of large-scale fires in boreal forests of Finland under changing climate
NASA Astrophysics Data System (ADS)
Lehtonen, I.; Venäläinen, A.; Kämäräinen, M.; Peltola, H.; Gregow, H.
2016-01-01
The target of this work was to assess the impact of projected climate change on forest-fire activity in Finland, with special emphasis on large-scale fires. In addition, we were particularly interested in examining the inter-model variability of the projected change in fire danger. For this purpose, we utilized fire statistics covering the period 1996-2014 and consisting of almost 20 000 forest fires, as well as daily meteorological data from five global climate models under the representative concentration pathway RCP4.5 and RCP8.5 scenarios. The model data were statistically downscaled onto a high-resolution grid using the quantile-mapping method before performing the analysis. In examining the relationship between weather and fire danger, we applied the Canadian fire weather index (FWI) system. Our results suggest that the number of large forest fires may double or even triple during the present century. This would increase the risk that some of the fires could develop into real conflagrations, which have become almost extinct in Finland due to active and efficient fire suppression. However, the results reveal substantial inter-model variability in the rate of the projected increase of forest-fire danger, emphasizing the large uncertainty related to the climate change signal in fire activity. We moreover showed that the majority of large fires in Finland occur within a relatively short period in May and June due to human activities, and that FWI correlates more poorly with fire activity during this time of year than later in summer, when lightning is a more important cause of fires.
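The quantile-mapping step used to downscale the climate-model data can be sketched with empirical CDFs (a generic textbook form of the method, not the authors' implementation; the bias structure in the example is invented):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: map each future model value to the
    observed value at the same quantile of the historical distributions."""
    mh = np.sort(np.asarray(model_hist, dtype=float))
    oh = np.sort(np.asarray(obs_hist, dtype=float))
    # Empirical quantile of each future value under the model's historical CDF.
    q = np.clip(np.searchsorted(mh, model_future, side="right") / mh.size, 0.0, 1.0)
    # Invert the observed historical CDF at those quantiles.
    return np.quantile(oh, q)
```

For a model with a constant +2 bias over the historical period, the mapping removes that bias from future values; in practice the correction is applied per grid cell and per variable before feeding the downscaled series into the FWI calculation.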
Ajore, Ram; Raiser, David; McConkey, Marie; Jöud, Magnus; Boidol, Bernd; Mar, Brenton; Saksena, Gordon; Weinstock, David M; Armstrong, Scott; Ellis, Steven R; Ebert, Benjamin L; Nilsson, Björn
2017-04-01
Heterozygous inactivating mutations in ribosomal protein genes (RPGs) are associated with hematopoietic and developmental abnormalities, activation of p53, and altered risk of cancer in humans and model organisms. Here we performed a large-scale analysis of cancer genome data to examine the frequency and selective pressure of RPG lesions across human cancers. We found that hemizygous RPG deletions are common, occurring in about 43% of 10,744 cancer specimens and cell lines. Consistent with p53-dependent negative selection, such lesions are underrepresented in TP53-intact tumors (P ≪ 10⁻¹⁰), and shRNA-mediated knockdown of RPGs activated p53 in TP53-wild-type cells. In contrast, we did not see negative selection of RPG deletions in TP53-mutant tumors. RPGs are conserved with respect to homozygous deletions, and shRNA screening data from 174 cell lines demonstrate that further suppression of hemizygously deleted RPGs inhibits cell growth. Our results establish RPG haploinsufficiency as a strikingly common vulnerability of human cancers that associates with TP53 mutations and could be targetable therapeutically. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.
Has Madagascar lost its exceptional leptospirosis free-like status?
Ratsitorahina, Maherisoa; Rahelinirina, Soanandrasana; Michault, Alain; Rajerison, Minoarisoa; Rajatonirina, Soatiana; Richard, Vincent
2015-01-01
Leptospirosis is a widespread but underreported cause of morbidity and mortality. It has rarely been reported in either humans or animals in Madagascar. We conducted a cross-sectional survey of the inhabitants of Moramanga, Madagascar, in June 2011 to estimate the prevalence of human infection using the microscopic agglutination test (MAT). This activity was carried out as part of a workshop implemented by the Pasteur Institute of Madagascar, focusing on surveillance, with a one-week field study targeting district-level health staff. In total, we sampled 678 inhabitants from 263 households. The sex ratio (M/F) was 0.65 and the mean age was 26.7 years. We obtained a value of 2.9% for the first recorded seroprevalence of this disease in the human community of Moramanga. Questionnaire responses revealed frequent contact between humans and rodents in Moramanga. However, activities involving cattle were identified as a risk factor significantly associated with seropositivity (OR=3). Leptospirosis remains a neglected disease in Madagascar. This study highlights the need to quantify the public health impact of this neglected disease on a larger scale, across the whole country, and to establish point-of-care laboratories in remote areas.
Target charging in short-pulse-laser-plasma experiments.
Dubois, J-L; Lubrano-Lavaderci, F; Raffestin, D; Ribolzi, J; Gazave, J; Compant La Fontaine, A; d'Humières, E; Hulin, S; Nicolaï, Ph; Poyé, A; Tikhonchuk, V T
2014-01-01
Interaction of high-intensity laser pulses with solid targets results in the generation of large quantities of energetic electrons, which are the origin of various effects such as intense x-ray emission, ion acceleration, and so on. Some of these electrons escape the target, leaving behind a significant positive electric charge and creating a strong electromagnetic pulse long after the end of the laser pulse. We propose here a detailed model of the target electric polarization induced by a short and intense laser pulse and an escaping electron bunch. A specially designed experiment provides direct measurements of the target polarization and the discharge current as a function of the laser energy, pulse duration, and target size. Large-scale numerical simulations describe the energetic electron generation and their emission from the target. The model, experiment, and numerical simulations demonstrate that the hot-electron ejection may continue long after the laser pulse ends, significantly enhancing the polarization charge.
Wilson, Kris; Mole, Damian J; Homer, Natalie Z M; Iredale, John P; Auer, Manfred; Webster, Scott P
2015-02-01
Human kynurenine 3-monooxygenase (KMO) is emerging as an important drug target enzyme in a number of inflammatory and neurodegenerative disease states. Recombinant protein production of KMO, and therefore discovery of KMO ligands, is challenging due to a large membrane targeting domain at the C-terminus of the enzyme that causes stability, solubility, and purification difficulties. The purpose of our investigation was to develop a suitable screening method for targeting human KMO and other similarly challenging drug targets. Here, we report the development of a magnetic bead-based binding assay using mass spectrometry detection for human KMO protein. The assay incorporates isolation of FLAG-tagged KMO enzyme on protein A magnetic beads. The protein-bound beads are incubated with potential binding compounds before specific cleavage of the protein-compound complexes from the beads. Mass spectrometry analysis is used to identify the compounds that demonstrate specific binding affinity for the target protein. The technique was validated using known inhibitors of KMO. This assay is a robust alternative to traditional ligand-binding assays for challenging protein targets, and it overcomes specific difficulties associated with isolating human KMO. © 2014 Society for Laboratory Automation and Screening.
Incorporating human-water dynamics in a hyper-resolution land surface model
NASA Astrophysics Data System (ADS)
Vergopolan, N.; Chaney, N.; Wanders, N.; Sheffield, J.; Wood, E. F.
2017-12-01
The increasing demand for water, energy, and food is leading to unsustainable groundwater and surface water exploitation. As a result, human interactions with the environment, through alteration of land and water resources dynamics, need to be reflected in hydrologic and land surface models (LSMs). Advancements in representing human-water dynamics still leave challenges related to the lack of water use data, water allocation algorithms, and modeling scales. This leads to an over-simplistic representation of human water use in large-scale models; this in turn leads to an inability to capture the signatures of extreme events and to provide reliable information at stakeholder-level spatial scales. The emergence of hyper-resolution models allows one to address these challenges by simulating the hydrological processes and interactions with the human impacts at field scales. We integrated human-water dynamics into HydroBlocks - a hyper-resolution, field-scale resolving LSM. HydroBlocks explicitly solves the field-scale spatial heterogeneity of land surface processes through interacting hydrologic response units (HRUs); its HRU-based model parallelization allows computationally efficient long-term simulations as well as ensemble predictions. The implemented human-water dynamics include groundwater and surface water abstraction to meet agricultural, domestic, and industrial water demands. Furthermore, a supply-demand water allocation scheme based on relative costs helps to determine sectoral water use requirements and tradeoffs. A set of HydroBlocks simulations over the Midwest United States (daily, at 30-m spatial resolution for 30 years) is used to quantify the impacts of irrigation on water availability. The model captures large reductions in total soil moisture and water table levels, as well as spatiotemporal changes in evapotranspiration and runoff peaks, with their intensity related to the adopted water management strategy.
By incorporating human-water dynamics in a hyper-resolution LSM this work allows for progress on hydrological monitoring and predictions, as well as drought preparedness and water impact assessments at relevant decision-making scales.
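The cost-based supply-demand allocation mentioned in the abstract above can be illustrated with a toy greedy scheme: demands are met from the cheapest available source first, and the remainder is reported as unmet. All sector names, volumes, and costs here are hypothetical illustration values, not HydroBlocks code or parameters.

```python
def allocate_water(demands, sources):
    """Greedily satisfy sectoral demands from the cheapest source first.

    demands: dict of sector -> required volume
    sources: iterable of (name, available_volume, relative_cost)
    Returns ({sector: {source: volume}}, {sector: unmet volume}).
    """
    # Sort sources so the cheapest water is drawn down first.
    pool = sorted((list(s) for s in sources), key=lambda s: s[2])
    allocations, unmet = {}, {}
    for sector, need in demands.items():
        allocations[sector] = {}
        for src in pool:
            if need <= 0:
                break
            take = min(need, src[1])
            if take > 0:
                allocations[sector][src[0]] = take
                src[1] -= take   # deplete the source
                need -= take
        unmet[sector] = need
    return allocations, unmet

demands = {"agriculture": 80.0, "domestic": 20.0, "industry": 10.0}
sources = [("surface", 60.0, 1.0), ("groundwater", 40.0, 2.5)]
alloc, unmet = allocate_water(demands, sources)
```

A real allocation scheme would also weigh sectoral priorities and pumping constraints; this sketch only shows the relative-cost ordering idea.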
Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome
Margulies, Elliott H.; Cooper, Gregory M.; Asimenos, George; Thomas, Daryl J.; Dewey, Colin N.; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S.; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I.; Löytynoja, Ari; Whelan, Simon; Pardi, Fabio; Massingham, Tim; Brown, James B.; Bickel, Peter; Holmes, Ian; Mullikin, James C.; Ureta-Vidal, Abel; Paten, Benedict; Stone, Eric A.; Rosenbloom, Kate R.; Kent, W. James; Bouffard, Gerard G.; Guan, Xiaobin; Hansen, Nancy F.; Idol, Jacquelyn R.; Maduro, Valerie V.B.; Maskeri, Baishali; McDowell, Jennifer C.; Park, Morgan; Thomas, Pamela J.; Young, Alice C.; Blakesley, Robert W.; Muzny, Donna M.; Sodergren, Erica; Wheeler, David A.; Worley, Kim C.; Jiang, Huaiyang; Weinstock, George M.; Gibbs, Richard A.; Graves, Tina; Fulton, Robert; Mardis, Elaine R.; Wilson, Richard K.; Clamp, Michele; Cuff, James; Gnerre, Sante; Jaffe, David B.; Chang, Jean L.; Lindblad-Toh, Kerstin; Lander, Eric S.; Hinrichs, Angie; Trumbower, Heather; Clawson, Hiram; Zweig, Ann; Kuhn, Robert M.; Barber, Galt; Harte, Rachel; Karolchik, Donna; Field, Matthew A.; Moore, Richard A.; Matthewson, Carrie A.; Schein, Jacqueline E.; Marra, Marco A.; Antonarakis, Stylianos E.; Batzoglou, Serafim; Goldman, Nick; Hardison, Ross; Haussler, David; Miller, Webb; Pachter, Lior; Green, Eric D.; Sidow, Arend
2007-01-01
A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequence coverage), and specificity (alignment accuracy). We describe the quantitative and qualitative trade-offs concomitant with alignment method choice and the levels of technical error that need to be accounted for in applications that require multisequence alignments. Using the generated alignments, we identified constrained regions using three different methods. While the different constraint-detecting methods are in general agreement, there are important discrepancies relating to both the underlying alignments and the specific algorithms. However, by integrating the results across the alignments and constraint-detecting methods, we produced constraint annotations that were found to be robust based on multiple independent measures. Analyses of these annotations illustrate that most classes of experimentally annotated functional elements are enriched for constrained sequences; however, large portions of each class (with the exception of protein-coding sequences) do not overlap constrained regions. The latter elements might not be under primary sequence constraint, might not be constrained across all mammals, or might have expendable molecular functions. Conversely, 40% of the constrained sequences do not overlap any of the functional elements that have been experimentally identified. Together, these findings demonstrate and quantify how many genomic functional elements await basic molecular characterization. PMID:17567995
A high-throughput shotgun mutagenesis approach to mapping B-cell antibody epitopes.
Davidson, Edgar; Doranz, Benjamin J
2014-09-01
Characterizing the binding sites of monoclonal antibodies (mAbs) on protein targets, their 'epitopes', can aid in the discovery and development of new therapeutics, diagnostics and vaccines. However, the speed of epitope mapping techniques has not kept pace with the increasingly large numbers of mAbs being isolated. Obtaining detailed epitope maps for functionally relevant antibodies can be challenging, particularly for conformational epitopes on structurally complex proteins. To enable rapid epitope mapping, we developed a high-throughput strategy, shotgun mutagenesis, that enables the identification of both linear and conformational epitopes in a fraction of the time required by conventional approaches. Shotgun mutagenesis epitope mapping is based on large-scale mutagenesis and rapid cellular testing of natively folded proteins. Hundreds of mutant plasmids are individually cloned, arrayed in 384-well microplates, expressed within human cells, and tested for mAb reactivity. Residues are identified as a component of a mAb epitope if their mutation (e.g. to alanine) does not support candidate mAb binding but does support that of other conformational mAbs or allows full protein function. Shotgun mutagenesis is particularly suited for studying structurally complex proteins because targets are expressed in their native form directly within human cells. Shotgun mutagenesis has been used to delineate hundreds of epitopes on a variety of proteins, including G protein-coupled receptor and viral envelope proteins. The epitopes mapped on dengue virus prM/E represent one of the largest collections of epitope information for any viral protein, and results are being used to design better vaccines and drugs. © 2014 John Wiley & Sons Ltd.
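The residue-calling rule stated in the abstract above can be sketched as a small filter: a residue belongs to the candidate mAb's epitope if its mutation abolishes candidate binding while control conformational mAbs still bind (evidence the mutant is folded). The reactivity values, mAb names, and the 0.3 cutoff are hypothetical illustration choices, not the published assay parameters.

```python
def call_epitope_residues(reactivity, candidate, controls, cutoff=0.3):
    """reactivity: {residue: {mab: binding relative to wild type}}."""
    epitope = []
    for residue, mabs in reactivity.items():
        lost_candidate = mabs[candidate] < cutoff     # candidate binding lost
        folded = all(mabs[c] >= cutoff for c in controls)  # mutant still folded
        if lost_candidate and folded:
            epitope.append(residue)
    return sorted(epitope)

data = {
    "E126": {"mAb-X": 0.05, "ctrl-1": 0.9, "ctrl-2": 0.8},  # epitope residue
    "K160": {"mAb-X": 0.10, "ctrl-1": 0.1, "ctrl-2": 0.2},  # globally misfolded
    "D203": {"mAb-X": 0.95, "ctrl-1": 1.0, "ctrl-2": 0.9},  # binding retained
}
hits = call_epitope_residues(data, "mAb-X", ["ctrl-1", "ctrl-2"])
```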
Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...
2015-11-17
The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.
Stable isotope-resolved metabolomics and applications for drug development
Fan, Teresa W-M.; Lorkiewicz, Pawel; Sellers, Katherine; Moseley, Hunter N.B.; Higashi, Richard M.; Lane, Andrew N.
2012-01-01
Advances in analytical methodologies, principally nuclear magnetic resonance spectroscopy (NMR) and mass spectrometry (MS), during the last decade have made large-scale analysis of the human metabolome a reality. This is leading to the reawakening of the importance of metabolism in human diseases, particularly cancer. The metabolome is the functional readout of the genome, functional genome, and proteome; it is also an integral partner in molecular regulations for homeostasis. The interrogation of the metabolome, or metabolomics, is now being applied to numerous diseases, largely by metabolite profiling for biomarker discovery, but also in pharmacology and therapeutics. Recent advances in stable isotope tracer-based metabolomic approaches enable unambiguous tracking of individual atoms through compartmentalized metabolic networks directly in human subjects, which promises to decipher the complexity of the human metabolome at an unprecedented pace. This knowledge will revolutionize our understanding of complex human diseases, clinical diagnostics, as well as individualized therapeutics and drug response. In this review, we focus on the use of stable isotope tracers with metabolomics technologies for understanding metabolic network dynamics in both model systems and in clinical applications. Atom-resolved isotope tracing via the two major analytical platforms, NMR and MS, has the power to determine novel metabolic reprogramming in diseases, discover new drug targets, and facilitates ADME studies. We also illustrate new metabolic tracer-based imaging technologies, which enable direct visualization of metabolic processes in vivo. We further outline current practices and future requirements for biochemoinformatics development, which is an integral part of translating stable isotope-resolved metabolomics into clinical reality. PMID:22212615
Validation of a Cost-Efficient Multi-Purpose SNP Panel for Disease Based Research
Hou, Liping; Phillips, Christopher; Azaro, Marco; Brzustowicz, Linda M.; Bartlett, Christopher W.
2011-01-01
Background Here we present convergent methodologies using theoretical calculations, empirical assessment on in-house and publicly available datasets, as well as in silico simulations, that validate a panel of SNPs for a variety of necessary tasks in human genetics disease research before resources are committed to larger-scale genotyping studies on those samples. While large-scale, well-funded human genetic studies routinely have up to a million SNP genotypes, samples in a human genetics laboratory that are not yet part of such studies may be productively utilized in pilot projects or as part of targeted follow-up work, though such smaller-scale applications require at least some genome-wide genotype data for quality control purposes, such as DNA “barcoding” to detect swaps or contamination issues, determining familial relationships between samples, and correcting biases due to population effects such as population stratification in pilot studies. Principal Findings Empirical performance in classification of relative types for any two given DNA samples (e.g., full siblings, parental, etc.) indicated that for outbred populations the panel performs sufficiently well to classify relationships in extended families, and therefore also for smaller structures such as trios and for twin zygosity testing. Additionally, familial relationships do not significantly diminish the (mean match) probability of sharing SNP genotypes in pedigrees, further indicating the uniqueness of the “barcode.” Simulation using these SNPs for an African American case-control disease association study demonstrated that population stratification, even in complex admixed samples, can be adequately corrected under a range of disease models using the SNP panel. Conclusion The panel has been validated for use in a variety of human disease genetics research tasks, including sample barcoding, relationship verification, population substructure detection, and statistical correction.
Given the ease of genotyping our specific assay contained herein, this panel represents a useful and economical panel for human geneticists. PMID:21611176
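The "barcode" uniqueness argument above rests on a standard calculation: under Hardy-Weinberg equilibrium, the probability that two unrelated individuals share a genotype at one SNP is the sum of squared genotype frequencies, and panel-wide uniqueness is the product across independent SNPs. The sketch below uses illustrative allele frequencies, not the paper's actual panel.

```python
def match_probability(minor_allele_freqs):
    """Probability that two unrelated individuals share all genotypes."""
    prob = 1.0
    for p in minor_allele_freqs:
        q = 1.0 - p
        genotype_freqs = (p * p, 2.0 * p * q, q * q)  # Hardy-Weinberg
        prob *= sum(g * g for g in genotype_freqs)
    return prob

# Even 24 maximally informative SNPs (p = 0.5) drive the random match
# probability to (3/8)**24, i.e. well below one in a billion.
panel = [0.5] * 24
rmp = match_probability(panel)
```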
Colonna, William; Brehm-Stecher, Byron; Shetty, Kalidas; Pometto, Anthony
2017-12-01
This study focused on advancing a rapid turbidimetric bioassay to screen antimicrobials using specific cocktails of targeted foodborne bacterial pathogens. Specifically, to show the relevance of this rapid screening tool, the antimicrobial potential of generally-recognized-as-safe calcium diacetate (DAX) and its blends with cranberry (NC) and oregano (OX) natural extracts was evaluated. Furthermore, the same extracts were evaluated against beneficial lactic acid bacteria. The targeted foodborne pathogens evaluated were Escherichia coli O157:H7, Salmonella spp., Listeria monocytogenes, and Staphylococcus aureus, using optimized initial cocktails (∼10⁸ colony-forming units/mL) containing strains isolated from human food outbreaks. Of all extracts evaluated, 0.51% (w/v) DAX in ethanol was the most effective against all four pathogens. However, DAX reduced to 0.26% and combined with ethanol-extracted blends of DAX:OX (3:1) slightly outperformed or equaled the same level of DAX alone. Subculture of wells in which no growth occurred after 1 week indicated that all water and ethanol extracts were bacteriostatic against the pathogens tested. All the targeted antimicrobials had no effect on the probiotic organism Lactobacillus plantarum. The use of such rapid screening methods, combined with the use of multistrain cocktails of targeted foodborne pathogens from outbreaks, will allow rapid large-scale screening of antimicrobials and enable further detailed studies in targeted model food systems.
E-TALEN: a web tool to design TALENs for genome engineering.
Heigwer, Florian; Kerr, Grainne; Walther, Nike; Glaeser, Kathrin; Pelz, Oliver; Breinig, Marco; Boutros, Michael
2013-11-01
Use of transcription activator-like effector nucleases (TALENs) is a promising new technique in the field of targeted genome engineering, editing and reverse genetics. Its applications span from introducing knockout mutations to endogenous tagging of proteins and targeted excision repair. Owing to this wide range of possible applications, there is a need for fast and user-friendly TALEN design tools. We developed E-TALEN (http://www.e-talen.org), a web-based tool to design TALENs for experiments of varying scale. E-TALEN enables the design of TALENs against a single target or a large number of target genes. We significantly extended previously published design concepts to consider genomic context and different applications. E-TALEN guides the user through an end-to-end design process of de novo TALEN pairs, which are specific to a certain sequence or genomic locus. Furthermore, E-TALEN offers a functionality to predict targeting and specificity for existing TALENs. Owing to the computational complexity of many of the steps in the design of TALENs, particular emphasis has been put on the implementation of fast yet accurate algorithms. We implemented a user-friendly interface, from the input parameters to the presentation of results. An additional feature of E-TALEN is the in-built sequence and annotation database available for many organisms, including human, mouse, zebrafish, Drosophila and Arabidopsis, which can be extended in the future.
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-01-01
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe the MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
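The "stringent filtering constraints" mentioned above typically include length, GC content, and melting temperature checks applied to each candidate before any homology testing. The sketch below uses common textbook thresholds and the simple Wallace melting-temperature rule; these are illustrative assumptions, not MRPrimerW's actual parameters or algorithm.

```python
def wallace_tm(seq):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), adequate for short oligos."""
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def passes_filters(seq, length=(18, 25), gc=(40.0, 60.0), tm=(50.0, 65.0)):
    """Keep a candidate primer only if length, GC%, and Tm are in range."""
    n = len(seq)
    if not length[0] <= n <= length[1]:
        return False
    gc_pct = 100.0 * (seq.count("G") + seq.count("C")) / n
    if not gc[0] <= gc_pct <= gc[1]:
        return False
    return tm[0] <= wallace_tm(seq) <= tm[1]

candidates = ["ATGCGTACGTTAGCATGCAT",  # balanced 20-mer
              "AAAAAAAATTTTTTTTAA",    # fails GC content
              "GCGCGCGCGCGCGCGCGC"]    # fails GC content (too high)
kept = [s for s in candidates if passes_filters(s)]
```

A production pipeline would add further checks (self-dimers, hairpins, 3' stability) and then the genome-wide homology test that dominates the computational cost.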
Molecular inversion probe assay.
Absalan, Farnaz; Ronaghi, Mostafa
2007-01-01
We have described molecular inversion probe technologies for large-scale genetic analyses. This technique provides a comprehensive and powerful tool for the analysis of genetic variation and enables affordable, large-scale studies that will help uncover the genetic basis of complex disease and explain the individual variation in response to therapeutics. Major applications of the molecular inversion probes (MIP) technologies include targeted genotyping from focused regions to whole-genome studies, and allele quantification of genomic rearrangements. The MIP technology (used in the HapMap project) provides an efficient, scalable, and affordable way to score polymorphisms in case/control populations for genetic studies. The MIP technology provides the highest commercially available multiplexing levels and assay conversion rates for targeted genotyping. This enables more informative, genome-wide studies with either the functional (direct detection) approach or the indirect detection approach.
ERIC Educational Resources Information Center
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castilla, Veronica
2016-01-01
Recently, statisticians have begun developing methods to improve the generalizability of results from large-scale experiments in education. This work has included the development of methods for improved site selection when random sampling is infeasible, including the use of stratification and targeted recruitment strategies. This article provides…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepesheva, Galina I.; Park, Hee-Won; Hargrove, Tatiana Y.
2010-01-25
Sterol 14α-demethylase (14DM, the CYP51 family of cytochrome P450) is an essential enzyme in sterol biosynthesis in eukaryotes. It serves as a major drug target for fungal diseases and can potentially become a target for treatment of human infections with protozoa. Here we present 1.9 Å resolution crystal structures of 14DM from the protozoan pathogen Trypanosoma brucei, ligand-free and complexed with a strong chemically selected inhibitor N-1-(2,4-dichlorophenyl)-2-(1H-imidazol-1-yl)ethyl-4-(5-phenyl-1,3,4-oxadiazol-2-yl)benzamide that we previously found to produce potent antiparasitic effects in Trypanosomatidae. This is the first structure of a eukaryotic microsomal 14DM that acts on sterol biosynthesis, and it differs profoundly from that of the water-soluble CYP51 family member from Mycobacterium tuberculosis, both in organization of the active site cavity and in the substrate access channel location. Inhibitor binding does not cause large scale conformational rearrangements, yet induces unanticipated local alterations in the active site, including formation of a hydrogen bond network that connects, via the inhibitor amide group fragment, two remote functionally essential protein segments and alters the heme environment. The inhibitor binding mode provides a possible explanation for both its functionally irreversible effect on the enzyme activity and its selectivity toward the 14DM from human pathogens versus the human 14DM ortholog. The structures shed new light on 14DM functional conservation and open an excellent opportunity for directed design of novel antiparasitic drugs.
Living Long and Well: Prospects for a Personalized Approach to the Medicine of Ageing.
Fuellen, Georg; Schofield, Paul; Flatt, Thomas; Schulz, Ralf-Joachim; Boege, Fritz; Kraft, Karin; Rimbach, Gerald; Ibrahim, Saleh; Tietz, Alexander; Schmidt, Christian; Köhling, Rüdiger; Simm, Andreas
2016-01-01
Research into ageing and its underlying molecular basis enables us to develop and implement targeted interventions to ameliorate or cure its consequences. However, the efficacy of interventions often differs widely between individuals, suggesting that populations should be stratified or even individualized. Large-scale cohort studies in humans, similar systematic studies in model organisms as well as detailed investigations into the biology of ageing can provide individual validated biomarkers and mechanisms, leading to recommendations for targeted interventions. Human cohort studies are already ongoing, and they can be supplemented by in silico simulations. Systematic studies in animal models are made possible by the use of inbred strains or genetic reference populations of mice. Combining the two, a comprehensive picture of the various determinants of ageing and 'health span' can be studied in detail, and an appreciation of the relevance of results from model organisms to humans is emerging. The interactions between genotype and environment, particularly the psychosocial environment, are poorly studied in both humans and model organisms, presenting serious challenges to any approach to a personalized medicine of ageing. To increase the success of preventive interventions, we argue that there is a pressing need for an individualized evaluation of interventions such as physical exercise, nutrition, nutraceuticals and calorie restriction mimetics as well as psychosocial and environmental factors, separately and in combination. The expected extension of the health span enables us to refocus health care spending on individual prevention, starting in late adulthood, and on the brief period of morbidity at very old age. © 2015 S. Karger AG, Basel.
Unfolding large-scale online collaborative human dynamics
Zha, Yilong; Zhou, Tao; Zhou, Changsong
2016-01-01
Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanism is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
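The first two modules of the model above can be sketched as a minimal simulation: actions are initiated at Poissonian (exponentially spaced) times, and each action may trigger a cascade of responses whose waiting times follow a power law (sampled by inverse transform). All parameter values are illustrative; the paper's analytical treatment, including the population-growth module, is richer than this sketch.

```python
import random

def simulate_update_times(t_max, rate=0.1, respond_p=0.5,
                          alpha=2.0, t_min=1.0, seed=7):
    """Return sorted event times from initiation plus cascading responses."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        # Module (i): exponential gaps, i.e. Poissonian initiation.
        t += rng.expovariate(rate)
        if t >= t_max:
            break
        events.append(t)
        # Module (ii): cascading responses; each waiting time w is drawn
        # by inverse transform so its density decays like w**(-alpha).
        s = t
        while rng.random() < respond_p:
            u = 1.0 - rng.random()          # u in (0, 1]
            s += t_min * u ** (-1.0 / (alpha - 1.0))
            if s >= t_max:
                break
            events.append(s)
    return sorted(events)

times = simulate_update_times(t_max=10000.0)
gaps = [b - a for a, b in zip(times, times[1:])]
```

A histogram of `gaps` from such a mixture shows two regimes, short response-driven intervals and long initiation-driven ones, which is the qualitative shape of the double power law reported for Wikipedia updates.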
Mitochondrial Targets for Pharmacological Intervention in Human Disease
2015-01-01
Over the past several years, mitochondrial dysfunction has been linked to an increasing number of human illnesses, making mitochondrial proteins (MPs) an ever more appealing target for therapeutic intervention. With 20% of the mitochondrial proteome (312 of an estimated 1500 MPs) having known interactions with small molecules, MPs appear to be highly targetable. Yet, despite these targeted proteins functioning in a range of biological processes (including induction of apoptosis, calcium homeostasis, and metabolism), very few of the compounds targeting MPs find clinical use. Recent work has greatly expanded the number of proteins known to localize to the mitochondria and has generated a considerable increase in MP 3D structures available in public databases, allowing experimental screening and in silico prediction of mitochondrial drug targets on an unprecedented scale. Here, we summarize the current literature on clinically active drugs that target MPs, with a focus on how existing drug targets are distributed across biochemical pathways and organelle substructures. Also, we examine current strategies for mitochondrial drug discovery, focusing on genetic, proteomic, and chemogenomic assays, and relevant model systems. As cell models and screening techniques improve, MPs appear poised to emerge as relevant targets for a wide range of complex human diseases, an eventuality that can be expedited through systematic analysis of MP function. PMID:25367773
Honda, Yoshitomo; Ding, Xianting; Mussano, Federico; Wiberg, Akira; Ho, Chih-Ming; Nishimura, Ichiro
2013-12-05
Stem cell-based disease modeling presents unique opportunities for mechanistic elucidation and therapeutic targeting. The stable induction of fate-specific differentiation is an essential prerequisite for stem cell-based strategy. Bone morphogenetic protein 2 (BMP-2) initiates receptor-regulated Smad phosphorylation, leading to the osteogenic differentiation of mesenchymal stromal/stem cells (MSC) in vitro; however, it requires supra-physiological concentrations, presenting a bottleneck problem for large-scale drug screening. Here, we report the use of a double-objective feedback system control (FSC) with a differential evolution (DE) algorithm to identify osteogenic cocktails of extrinsic factors. Cocktails containing significantly reduced doses of BMP-2 in combination with physiologically relevant doses of dexamethasone, ascorbic acid, beta-glycerophosphate, heparin, retinoic acid and vitamin D achieved accelerated in vitro mineralization of mouse and human MSC. These results provide insight into constructive approaches of FSC to determine the applicable functional and physiological environment for MSC in disease modeling, drug screening and tissue engineering.
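The differential evolution (DE) algorithm named above can be illustrated with a minimal implementation: candidate dose vectors are mutated, recombined, and kept only when the trial scores better. The quadratic objective below stands in for the real mineralization readout, and all bounds and control parameters are illustrative assumptions, not the study's actual FSC setup.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, f=0.8, cr=0.9,
                           generations=100, seed=3):
    """Minimize objective over a box via classic DE/rand/1/bin."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = []
            for d, (lo, hi) in enumerate(bounds):
                if rng.random() < cr:
                    v = pop[a][d] + f * (pop[b][d] - pop[c][d])
                    trial.append(min(max(v, lo), hi))  # clip to dose bounds
                else:
                    trial.append(pop[i][d])
            s = objective(trial)
            if s < scores[i]:  # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy objective: distance to a hypothetical optimal dose combination.
target = [0.2, 0.8, 0.5]
obj = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
best_x, best_s = differential_evolution(obj, [(0.0, 1.0)] * 3)
```

In the FSC setting, the objective evaluation is an actual cell-culture experiment rather than a function call, which is why minimizing the number of evaluations matters.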
Alfonse, Lauren E; Garrett, Amanda D; Lun, Desmond S; Duffy, Ken R; Grgicak, Catherine M
2018-01-01
DNA-based human identity testing is conducted by comparison of PCR-amplified polymorphic Short Tandem Repeat (STR) motifs from a known source with the STR profiles obtained from uncertain sources. Samples such as those found at crime scenes often result in signal that is a composite of incomplete STR profiles from an unknown number of unknown contributors, making interpretation an arduous task. To facilitate advancement in STR interpretation challenges we provide over 25,000 multiplex STR profiles produced from one to five known individuals at target levels ranging from one to 160 copies of DNA. The data, generated under 144 laboratory conditions, are classified by total copy number and contributor proportions. For the 70% of samples that were synthetically compromised, we report the level of DNA damage using quantitative and end-point PCR. In addition, we characterize the complexity of the signal by exploring the number of detected alleles in each profile. Copyright © 2017 Elsevier B.V. All rights reserved.
Hippo/YAP-mediated rigidity-dependent motor neuron differentiation of human pluripotent stem cells
NASA Astrophysics Data System (ADS)
Sun, Yubing; Yong, Koh Meng Aw; Villa-Diaz, Luis G.; Zhang, Xiaoli; Chen, Weiqiang; Philson, Renee; Weng, Shinuo; Xu, Haoxing; Krebsbach, Paul H.; Fu, Jianping
2014-06-01
Our understanding of the intrinsic mechanosensitive properties of human pluripotent stem cells (hPSCs), in particular the effects that the physical microenvironment has on their differentiation, remains elusive. Here, we show that neural induction and caudalization of hPSCs can be accelerated by using a synthetic microengineered substrate system consisting of poly(dimethylsiloxane) micropost arrays (PMAs) with tunable mechanical rigidities. The purity and yield of functional motor neurons derived from hPSCs within 23 days of culture using soft PMAs were improved more than fourfold and tenfold, respectively, compared with coverslips or rigid PMAs. Mechanistic studies revealed a multi-targeted mechanotransductive process involving Smad phosphorylation and nucleocytoplasmic shuttling, regulated by rigidity-dependent Hippo/YAP activities and actomyosin cytoskeleton integrity and contractility. Our findings suggest that substrate rigidity is an important biophysical cue influencing neural induction and subtype specification, and that microengineered substrates can thus serve as a promising platform for large-scale culture of hPSCs.
High-throughput screening of a CRISPR/Cas9 library for functional genomics in human cells.
Zhou, Yuexin; Zhu, Shiyou; Cai, Changzu; Yuan, Pengfei; Li, Chunmei; Huang, Yanyi; Wei, Wensheng
2014-05-22
Targeted genome editing technologies are powerful tools for studying biology and disease, and have a broad range of research applications. In contrast to the rapid development of toolkits to manipulate individual genes, large-scale screening methods based on the complete loss of gene expression are only now beginning to be developed. Here we report the development of a focused CRISPR/Cas-based (clustered regularly interspaced short palindromic repeats/CRISPR-associated) lentiviral library in human cells and a method of gene identification based on functional screening and high-throughput sequencing analysis. Using knockout library screens, we successfully identified the host genes essential for the intoxication of cells by anthrax and diphtheria toxins, which were confirmed by functional validation. The broad application of this powerful genetic screening strategy will not only facilitate the rapid identification of genes important for bacterial toxicity but will also enable the discovery of genes that participate in other biological processes.
Honda, Yoshitomo; Ding, Xianting; Mussano, Federico; Wiberg, Akira; Ho, Chih-ming; Nishimura, Ichiro
2013-01-01
Stem cell-based disease modeling presents unique opportunities for mechanistic elucidation and therapeutic targeting. The stable induction of fate-specific differentiation is an essential prerequisite for stem cell-based strategy. Bone morphogenetic protein 2 (BMP-2) initiates receptor-regulated Smad phosphorylation, leading to the osteogenic differentiation of mesenchymal stromal/stem cells (MSC) in vitro; however, it requires supra-physiological concentrations, presenting a bottleneck problem for large-scale drug screening. Here, we report the use of a double-objective feedback system control (FSC) with a differential evolution (DE) algorithm to identify osteogenic cocktails of extrinsic factors. Cocktails containing significantly reduced doses of BMP-2 in combination with physiologically relevant doses of dexamethasone, ascorbic acid, beta-glycerophosphate, heparin, retinoic acid and vitamin D achieved accelerated in vitro mineralization of mouse and human MSC. These results provide insight into constructive approaches of FSC to determine the applicable functional and physiological environment for MSC in disease modeling, drug screening and tissue engineering. PMID:24305548
Manor, Ohad; Borenstein, Elhanan
2017-02-08
Comparative analyses of the human microbiome have identified both taxonomic and functional shifts that are associated with numerous diseases. To date, however, microbiome taxonomy and function have mostly been studied independently and the taxonomic drivers of functional imbalances have not been systematically identified. Here, we present FishTaco, an analytical and computational framework that integrates taxonomic and functional comparative analyses to accurately quantify taxon-level contributions to disease-associated functional shifts. Applying FishTaco to several large-scale metagenomic cohorts, we show that shifts in the microbiome's functional capacity can be traced back to specific taxa. Furthermore, the set of taxa driving functional shifts and their contribution levels vary markedly between functions. We additionally find that similar functional imbalances in different diseases are driven by both disease-specific and shared taxa. Such integrated analysis of microbiome ecological and functional dynamics can inform future microbiome-based therapy, pinpointing putative intervention targets for manipulating the microbiome's functional capacity. Copyright © 2017 Elsevier Inc. All rights reserved.
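The attribution idea underlying this kind of analysis can be illustrated with a toy decomposition: a function's community-wide abundance written as the sum over taxa of (taxon abundance × that taxon's copy number of the function). All numbers and names below are invented, and FishTaco's actual contribution analysis is considerably more sophisticated than this naive product:

```python
# Toy sketch of taxon-level functional attribution. Taxa, the KO identifier,
# and all abundances are hypothetical; this is not FishTaco's algorithm.

def functional_abundance(taxa_abundance, copy_number, function):
    """Return per-taxon contributions to a function's abundance and the total.

    taxa_abundance: dict taxon -> relative abundance
    copy_number:    dict taxon -> {function -> genomic copy number}
    """
    contrib = {t: taxa_abundance[t] * copy_number[t].get(function, 0)
               for t in taxa_abundance}
    return contrib, sum(contrib.values())

abund = {"Bacteroides": 0.5, "Prevotella": 0.3, "E.coli": 0.2}
copies = {"Bacteroides": {"K00001": 2},
          "Prevotella":  {"K00001": 1},
          "E.coli":      {"K00001": 4}}

contrib, total = functional_abundance(abund, copies, "K00001")
print(round(total, 2))  # -> 2.1
```

Comparing such per-taxon terms between case and control cohorts is the intuition behind tracing a disease-associated functional shift back to the taxa that drive it.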
Porciani, David; Cardwell, Leah N; Tawiah, Kwaku D; Alam, Khalid K; Lange, Margaret J; Daniels, Mark A; Burke, Donald H
2018-06-11
Large RNAs and ribonucleoprotein complexes have powerful therapeutic potential, but effective cell-targeted delivery tools are limited. Aptamers that internalize into target cells can deliver siRNAs (<15 kDa, 19-21 nt/strand). We demonstrate a modular nanostructure for cellular delivery of large, functional RNA payloads (50-80 kDa, 175-250 nt) by aptamers that recognize multiple human B cell cancer lines and transferrin receptor-expressing cells. Fluorogenic RNA reporter payloads enable accelerated testing of platform designs and rapid evaluation of assembly and internalization. Modularity is demonstrated by swapping in different targeting and payload aptamers. Both modules internalize into leukemic B cell lines and remain colocalized within endosomes. Fluorescence from internalized RNA persists for ≥2 h, suggesting a sizable window for aptamer payloads to exert influence upon targeted cells. This demonstration of aptamer-mediated, cell-internalizing delivery of large RNAs with retention of functional structure raises the possibility of manipulating endosomes and cells by delivering large aptamers and regulatory RNAs.
Betel, Doron; Koppal, Anjali; Agius, Phaedra; Sander, Chris; Leslie, Christina
2010-01-01
mirSVR is a new machine learning method for ranking microRNA target sites by a down-regulation score. The algorithm trains a regression model on sequence and contextual features extracted from miRanda-predicted target sites. In a large-scale evaluation, miRanda-mirSVR is competitive with other target prediction methods in identifying target genes and predicting the extent of their downregulation at the mRNA or protein levels. Importantly, the method identifies a significant number of experimentally determined non-canonical and non-conserved sites.
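The core idea, training a regression model on site-level features and ranking candidate target sites by a predicted down-regulation score, can be sketched with ordinary least squares on synthetic data. mirSVR itself uses support vector regression on sequence and contextual features of miRanda-predicted sites; the feature matrix, weights, and scores below are entirely illustrative:

```python
# Minimal sketch of score-based target-site ranking: fit a regression model
# on site features, then rank new sites by predicted down-regulation.
# Synthetic data only; not mirSVR's feature set or learning algorithm.
import numpy as np

rng = np.random.default_rng(0)

# 200 training sites x 4 made-up features, with a noisy linear signal.
X = rng.normal(size=(200, 4))
true_w = np.array([0.6, -0.3, 0.2, 0.0])
y = X @ true_w + rng.normal(scale=0.05, size=200)

# Least-squares stand-in for the trained regression model.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Score and rank five candidate sites; more negative score = stronger
# predicted down-regulation, so ascending order lists the strongest first.
candidates = rng.normal(size=(5, 4))
scores = candidates @ w
ranking = np.argsort(scores)
print(ranking)
```

The point of the sketch is the workflow (features in, continuous score out, sites ranked by score) rather than the model class; substituting an RBF-kernel SVR changes the fit, not the ranking logic.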
Drug repurposing: translational pharmacology, chemistry, computers and the clinic.
Issa, Naiem T; Byers, Stephen W; Dakshanamurthy, Sivanesan
2013-01-01
The process of discovering a pharmacological compound that elicits a desired clinical effect with minimal side effects is a challenge. Prior to the advent of high-performance computing and large-scale screening technologies, drug discovery was largely a serendipitous endeavor, as in the case of thalidomide for erythema nodosum leprosum or cancer drugs in general derived from flora located in far-reaching geographic locations. More recently, de novo drug discovery has become a more rationalized process where drug-target-effect hypotheses are formulated on the basis of already known compounds/protein targets and their structures. Although this approach is hypothesis-driven, the actual success has been very low, contributing to the soaring costs of research and development as well as the diminished pharmaceutical pipeline in the United States. In this review, we discuss the evolution in computational pharmacology as the next generation of successful drug discovery and implementation in the clinic where high-performance computing (HPC) is used to generate and validate drug-target-effect hypotheses completely in silico. The use of HPC would decrease development time and errors while increasing productivity prior to in vitro, animal and human testing. We highlight approaches in chemoinformatics, bioinformatics as well as network biopharmacology to illustrate potential avenues from which to design clinically efficacious drugs. We further discuss the implications of combining these approaches into an integrative methodology for high-accuracy computational predictions within the context of drug repositioning for the efficient streamlining of currently approved drugs back into clinical trials for possible new indications.
Automated production of plant-based vaccines and pharmaceuticals.
Wirz, Holger; Sauer-Budge, Alexis F; Briggs, John; Sharpe, Aaron; Shu, Sudong; Sharon, Andre
2012-12-01
A fully automated "factory" was developed that uses tobacco plants to produce large quantities of vaccines and other therapeutic biologics within weeks. This first-of-a-kind factory takes advantage of a plant viral vector technology to produce specific proteins within the leaves of rapidly growing plant biomass. The factory's custom-designed robotic machines plant seeds, nurture the growing plants, introduce a viral vector that directs the plant to produce a target protein, and harvest the biomass once the target protein has accumulated in the plants-all in compliance with Food and Drug Administration (FDA) guidelines (e.g., current Good Manufacturing Practices). The factory was designed to be time, cost, and space efficient. The plants are grown in custom multiplant trays. Robots ride up and down a track, servicing the plants and delivering the trays from the lighted, irrigated growth modules to each processing station as needed. Using preprogrammed robots and processing equipment eliminates the need for human contact, preventing potential contamination of the process and economizing the operation. To quickly produce large quantities of protein-based medicines, we transformed a laboratory-based biological process and scaled it into an industrial process. This enables quick, safe, and cost-effective vaccine production that would be required in case of a pandemic.
The Read-Across Hypothesis and Environmental Risk Assessment of Pharmaceuticals
2013-01-01
Pharmaceuticals in the environment have received increased attention over the past decade, as they are ubiquitous in rivers and waterways. Concentrations are in the sub-ng/L to low μg/L range, well below acute toxic levels, but there are uncertainties regarding the effects of chronic exposures and there is a need to prioritise which pharmaceuticals may be of concern. The read-across hypothesis stipulates that a drug will have an effect in non-target organisms only if the molecular targets such as receptors and enzymes have been conserved, resulting in a (specific) pharmacological effect only if plasma concentrations are similar to human therapeutic concentrations. If this holds true for different classes of pharmaceuticals, it should be possible to predict the potential environmental impact from information obtained during the drug development process. This paper critically reviews the evidence for read-across, and finds that few studies include plasma concentrations and mode of action based effects. Thus, despite a large number of apparently relevant papers and a general acceptance of the hypothesis, there is an absence of documented evidence. There is a need for large-scale studies to generate robust data for testing the read-across hypothesis and developing predictive models, the only feasible approach to protecting the environment. PMID:24006913
Stokes, Emma J.; Strindberg, Samantha; Bakabana, Parfait C.; Elkan, Paul W.; Iyenguet, Fortuné C.; Madzoké, Bola; Malanda, Guy Aimé F.; Mowawa, Brice S.; Moukoumbou, Calixte; Ouakabadio, Franck K.; Rainey, Hugo J.
2010-01-01
Protected areas are fundamental to biodiversity conservation, but there is growing recognition of the need to extend beyond protected areas to meet the ecological requirements of species at larger scales. Landscape-scale conservation requires an evaluation of management impact on biodiversity under different land-use strategies; this is challenging and there exist few empirical studies. In a conservation landscape in northern Republic of Congo we demonstrate the application of a large-scale monitoring program designed to evaluate the impact of conservation interventions on three globally threatened species: western gorillas, chimpanzees and forest elephants, under three land-use types: integral protection, commercial logging, and community-based natural resource management. We applied distance-sampling methods to examine species abundance across different land-use types under varying degrees of management and human disturbance. We found no clear trends in abundance between land-use types. However, units with interventions designed to reduce poaching and protect habitats, irrespective of land-use type, harboured all three species at consistently higher abundance than a neighbouring logging concession undergoing no wildlife management. We applied Generalized-Additive Models to evaluate a priori predictions of species response to different landscape processes. Our results indicate that, given adequate protection from poaching, elephants and gorillas can profit from herbaceous vegetation in recently logged forests and maintain access to ecologically important resources located outside of protected areas. However, proximity to the single integrally protected area in the landscape maintained an overriding positive influence on elephant abundance, and logging roads, even subject to anti-poaching controls, were exploited by elephant poachers and had a major negative influence on elephant distribution.
Chimpanzees show a clear preference for unlogged or more mature forests and human disturbance had a negative influence on chimpanzee abundance, in spite of anti-poaching interventions. We caution against the pitfalls of missing and confounded co-variables in model-based estimation approaches and highlight the importance of spatial scale in the response of different species to landscape processes. We stress the importance of a stratified design-based approach to monitoring species status in response to conservation interventions and advocate a holistic framework for landscape-scale monitoring that includes smaller-scale targeted research and punctual assessment of threats. PMID:20428233
Using Big Data to Understand the Human Condition: The Kavli HUMAN Project.
Azmak, Okan; Bayer, Hannah; Caplin, Andrew; Chun, Miyoung; Glimcher, Paul; Koonin, Steven; Patrinos, Aristides
2015-09-01
Until now, most large-scale studies of humans have either focused on very specific domains of inquiry or have relied on between-subjects approaches. While these previous studies have been invaluable for revealing important biological factors in cardiac health or social factors in retirement choices, no single repository contains anything like a complete record of the health, education, genetics, environmental, and lifestyle profiles of a large group of individuals at the within-subject level. This seems critical today because emerging evidence about the dynamic interplay between biology, behavior, and the environment points to a pressing need for just the kind of large-scale, long-term synoptic dataset that does not yet exist at the within-subject level. At the same time that the need for such a dataset is becoming clear, there is also growing evidence that just such a synoptic dataset may now be obtainable, at least at moderate scale, using contemporary big data approaches. To this end, we introduce the Kavli HUMAN Project (KHP), an effort to aggregate data from 2,500 New York City households in all five boroughs (roughly 10,000 individuals) whose biology and behavior will be measured using an unprecedented array of modalities over 20 years. It will also richly measure environmental conditions and events that KHP members experience using a geographic information system database of unparalleled scale, currently under construction in New York. In this manner, KHP will offer both synoptic and granular views of how human health and behavior coevolve over the life cycle and why they evolve differently for different people. In turn, we argue that this will allow for new discovery-based scientific approaches, rooted in big data analytics, to improving the health and quality of human life, particularly in urban contexts.
NASA Astrophysics Data System (ADS)
Tsagkrasoulis, Dimosthenis; Hysi, Pirro; Spector, Tim; Montana, Giovanni
2017-04-01
The human face is a complex trait under strong genetic control, as evidenced by the striking visual similarity between twins. Nevertheless, heritability estimates of facial traits have often been surprisingly low or difficult to replicate. Furthermore, the construction of facial phenotypes that correspond to naturally perceived facial features remains largely a mystery. We present here a large-scale heritability study of face geometry that aims to address these issues. High-resolution, three-dimensional facial models have been acquired on a cohort of 952 twins recruited from the TwinsUK registry, and processed through a novel landmarking workflow, GESSA (Geodesic Ensemble Surface Sampling Algorithm). The algorithm places thousands of landmarks throughout the facial surface and automatically establishes point-wise correspondence across faces. These landmarks enabled us to intuitively characterize facial geometry at a fine level of detail through curvature measurements, yielding accurate heritability maps of the human face (www.heritabilitymaps.info).
Endobronchial Photoacoustic Microscopy for Staging of Lung Cancer
2016-08-01
To demonstrate the micro-scale resolution achievable with the acoustic lens, tests were carried out using human hairs as micro-scale targets (Figure 4). Three hairs were buried at different depths within a background phantom, with 4 mm distance between each hair, and the signal from the hair targets was recorded. The 30 MHz transducer has an outer diameter of 11 mm and was equipped with a corresponding lens whose aperture fits its outer diameter.
Expansion of Human Induced Pluripotent Stem Cells in Stirred Suspension Bioreactors.
Almutawaa, Walaa; Rohani, Leili; Rancourt, Derrick E
2016-01-01
Human induced pluripotent stem cells (hiPSCs) hold great promise as a cell source for therapeutic applications and regenerative medicine. Traditionally, hiPSCs are expanded in two-dimensional static culture as colonies in the presence or absence of feeder cells. However, this expansion procedure is associated with lack of reproducibility and low cell yields. To fulfill the large cell number demand for clinical use, robust large-scale production of these cells under defined conditions is needed. Herein, we describe a scalable, low-cost protocol for expanding hiPSCs as aggregates in a lab-scale bioreactor.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
Large scale tracking algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett
2015-01-01
Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
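The combinatorial explosion mentioned above can be made concrete by counting association hypotheses: the number of one-to-one pairings between tracks and detections, allowing tracks to go unassociated, grows far faster than the problem size. This is a generic illustration of the scaling, not the report's algorithm:

```python
# Sketch of why multi-hypothesis trackers face a combinatorial explosion:
# count the one-to-one track/detection association hypotheses for a single
# scan, allowing any subset of tracks to remain unassociated.
from math import comb, factorial

def association_hypotheses(n_tracks, n_detections):
    """Number of partial one-to-one matchings between tracks and detections.

    For each matching size k: choose k tracks, choose k detections, and
    pair them in k! ways; sum over all feasible k (k = 0 means nothing
    is associated this scan).
    """
    k_max = min(n_tracks, n_detections)
    return sum(comb(n_tracks, k) * comb(n_detections, k) * factorial(k)
               for k in range(k_max + 1))

for n in (2, 4, 8):
    print(n, association_hypotheses(n, n))
# 2 targets already admit 7 hypotheses; 4 admit 209, and growth is
# super-exponential from there, which is why gating and pruning are essential.
```

In practice trackers keep this tractable by gating (only considering detections near a track's predicted position) and by pruning or merging low-probability hypotheses between scans.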
Progress in long scale length laser plasma interactions
NASA Astrophysics Data System (ADS)
Glenzer, S. H.; Arnold, P.; Bardsley, G.; Berger, R. L.; Bonanno, G.; Borger, T.; Bower, D. E.; Bowers, M.; Bryant, R.; Buckman, S.; Burkhart, S. C.; Campbell, K.; Chrisp, M. P.; Cohen, B. I.; Constantin, C.; Cooper, F.; Cox, J.; Dewald, E.; Divol, L.; Dixit, S.; Duncan, J.; Eder, D.; Edwards, J.; Erbert, G.; Felker, B.; Fornes, J.; Frieders, G.; Froula, D. H.; Gardner, S. D.; Gates, C.; Gonzalez, M.; Grace, S.; Gregori, G.; Greenwood, A.; Griffith, R.; Hall, T.; Hammel, B. A.; Haynam, C.; Heestand, G.; Henesian, M.; Hermes, G.; Hinkel, D.; Holder, J.; Holdner, F.; Holtmeier, G.; Hsing, W.; Huber, S.; James, T.; Johnson, S.; Jones, O. S.; Kalantar, D.; Kamperschroer, J. H.; Kauffman, R.; Kelleher, T.; Knight, J.; Kirkwood, R. K.; Kruer, W. L.; Labiak, W.; Landen, O. L.; Langdon, A. B.; Langer, S.; Latray, D.; Lee, A.; Lee, F. D.; Lund, D.; MacGowan, B.; Marshall, S.; McBride, J.; McCarville, T.; McGrew, L.; Mackinnon, A. J.; Mahavandi, S.; Manes, K.; Marshall, C.; Menapace, J.; Mertens, E.; Meezan, N.; Miller, G.; Montelongo, S.; Moody, J. D.; Moses, E.; Munro, D.; Murray, J.; Neumann, J.; Newton, M.; Ng, E.; Niemann, C.; Nikitin, A.; Opsahl, P.; Padilla, E.; Parham, T.; Parrish, G.; Petty, C.; Polk, M.; Powell, C.; Reinbachs, I.; Rekow, V.; Rinnert, R.; Riordan, B.; Rhodes, M.; Roberts, V.; Robey, H.; Ross, G.; Sailors, S.; Saunders, R.; Schmitt, M.; Schneider, M. B.; Shiromizu, S.; Spaeth, M.; Stephens, A.; Still, B.; Suter, L. J.; Tietbohl, G.; Tobin, M.; Tuck, J.; Van Wonterghem, B. M.; Vidal, R.; Voloshin, D.; Wallace, R.; Wegner, P.; Whitman, P.; Williams, E. A.; Williams, K.; Winward, K.; Work, K.; Young, B.; Young, P. E.; Zapata, P.; Bahr, R. E.; Seka, W.; Fernandez, J.; Montgomery, D.; Rose, H.
2004-12-01
The first experiments on the National Ignition Facility (NIF) have employed the first four beams to measure propagation and laser backscattering losses in large ignition-size plasmas. Gas-filled targets between 2 and 7 mm length have been heated from one side by overlapping the focal spots of the four beams from one quad operated at 351 nm (3ω) with a total intensity of 2 × 10¹⁵ W cm⁻². The targets were filled with 1 atm of CO₂ producing up to 7 mm long homogeneously heated plasmas with densities of nₑ = 6 × 10²⁰ cm⁻³ and temperatures of Tₑ = 2 keV. The high energy in an NIF quad of beams of 16 kJ, illuminating the target from one direction, creates unique conditions for the study of laser-plasma interactions at scale lengths not previously accessible. The propagation through the large-scale plasma was measured with a gated x-ray imager that was filtered for 3.5 keV x-rays. These data indicate that the beams interact with the full length of this ignition-scale plasma during the last ~1 ns of the experiment. During that time, the full aperture measurements of the stimulated Brillouin scattering and stimulated Raman scattering show scattering into the four focusing lenses of 3% for the smallest length (~2 mm), increasing to 10-12% for ~7 mm. These results demonstrate the NIF experimental capabilities and further provide a benchmark for three-dimensional modelling of the laser-plasma interactions at ignition-size scale lengths.
Cornish, Alex J; Filippis, Ioannis; David, Alessia; Sternberg, Michael J E
2015-09-01
Each cell type found within the human body performs a diverse and unique set of functions, the disruption of which can lead to disease. However, there currently exists no systematic mapping between cell types and the diseases they can cause. In this study, we integrate protein-protein interaction data with high-quality cell-type-specific gene expression data from the FANTOM5 project to build the largest collection of cell-type-specific interactomes created to date. We develop a novel method, called gene set compactness (GSC), that contrasts the relative positions of disease-associated genes across 73 cell-type-specific interactomes to map genes associated with 196 diseases to the cell types they affect. We conduct text-mining of the PubMed database to produce an independent resource of disease-associated cell types, which we use to validate our method. The GSC method successfully identifies known disease-cell-type associations, as well as highlighting associations that warrant further study. This includes mast cells and multiple sclerosis, a cell population currently being targeted in a multiple sclerosis phase 2 clinical trial. Furthermore, we build a cell-type-based diseasome using the cell types identified as manifesting each disease, offering insight into diseases linked through etiology. The data set produced in this study represents the first large-scale mapping of diseases to the cell types in which they are manifested and will therefore be useful in the study of disease systems. Overall, we demonstrate that our approach links disease-associated genes to the phenotypes they produce, a key goal within systems medicine.
Bazak, Lily; Haviv, Ami; Barak, Michal; Jacob-Hirsch, Jasmine; Deng, Patricia; Zhang, Rui; Isaacs, Farren J; Rechavi, Gideon; Li, Jin Billy; Eisenberg, Eli; Levanon, Erez Y
2014-03-01
RNA molecules transmit the information encoded in the genome and generally reflect its content. Adenosine-to-inosine (A-to-I) RNA editing by ADAR proteins converts a genomically encoded adenosine into inosine. It is known that most RNA editing in human takes place in the primate-specific Alu sequences, but the extent of this phenomenon and its effect on transcriptome diversity are not yet clear. Here, we analyzed large-scale RNA-seq data and detected ∼1.6 million editing sites. As detection sensitivity increases with sequencing coverage, we performed ultradeep sequencing of selected Alu sequences and showed that the scope of editing is much larger than anticipated. We found that virtually all adenosines within Alu repeats that form double-stranded RNA undergo A-to-I editing, although most sites exhibit editing at only low levels (<1%). Moreover, using high coverage sequencing, we observed editing of transcripts resulting from residual antisense expression, doubling the number of edited sites in the human genome. Based on bioinformatic analyses and deep targeted sequencing, we estimate that there are over 100 million human Alu RNA editing sites, located in the majority of human genes. These findings set the stage for exploring how this primate-specific massive diversification of the transcriptome is utilized.
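Because inosine is read as guanosine by sequencers, the editing level at a genomically encoded adenosine reduces to the G fraction among A+G reads in an RNA-seq pileup. A minimal sketch of that calculation follows; the base counts are invented, not taken from the study:

```python
# Hedged sketch: per-site A-to-I editing level from RNA-seq pileup counts.
# Inosine base-pairs like guanosine, so edited reads report G at a genomic A.
# Counts below are hypothetical.

def editing_level(base_counts):
    """Editing level at one genomic A position: G / (A + G) read counts.

    Returns 0.0 for uncovered positions. Real pipelines additionally filter
    for strand, mapping quality, and known SNPs before this step.
    """
    a = base_counts.get("A", 0)
    g = base_counts.get("G", 0)
    total = a + g
    return g / total if total else 0.0

site = {"A": 995, "G": 5, "C": 0, "T": 0}
print(editing_level(site))  # -> 0.005, a low-level edited site (<1%)
```

Detecting such sub-1% levels, which the abstract reports as the norm within Alu repeats, is precisely why ultradeep coverage was needed: at 50× coverage a 0.5% editing level is usually invisible.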
The effect of saccade metrics on the corollary discharge contribution to perceived eye location
Bansal, Sonia; Jayet Bray, Laurence C.; Peterson, Matthew S.
2015-01-01
Corollary discharge (CD) is hypothesized to provide the movement information (direction and amplitude) required to compensate for the saccade-induced disruptions to visual input. Here, we investigated to what extent these conveyed metrics influence perceptual stability in human subjects with a target-displacement detection task. Subjects made saccades to targets located at different amplitudes (4°, 6°, or 8°) and directions (horizontal or vertical). During the saccade, the target disappeared and then reappeared at a shifted location either in the same direction or opposite to the movement vector. Subjects reported the target displacement direction, and from these reports we determined the perceptual threshold for shift detection and estimate of target location. Our results indicate that the thresholds for all amplitudes and directions generally scaled with saccade amplitude. Additionally, subjects on average produced hypometric saccades with an estimated CD gain <1. Finally, we examined the contribution of different error signals to perceptual performance, the saccade error (movement-to-movement variability in saccade amplitude) and visual error (distance between the fovea and the shifted target location). Perceptual judgment was not influenced by the fluctuations in movement amplitude, and performance was largely the same across movement directions for different magnitudes of visual error. Importantly, subjects reported the correct direction of target displacement above chance level for very small visual errors (<0.75°), even when these errors were opposite the target-shift direction. Collectively, these results suggest that the CD-based compensatory mechanisms for visual disruptions are highly accurate and comparable for saccades with different metrics. PMID:25761955
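A perceptual threshold of the kind reported above is typically read off a psychometric function: proportion of correctly reported displacement directions as a function of shift size. A minimal sketch, using linear interpolation and entirely hypothetical data (the study's actual values and fitting procedure are not reproduced here):

```python
# Illustrative sketch: extract a displacement-detection threshold from
# proportion-correct data at several shift sizes. Data are hypothetical;
# real analyses usually fit a cumulative Gaussian rather than interpolating.

def threshold_at(p_target, shifts, p_correct):
    """Shift size (deg) where performance crosses p_target, by linear
    interpolation between adjacent points; assumes monotonic data."""
    points = list(zip(shifts, p_correct))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if y0 <= p_target <= y1:
            return x0 + (p_target - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("target proportion not bracketed by the data")

shifts    = [0.25, 0.5, 0.75, 1.0, 1.5]    # target shift, deg visual angle
p_correct = [0.55, 0.62, 0.74, 0.85, 0.97]  # proportion correct per shift

print(round(threshold_at(0.75, shifts, p_correct), 3))  # -> 0.773
```

Comparing such thresholds across saccade amplitudes and directions is what supports the abstract's claim that detection thresholds scale with saccade amplitude.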
Aghebati Maleki, Leili; Majidi, Jafar; Baradaran, Behzad; Abdolalizadeh, Jalal; Kazemi, Tohid; Aghebati Maleki, Ali; Sineh sepehr, Koushan
2013-01-01
Purpose: Monoclonal antibodies are now an essential tool of biomedical research and are of great commercial and medical value. The purpose of this study was to produce a monoclonal antibody against CD34 at large scale for diagnostic application in leukemia and for purification of human hematopoietic stem/progenitor cells. Methods: For large-scale production, hybridoma cells producing monoclonal antibody against human CD34 were injected into the peritoneum of Balb/c mice that had previously been primed with 0.5 ml Pristane. A total of 5 ml of ascitic fluid was harvested from each mouse over two collections. The mAb titer was assessed by ELISA. The ascitic fluid was examined for antibody class and subclass with an ELISA mouse mAb isotyping kit. The mAb was purified from ascitic fluid by affinity chromatography on Protein A-Sepharose, its purity was monitored by SDS-PAGE, and the purified antibody was conjugated with FITC. Results: Monoclonal antibodies with high specificity and sensitivity against human CD34 were prepared by hybridoma technology. The antibody subclass was IgG1 with a kappa light chain. Conclusion: The conjugated monoclonal antibody could be a useful tool for the isolation, purification and characterization of human hematopoietic stem cells. PMID:24312838
Pan, Xin; Qi, Jian-cheng; Long, Ming; Liang, Hao; Chen, Xiao; Li, Han; Li, Guang-bo; Zheng, Hao
2010-01-01
The close phylogenetic relationship between humans and non-human primates makes non-human primates an irreplaceable model for the study of human infectious diseases. In this study, we describe the development of a large-scale automatic multi-functional isolation chamber for use with medium-sized laboratory animals carrying infectious diseases. The isolation chamber, including the transfer chain, disinfection chain, negative air pressure isolation system, animal welfare system, and the automated system, is designed to meet all biological safety standards. To create an internal chamber environment that is completely isolated from the exterior, variable frequency drive blowers are used in the air-intake and air-exhaust system, precisely controlling the filtered air flow and providing an air-barrier protection. A double door transfer port is used to transfer material between the interior of the isolation chamber and the outside. A peracetic acid sterilizer and its associated pipeline allow for complete disinfection of the isolation chamber. All of the isolation chamber parameters can be automatically controlled by a programmable computerized menu, allowing for work with different animals in different-sized cages depending on the research project. The large-scale multi-functional isolation chamber provides a useful and safe system for working with infected medium-sized laboratory animals in high-level bio-safety laboratories. PMID:20872984
Forier, Cynthia; Boschetti, Egisto; Ouhammouch, Mohamed; Cibiel, Agnès; Ducongé, Frédéric; Nogré, Michel; Tellier, Michel; Bataille, Damien; Bihoreau, Nicolas; Santambien, Patrick; Chtourou, Sami; Perret, Gérald
2017-03-17
Nucleic acid aptamers are promising ligands for analytical and preparative-scale affinity chromatography applications. However, full industrial exploitation requires that aptamer-grafted chromatography media meet a number of high technical standards that have remained largely untested. Ideally, they should exhibit relatively high binding capacity associated with a very high degree of specificity. In addition, they must be highly resistant to harsh cleaning/sanitization conditions, as well as to prolonged and repeated exposure to biological environments. Here, we present practical examples of aptamer affinity chromatography for the purification of three human therapeutic proteins from various sources: Factor VII, Factor H and Factor IX. In a single chromatographic step, three DNA aptamer ligands enabled the efficient purification of their target protein, with an unprecedented degree of selectivity (from 0.5% to 98% purity in one step). Furthermore, these aptamers demonstrated high stability under harsh sanitization conditions (100 h soaking in 1 M NaOH). These results pave the way toward a wider adoption of aptamer-based affinity ligands in the industrial-scale purification not only of plasma-derived proteins but of any other protein in general. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Giardiello, Marco; Liptrott, Neill J.; McDonald, Tom O.; Moss, Darren; Siccardi, Marco; Martin, Phil; Smith, Darren; Gurjar, Rohan; Rannard, Steve P.; Owen, Andrew
2016-01-01
Considerable scope exists to vary the physical and chemical properties of nanoparticles, with subsequent impact on biological interactions; however, no accelerated process to access large nanoparticle material space is currently available, hampering the development of new nanomedicines. In particular, no clinically available nanotherapies exist for HIV populations and conventional paediatric HIV medicines are poorly available; one current paediatric formulation utilizes high ethanol concentrations to solubilize lopinavir, a poorly soluble antiretroviral. Here we apply accelerated nanomedicine discovery to generate a potential aqueous paediatric HIV nanotherapy, with clinical translation and regulatory approval for human evaluation. Our rapid small-scale screening approach yields large libraries of solid drug nanoparticles (160 individual components) targeting oral dose. Screening uses 1 mg of drug compound per library member and iterative pharmacological and chemical evaluation establishes potential candidates for progression through to clinical manufacture. The wide applicability of our strategy has implications for multiple therapy development programmes. PMID:27767027
Functional Genomic Landscape of Human Breast Cancer Drivers, Vulnerabilities, and Resistance.
Marcotte, Richard; Sayad, Azin; Brown, Kevin R; Sanchez-Garcia, Felix; Reimand, Jüri; Haider, Maliha; Virtanen, Carl; Bradner, James E; Bader, Gary D; Mills, Gordon B; Pe'er, Dana; Moffat, Jason; Neel, Benjamin G
2016-01-14
Large-scale genomic studies have identified multiple somatic aberrations in breast cancer, including copy number alterations and point mutations. Still, identifying causal variants and emergent vulnerabilities that arise as a consequence of genetic alterations remain major challenges. We performed whole-genome small hairpin RNA (shRNA) "dropout screens" on 77 breast cancer cell lines. Using a hierarchical linear regression algorithm to score our screen results and integrate them with accompanying detailed genetic and proteomic information, we identify vulnerabilities in breast cancer, including candidate "drivers," and reveal general functional genomic properties of cancer cells. Comparisons of gene essentiality with drug sensitivity data suggest potential resistance mechanisms, effects of existing anti-cancer drugs, and opportunities for combination therapy. Finally, we demonstrate the utility of this large dataset by identifying BRD4 as a potential target in luminal breast cancer and PIK3CA mutations as a resistance determinant for BET-inhibitors. Copyright © 2016 Elsevier Inc. All rights reserved.
Highly Efficient Large-Scale Lentiviral Vector Concentration by Tandem Tangential Flow Filtration
Cooper, Aaron R.; Patel, Sanjeet; Senadheera, Shantha; Plath, Kathrin; Kohn, Donald B.; Hollis, Roger P.
2014-01-01
Large-scale lentiviral vector (LV) concentration can be inefficient and time consuming, often involving multiple rounds of filtration and centrifugation. This report describes a simpler method using two tangential flow filtration (TFF) steps to concentrate liter-scale volumes of LV supernatant, achieving in excess of 2000-fold concentration in less than 3 hours with very high recovery (>97%). Large volumes of LV supernatant can be produced easily through the use of multi-layer flasks, each having 1720 cm2 surface area and producing ~560 mL of supernatant per flask. Combining the use of such flasks and TFF greatly simplifies large-scale production of LV. As a demonstration, the method is used to produce a very high titer LV (>1010 TU/mL) and transduce primary human CD34+ hematopoietic stem/progenitor cells at high final vector concentrations with no overt toxicity. A complex LV (STEMCCA) for induced pluripotent stem cell generation is also concentrated from low initial titer and used to transduce and reprogram primary human fibroblasts with no overt toxicity. Additionally, a generalized and simple multiplexed real-time PCR assay is described for lentiviral vector titer and copy number determination. PMID:21784103
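The fold-concentration and recovery figures quoted above reduce to simple volume and titer arithmetic. A minimal sketch with hypothetical numbers (the function names and the titers/volumes below are illustrative, not taken from the report):

```python
# Sketch: fold-concentration and recovery for a vector concentration step.
# The paper reports >2000-fold concentration with >97% recovery; the
# numbers here are made up to show the arithmetic.

def fold_concentration(v_start_ml: float, v_end_ml: float) -> float:
    """Volume reduction achieved by the concentration step."""
    return v_start_ml / v_end_ml

def percent_recovery(titer_start, v_start_ml, titer_end, v_end_ml) -> float:
    """Recovered transducing units as a percentage of input units."""
    return 100.0 * (titer_end * v_end_ml) / (titer_start * v_start_ml)

v0, v1 = 2000.0, 1.0              # mL before / after two TFF steps
t0, t1 = 5.0e6, 9.8e9             # TU/mL before / after
print(fold_concentration(v0, v1))                  # 2000.0
print(round(percent_recovery(t0, v0, t1, v1), 1))  # 98.0
```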
Vulnerability Analyst’s Guide to Geometric Target Description
1992-09-01
NASA Astrophysics Data System (ADS)
Flippo, Kirk; Hegelich, B. Manuel; Cort Gautier, D.; Johnson, J. Randy; Kline, John L.; Shimada, Tsutomu; Fernández, Juan C.; Gaillard, Sandrine; Rassuchine, Jennifer; Le Galloudec, Nathalie; Cowan, Thomas E.; Malekos, Steve; Korgan, Grant
2006-10-01
Ion-driven Fast Ignition (IFI) has certain advantages over electron-driven FI due to a possible large reduction in the amount of energy required. Recent experiments at the Los Alamos National Laboratory's Trident facility have yielded ion energies and efficiencies many times in excess of recent published scaling laws, leading to even more potential advantages of IFI. Proton energies in excess of 35 MeV have been observed from targets produced by the University of Nevada, Reno - dubbed ``Pizza-top Cone'' targets - at intensities of only 1x10^19 W/cm^2 with 20 joules in 600 fs. Energies in excess of 24 MeV were observed from simple flat foil targets as well. The observed energies, above any published scaling laws, are attributed to target production, preparation, and shot to shot monitoring of many laser parameters, especially the laser ASE prepulse level and laser pulse duration. The laser parameters are monitored in real-time to keep the laser in optimal condition throughout the run providing high quality, reproducible shots.
Concurrent multiscale imaging with magnetic resonance imaging and optical coherence tomography
NASA Astrophysics Data System (ADS)
Liang, Chia-Pin; Yang, Bo; Kim, Il Kyoon; Makris, George; Desai, Jaydev P.; Gullapalli, Rao P.; Chen, Yu
2013-04-01
We develop a novel platform based on a tele-operated robot to perform high-resolution optical coherence tomography (OCT) imaging under continuous large field-of-view magnetic resonance imaging (MRI) guidance. Intra-operative MRI (iMRI) is a promising guidance tool for high-precision surgery, but it may not have sufficient resolution or contrast to visualize certain small targets. To address these limitations, we develop an MRI-compatible OCT needle probe, which is capable of providing microscale tissue architecture in conjunction with macroscale MRI tissue morphology in real time. Coregistered MRI/OCT images on ex vivo chicken breast and human brain tissues demonstrate that the complementary imaging scales and contrast mechanisms have great potential to improve the efficiency and the accuracy of iMRI procedure.
Comparative Gene Expression Profiles Induced by PPARγ and PPARα/γ Agonists in Human Hepatocytes
Rogue, Alexandra; Lambert, Carine; Jossé, Rozenn; Antherieu, Sebastien; Spire, Catherine; Claude, Nancy; Guillouzo, André
2011-01-01
Background Several glitazones (PPARγ agonists) and glitazars (dual PPARα/γ agonists) have been developed to treat hyperglycemia and, simultaneously, hyperglycemia and dyslipidemia, respectively. However, most have caused idiosyncratic hepatic or extrahepatic toxicities through mechanisms that remain largely unknown. Since the liver plays a key role in lipid metabolism, we analyzed changes in gene expression profiles induced by these two types of PPAR agonists in human hepatocytes. Methodology/Principal Findings Primary human hepatocytes and the well-differentiated human hepatoma HepaRG cells were exposed to different concentrations of two PPARγ (troglitazone and rosiglitazone) and two PPARα/γ (muraglitazar and tesaglitazar) agonists for 24 h and their transcriptomes were analyzed using human pangenomic Agilent microarrays. Principal Component Analysis, hierarchical clustering and Ingenuity Pathway Analysis® revealed large inter-individual variability in the response of the human hepatocyte populations to the different compounds. Many genes involved in lipid, carbohydrate, xenobiotic and cholesterol metabolism, as well as inflammation and immunity, were regulated by both PPARγ and PPARα/γ agonists in at least some of the human hepatocyte populations and/or HepaRG cells. Only a few genes were selectively deregulated by glitazars when compared to glitazones, indicating that PPARγ and PPARα/γ agonists share most of their target genes. Moreover, some target genes thought to be regulated only in mouse or to be expressed in Kupffer cells were also found to be responsive in human hepatocytes and HepaRG cells.
Conclusions/Significance This first comprehensive analysis of gene regulation by PPARγ and PPARα/γ agonists favors the conclusion that glitazones and glitazars share most of their target genes and induce large differential changes in gene profiles in human hepatocytes depending on the hepatocyte donor, the compound class and/or the individual compound, thereby supporting the occurrence of idiosyncratic toxicity in some patients. PMID:21533120
Water limited agriculture in Africa: Climate change sensitivity of large scale land investments
NASA Astrophysics Data System (ADS)
Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.
2015-12-01
The past few decades have seen unprecedented changes in the global agricultural system with a dramatic increase in the rates of food production fueled by an escalating demand for food calories, as a result of demographic growth, dietary changes, and - more recently - new bioenergy policies. Food prices have become consistently higher and increasingly volatile with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, to date, roughly 38 million hectares have been acquired worldwide by large-scale investors, 16 million of which are in Africa. More than 85% of large-scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but for the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large-scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate irrigation water requirements by various crops planted in the acquired land as an indicator of the pressure likely placed by land investors on ("blue") water resources of target regions in Africa and evaluate the sensitivity to climate change scenarios.
Romero-Durán, Francisco J; Alonso, Nerea; Yañez, Matilde; Caamaño, Olga; García-Mera, Xerardo; González-Díaz, Humberto
2016-04-01
The use of Cheminformatics tools is gaining importance in the field of translational research from Medicinal Chemistry to Neuropharmacology. In particular, we need it for the analysis of chemical information on large datasets of bioactive compounds. These compounds form large multi-target complex networks (drug-target interactome network) resulting in a very challenging data analysis problem. Artificial Neural Network (ANN) algorithms may help us predict the interactions of drugs and targets in the CNS interactome. In this work, we trained different ANN models able to predict a large number of drug-target interactions. These models predict a dataset of thousands of interactions of central nervous system (CNS) drugs characterized by >30 different experimental measures in >400 different experimental protocols for >150 molecular and cellular targets present in 11 different organisms (including human). The model was able to classify cases of non-interacting vs. interacting drug-target pairs with satisfactory performance. A second aim focused on two main directions: the synthesis and assay of new derivatives of TVP1022 (S-analogues of rasagiline) and the comparison with other rasagiline derivatives recently reported. Finally, we used the best of our models to predict drug-target interactions for the best newly synthesized compound against a large number of CNS protein targets. Copyright © 2015 Elsevier Ltd. All rights reserved.
Knowledge-Based Methods To Train and Optimize Virtual Screening Ensembles
2016-01-01
Ensemble docking can be a successful virtual screening technique that addresses the innate conformational heterogeneity of macromolecular drug targets. Yet, lacking a method to identify a subset of conformational states that effectively segregates active and inactive small molecules, ensemble docking may result in the recommendation of a large number of false positives. Here, three knowledge-based methods that construct structural ensembles for virtual screening are presented. Each method selects ensembles by optimizing an objective function calculated using the receiver operating characteristic (ROC) curve: either the area under the ROC curve (AUC) or a ROC enrichment factor (EF). As the number of receptor conformations, N, becomes large, the methods differ in their asymptotic scaling. Given a set of small molecules with known activities and a collection of target conformations, the most resource intense method is guaranteed to find the optimal ensemble but scales as O(2N). A recursive approximation to the optimal solution scales as O(N2), and a more severe approximation leads to a faster method that scales linearly, O(N). The techniques are generally applicable to any system, and we demonstrate their effectiveness on the androgen nuclear hormone receptor (AR), cyclin-dependent kinase 2 (CDK2), and the peroxisome proliferator-activated receptor δ (PPAR-δ) drug targets. Conformations that consisted of a crystal structure and molecular dynamics simulation cluster centroids were used to form AR and CDK2 ensembles. Multiple available crystal structures were used to form PPAR-δ ensembles. For each target, we show that the three methods perform similarly to one another on both the training and test sets. PMID:27097522
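The O(N^2) recursive approximation described above can be sketched as a greedy loop: score each molecule by its best (minimum) docking score over the current ensemble, and at each step add the conformation that most improves the training AUC, stopping when no addition helps. The code below is an illustrative reconstruction under that reading, with a hypothetical toy score matrix; it is not the authors' implementation.

```python
def auc(scores, labels):
    """Area under the ROC curve; lower docking score = predicted active."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p < n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def greedy_ensemble(score_matrix, labels, k):
    """Greedy O(N^2)-style selection: a molecule's ensemble score is its
    minimum docking score over the chosen conformations; stop when adding
    another conformation no longer improves the training AUC."""
    chosen, best = [], None
    for _ in range(k):
        step = None
        for c in range(len(score_matrix)):
            if c in chosen:
                continue
            ens = [min(score_matrix[j][m] for j in chosen + [c])
                   for m in range(len(labels))]
            a_c = auc(ens, labels)
            if step is None or a_c > step[0]:
                step = (a_c, c)
        if step is None or (best is not None and step[0] <= best):
            break
        best = step[0]
        chosen.append(step[1])
    return chosen, best

labels = [1, 1, 1, 0, 0, 0]
# Hypothetical docking scores: rows = conformations, cols = molecules.
matrix = [
    [-9.0, -8.5, -5.0, -5.5, -5.2, -5.1],  # separates two of the actives
    [-5.0, -5.1, -9.2, -5.3, -5.0, -5.2],  # rescues the third active
    [-6.0, -6.1, -6.2, -6.3, -6.4, -6.5],  # uninformative conformation
]
sel, a = greedy_ensemble(matrix, labels, k=2)
print(sel, a)  # [0, 1] 1.0
```

Note how neither conformation alone perfectly separates actives from inactives, but the min-score ensemble of the first two does, which is the rationale for ensemble docking.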
Structural Plasticity and Conformational Transitions of HIV Envelope Glycoprotein gp120
Korkut, Anil; Hendrickson, Wayne A.
2012-01-01
HIV envelope glycoproteins undergo large-scale conformational changes as they interact with cellular receptors to cause the fusion of viral and cellular membranes that permits viral entry to infect targeted cells. Conformational dynamics in HIV gp120 are also important in masking conserved receptor epitopes from being detected for effective neutralization by the human immune system. Crystal structures of HIV gp120 and its complexes with receptors and antibody fragments provide high-resolution pictures of selected conformational states accessible to gp120. Here we describe systematic computational analyses of HIV gp120 plasticity in such complexes with CD4 binding fragments, CD4 mimetic proteins, and various antibody fragments. We used three computational approaches: an isotropic elastic network analysis of conformational plasticity, a full atomic normal mode analysis, and simulation of conformational transitions with our coarse-grained virtual atom molecular mechanics (VAMM) potential function. We observe collective sub-domain motions about hinge points that coordinate those motions, correlated local fluctuations at the interfacial cavity formed when gp120 binds to CD4, and concerted changes in structural elements that form at the CD4 interface during large-scale conformational transitions to the CD4-bound state from the deformed states of gp120 in certain antibody complexes. PMID:23300605
Meisner, Joshua K.; Price, Richard J.
2010-01-01
Arterial occlusive disease (AOD) is the leading cause of morbidity and mortality throughout the developed world, which creates a significant need for effective therapies to halt disease progression. Despite success of animal and small-scale human therapeutic arteriogenesis studies, this promising concept for treating AOD has yielded largely disappointing results in large-scale clinical trials. One reason for this lack of successful translation is that endogenous arteriogenesis is highly dependent on a poorly understood sequence of events and interactions between bone marrow derived cells (BMCs) and vascular cells, which makes designing effective therapies difficult. We contend that the process follows a complex, ordered sequence of events with multiple, specific BMC populations recruited at specific times and locations. Here we present the evidence suggesting roles for multiple BMC populations from neutrophils and mast cells to progenitor cells and propose how and where these cell populations fit within the sequence of events during arteriogenesis. Disruptions in these various BMC populations can impair the arteriogenesis process in patterns that characterize specific patient populations. We propose that an improved understanding of how arteriogenesis functions as a system can reveal individual BMC populations and functions that can be targeted for overcoming particular impairments in collateral vessel development. PMID:21044213
You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen
2014-01-01
Protein-protein interactions are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, the high-throughput experimental methods for identifying PPIs are both time-consuming and expensive. On the other hand, high-throughput PPI data are often associated with high false-positive and high false-negative rates. To address these problems, we propose a method for PPI detection by integrating biosensor-based PPI data with a novel computational model. This method was developed based on the algorithm of extreme learning machine combined with a novel representation of protein sequence descriptor. When performed on the large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at the specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine. The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.
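As a rough illustration of the learning machinery named above, the sketch below implements a minimal extreme learning machine: a fixed random hidden layer whose output weights are solved in closed form with a pseudo-inverse. The synthetic data stand in for the paper's protein-sequence descriptors; nothing here reproduces the actual features, dataset, or reported accuracies.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme learning machine: random fixed hidden layer, closed-form
    least-squares output weights via the Moore-Penrose pseudo-inverse."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random biases
    H = np.tanh(X @ W + b)                       # hidden activations
    beta = np.linalg.pinv(H) @ y                 # output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return (np.tanh(X @ W + b) @ beta > 0.5).astype(int)

# Synthetic, well-separated classes standing in for interacting vs.
# non-interacting protein-pair descriptors.
X = np.vstack([rng.normal(loc=-1.0, size=(100, 5)),
               rng.normal(loc=+1.0, size=(100, 5))])
y = np.array([0] * 100 + [1] * 100)
model = elm_fit(X, y)
acc = (elm_predict(model, X) == y).mean()
print(acc > 0.9)  # high training accuracy on separable synthetic data
```

The appeal of the ELM is exactly what the code shows: training is a single linear solve, with no iterative backpropagation.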
Ribeiro, Haroldo V; Hanley, Quentin S; Lewis, Dan
2018-01-01
Scale-adjusted metrics (SAMs) are a significant achievement of the urban scaling hypothesis. SAMs remove the inherent biases of per capita measures computed in the absence of isometric allometries. However, this approach is limited to urban areas, while a large portion of the world's population still lives outside cities and rural areas dominate land use worldwide. Here, we extend the concept of SAMs to population density scale-adjusted metrics (DSAMs) to reveal relationships among different types of crime and property metrics. Our approach allows all human environments to be considered, avoids problems in the definition of urban areas, and accounts for the heterogeneity of population distributions within urban regions. By combining DSAMs, cross-correlation, and complex network analysis, we find that crime and property types have intricate and hierarchically organized relationships leading to some striking conclusions. Drugs and burglary had uncorrelated DSAMs and, to the extent property transaction values are indicators of affluence, twelve out of fourteen crime metrics showed no evidence of specifically targeting affluence. Burglary and robbery were the most connected in our network analysis and the modular structures suggest an alternative to "zero-tolerance" policies by unveiling the crime and/or property types most likely to affect each other. PMID:29470499
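The scale-adjusted-metric idea extends naturally from population to population density: fit the allometric power law on log-log axes and treat the residuals as the adjusted metric. A synthetic-data sketch of that computation (an assumed reading of the method, not the authors' code):

```python
import numpy as np

# Sketch of a density scale-adjusted metric (DSAM): fit Y ~ A * density^beta
# on log-log axes and take the residuals as the scale-adjusted metric.
# All data below are synthetic.

rng = np.random.default_rng(1)
density = 10 ** rng.uniform(1, 4, size=200)       # people per km^2
beta_true = 1.2                                   # superlinear scaling
metric = 3.0 * density ** beta_true * np.exp(rng.normal(0, 0.3, 200))

logd, logm = np.log(density), np.log(metric)
beta, log_a = np.polyfit(logd, logm, 1)           # log-linear fit
dsam = logm - (log_a + beta * logd)               # residuals = DSAMs

print(abs(beta - beta_true) < 0.1)  # recovered exponent near 1.2
print(abs(dsam.mean()) < 1e-6)      # OLS residuals are centered
```

A positive residual marks a place with more of the metric than its population density predicts, which is the bias that raw per capita measures miss.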
NASA Astrophysics Data System (ADS)
Zhou, Ying; Wang, Youhua; Liu, Runfeng; Xiao, Lin; Zhang, Qin; Huang, YongAn
2018-01-01
Epidermal electronics (e-skin), which have emerged in recent years, offer the opportunity to extract biosignals from the human body noninvasively and wearably. Conventional e-skin fabrication, based on standard microelectronic processes and a variety of transfer printing methods, nevertheless constrains the size of the devices, posing a serious challenge to collecting signals via the skin, the largest organ of the human body. Herein we propose a multichannel noninvasive human-machine interface (HMI) using stretchable surface electromyography (sEMG) patches to realize a robot hand mimicking human gestures. Time-efficient processes are first developed to manufacture µm thick large-scale stretchable devices. With micron thickness, the stretchable µm thick sEMG patches show excellent conformability with human skin and consequently electrical performance comparable to conventional gel electrodes. Combined with the large-scale size, the multichannel noninvasive HMI via stretchable µm thick sEMG patches successfully manipulates the robot hand with eight different gestures, with precision as high as that of a conventional gel electrode array.
Targeted enrichment strategies for next-generation plant biology
Richard Cronn; Brian J. Knaus; Aaron Liston; Peter J. Maughan; Matthew Parks; John V. Syring; Joshua Udall
2012-01-01
The dramatic advances offered by modern DNA sequencers continue to redefine the limits of what can be accomplished in comparative plant biology. Even with recent achievements, however, plant genomes present obstacles that can make it difficult to execute large-scale population and phylogenetic studies on next-generation sequencing platforms. Factors like large genome...
The HIV Nef protein modulates cellular and exosomal miRNA profiles in human monocytic cells.
Aqil, Madeeha; Naqvi, Afsar Raza; Mallik, Saurav; Bandyopadhyay, Sanghamitra; Maulik, Ujjwal; Jameel, Shahid
2014-01-01
The HIV Nef protein is a multifunctional virulence factor that perturbs intracellular membranes and signalling and is secreted into exosomes. While Nef-containing exosomes have a distinct proteomic profile, no comprehensive analysis of their miRNA cargo has been carried out. Since Nef functions as a viral suppressor of RNA interference and disturbs the distribution of RNA-induced silencing complex proteins between cells and exosomes, we hypothesized that it might also affect the export of miRNAs into exosomes. Exosomes were purified from human monocytic U937 cells that stably expressed HIV-1 Nef. The RNA from cells and exosomes was profiled for 667 miRNAs using a TaqMan Low Density Array. Selected miRNAs and their mRNA targets were validated by quantitative RT-PCR. Bioinformatics analyses were used to identify targets and predict pathways. Nef expression affected a significant fraction of miRNAs in U937 cells. Our analysis showed 47 miRNAs to be selectively secreted into Nef exosomes and 2 miRNAs to be selectively retained in Nef-expressing cells. The exosomal miRNAs were predicted to target several cellular genes in inflammatory cytokine and other pathways important for HIV pathogenesis, and an overwhelming majority had targets within the HIV genome. This is the first study to report miRNome analysis of HIV Nef-expressing monocytes and exosomes. Our results demonstrate that Nef causes large-scale dysregulation of cellular miRNAs, including their secretion through exosomes. We suggest this to be a novel viral strategy to affect pathogenesis and to limit the effects of RNA interference on viral replication and persistence.
Robinson, Hugh S.; Abarca, Maria; Zeller, Katherine A.; Velasquez, Grisel; Paemelaere, Evi A. D.; Goldberg, Joshua F.; Payan, Esteban; Hoogesteijn, Rafael; Boede, Ernesto O.; Schmidt, Krzysztof; Lampo, Margarita; Viloria, Ángel L.; Carreño, Rafael; Robinson, Nathaniel; Lukacs, Paul M.; Nowak, J. Joshua; Salom-Pérez, Roberto; Castañeda, Franklin; Boron, Valeria; Quigley, Howard
2018-01-01
Broad scale population estimates of declining species are desired for conservation efforts. However, for many secretive species including large carnivores, such estimates are often difficult. Based on published density estimates obtained through camera trapping, presence/absence data, and globally available predictive variables derived from satellite imagery, we modelled density and occurrence of a large carnivore, the jaguar, across the species’ entire range. We then combined these models in a hierarchical framework to estimate the total population. Our models indicate that potential jaguar density is best predicted by measures of primary productivity, with the highest densities in the most productive tropical habitats and a clear declining gradient with distance from the equator. Jaguar distribution, in contrast, is determined by the combined effects of human impacts and environmental factors: probability of jaguar occurrence increased with forest cover, mean temperature, and annual precipitation and declined with increases in human foot print index and human density. Probability of occurrence was also significantly higher for protected areas than outside of them. We estimated the world’s jaguar population at 173,000 (95% CI: 138,000–208,000) individuals, mostly concentrated in the Amazon Basin; elsewhere, populations tend to be small and fragmented. The high number of jaguars results from the large total area still occupied (almost 9 million km2) and low human densities (< 1 person/km2) coinciding with high primary productivity in the core area of jaguar range. Our results show the importance of protected areas for jaguar persistence. We conclude that combining modelling of density and distribution can reveal ecological patterns and processes at global scales, can provide robust estimates for use in species assessments, and can guide broad-scale conservation actions. PMID:29579129
Patterns of resting state connectivity in human primary visual cortical areas: a 7T fMRI study.
Raemaekers, Mathijs; Schellekens, Wouter; van Wezel, Richard J A; Petridou, Natalia; Kristo, Gert; Ramsey, Nick F
2014-01-01
The nature and origin of fMRI resting state fluctuations and connectivity are still not fully known. More detailed knowledge of the relationship between resting state patterns and brain function may help to elucidate this matter. We therefore performed an in-depth study of how resting state fluctuations map onto the well-known architecture of the visual system. We investigated resting state connectivity at both a fine and a large scale, within and across visual areas V1, V2 and V3, in ten human subjects using a 7 Tesla scanner. We found evidence for several coexisting and overlapping connectivity structures at different spatial scales. At the fine-scale level we found enhanced connectivity between the same topographic locations in the field maps of V1, V2 and V3, enhanced connectivity to the contralateral functional homologue, and to a lesser extent enhanced connectivity between iso-eccentric locations within the same visual area. However, by far the largest proportion of the resting state fluctuations occurred within large-scale bilateral networks. These large-scale networks mapped to some extent onto the architecture of the visual system and could thereby obscure fine-scale connectivity. In fact, most of the fine-scale connectivity only became apparent after the large-scale network fluctuations were filtered from the time series. We conclude that fMRI resting state fluctuations in the visual cortex may in fact be a composite signal of different overlapping sources. Isolating the different sources could enhance correlations between BOLD and electrophysiological correlates of resting state activity. © 2013 Elsevier Inc. All rights reserved.
Timescales of Massive Human Entrainment
Fusaroli, Riccardo; Perlman, Marcus; Mislove, Alan; Paxton, Alexandra; Matlock, Teenie; Dale, Rick
2015-01-01
The past two decades have seen an upsurge of interest in the collective behaviors of complex systems composed of many agents entrained to each other and to external events. In this paper, we extend the concept of entrainment to the dynamics of human collective attention. We conducted a detailed investigation of the unfolding of human entrainment—as expressed by the content and patterns of hundreds of thousands of messages on Twitter—during the 2012 US presidential debates. By time-locking these data sources, we quantify the impact of the unfolding debate on human attention at three time scales. We show that collective social behavior covaries second-by-second with the interactional dynamics of the debates: a candidate speaking induces rapid increases in mentions of his name on social media and decreases in mentions of the other candidate. Moreover, interruptions by an interlocutor increase the attention received. We also highlight a distinct time scale for the impact of salient content during the debates: across well-known remarks in each debate, mentions in social media begin within 5–10 seconds of the remark, peak at approximately one minute, and slowly decay in a consistent fashion. Finally, we show that public attention after an initial burst slowly decays through the course of the debates. Thus we demonstrate that large-scale human entrainment may hold across a number of distinct scales, in an exquisitely time-locked fashion. The methods and results pave the way for careful study of the dynamics and mechanisms of large-scale human entrainment. PMID:25880357
Using Relational Reasoning to Learn about Scientific Phenomena at Unfamiliar Scales
ERIC Educational Resources Information Center
Resnick, Ilyse; Davatzes, Alexandra; Newcombe, Nora S.; Shipley, Thomas F.
2016-01-01
Many scientific theories and discoveries involve reasoning about extreme scales, removed from human experience, such as time in geology and size in nanoscience. Thus, understanding scale is central to science, technology, engineering, and mathematics. Unfortunately, novices have trouble understanding and comparing sizes of unfamiliar large and small…
NASA Astrophysics Data System (ADS)
Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan
2018-03-01
Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea-background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the large-scale oceansat remote sensing image is divided into small sub-blocks, and the 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and a clearly distinct characteristic between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on 2-D STFT spectrum modeling is proposed. Experimental results show that the proposed algorithm detects ship targets with a high recall rate and a low miss rate.
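The block-wise 2-D STFT step can be sketched as follows. This is a toy illustration, not the authors' implementation: the sub-block size, the Hanning window, and the crude peak-magnitude statistic used to flag a candidate target block are all assumptions made for the example, with synthetic Gaussian clutter standing in for sea background:

```python
import numpy as np

def blockwise_2d_stft(image, block=32):
    """Split an image into non-overlapping sub-blocks and return the
    windowed 2-D FFT magnitude of each block (a simple 2-D STFT)."""
    h, w = image.shape
    win = np.hanning(block)[:, None] * np.hanning(block)[None, :]
    spectra = {}
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            sub = image[i:i + block, j:j + block] * win
            spectra[(i, j)] = np.abs(np.fft.fft2(sub))
    return spectra

# Synthetic sea clutter with one bright ship-like blob (illustrative only).
rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (128, 128))
img[76:84, 76:84] += 8.0                  # hypothetical target

spectra = blockwise_2d_stft(img)
# Crude stand-in for the paper's statistical model of valid frequency
# points: flag the block whose peak spectral magnitude deviates most
# from the sea-background level.
peaks = {pos: s.max() for pos, s in spectra.items()}
flagged = max(peaks, key=peaks.get)
print("candidate target block at", flagged)
```

The actual method models the distribution of spectral values of sea background and tests each block against it; the sketch only shows the divide-window-transform pipeline that precedes that modeling.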
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Hoffmann, Markus; Crone, Lisa; Dietzel, Erik; Paijo, Jennifer; González-Hernández, Mariana; Nehlmeier, Inga; Kalinke, Ulrich; Becker, Stephan; Pöhlmann, Stefan
2017-05-01
The large scale of the Ebola virus disease (EVD) outbreak in West Africa in 2013-2016 raised the question whether the host cell interactions of the responsible Ebola virus (EBOV) strain differed from those of other ebolaviruses. We previously reported that the glycoprotein (GP) of the virus circulating in West Africa in 2014 (EBOV2014) exhibited reduced ability to mediate entry into two nonhuman primate (NHP)-derived cell lines relative to the GP of EBOV1976. Here, we investigated the molecular determinants underlying the differential entry efficiency. We found that EBOV2014-GP-driven entry into diverse NHP-derived cell lines, as well as human monocyte-derived macrophages and dendritic cells, was reduced compared to EBOV1976-GP, although entry into most human- and all bat-derived cell lines tested was comparable. Moreover, EBOV2014 replication in NHP but not human cells was diminished relative to EBOV1976, suggesting that reduced cell entry translated into reduced viral spread. Mutagenic analysis of EBOV2014-GP and EBOV1976-GP revealed that an amino acid polymorphism in the receptor-binding domain, A82V, modulated entry efficiency in a cell line-independent manner and did not account for the reduced EBOV2014-GP-driven entry into NHP cells. In contrast, polymorphism T544I, located in the internal fusion loop in the GP2 subunit, was found to be responsible for the entry phenotype. These results suggest that position 544 is an important determinant of EBOV infectivity for both NHP and certain human target cells. IMPORTANCE The Ebola virus disease outbreak in West Africa in 2013 entailed more than 10,000 deaths. The scale of the outbreak and its dramatic impact on human health raised the question whether the responsible virus was particularly adept at infecting human cells. Our study shows that an amino acid exchange, A82V, that was acquired during the epidemic and that was not observed in previously circulating viruses, increases viral entry into diverse target cells. 
In contrast, the epidemic virus showed a reduced ability to enter cells of nonhuman primates compared to the virus circulating in 1976, and a single amino acid exchange in the internal fusion loop of the viral glycoprotein was found to account for this phenotype. Copyright © 2017 American Society for Microbiology.
NASA Astrophysics Data System (ADS)
Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.
2013-12-01
A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10⁶ cores and sustained performance over ~2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios.
Utilization of Large Scale Surface Models for Detailed Visibility Analyses
NASA Astrophysics Data System (ADS)
Caha, J.; Kačmařík, M.
2017-11-01
This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
Large scale analysis of signal reachability.
Todor, Andrei; Gabr, Haitham; Dobra, Alin; Kahveci, Tamer
2014-06-15
Major disorders, such as leukemia, have been shown to alter the transcription of genes. Understanding how gene regulation is affected by such aberrations is of utmost importance. One promising strategy toward this objective is to compute whether signals can reach the transcription factors through the transcription regulatory network (TRN). Due to the uncertainty of the regulatory interactions, this is a #P-complete problem and thus solving it for very large TRNs remains a challenge. We develop a novel and scalable method to compute the probability that a signal originating at any given set of source genes can arrive at any given set of target genes (i.e., transcription factors) when the topology of the underlying signaling network is uncertain. Our method tackles this problem for large networks while providing a provably accurate result. Our method follows a divide-and-conquer strategy. We break down the given network into a sequence of non-overlapping subnetworks such that reachability can be computed autonomously and sequentially on each subnetwork. We represent each interaction using a small polynomial. The product of these polynomials expresses the different scenarios in which a signal can or cannot reach the target genes from the source genes. We introduce polynomial collapsing operators for each subnetwork. These operators reduce the size of the resulting polynomial and thus the computational complexity dramatically. We show that our method scales to entire human regulatory networks in only seconds, while existing methods fail beyond a few tens of genes and interactions. We demonstrate that our method can successfully characterize key reachability characteristics of the entire transcription regulatory networks of patients affected by eight different subtypes of leukemia, as well as those from healthy control samples. All the datasets and code used in this article are available at bioinformatics.cise.ufl.edu/PReach/scalable.htm. © The Author 2014. 
Published by Oxford University Press.
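The core question the method answers, namely the probability that a signal reaches target genes over a network whose edges are uncertain, can be illustrated by brute-force enumeration on a toy network. The gene names and edge probabilities below are hypothetical; real TRNs need the paper's polynomial-collapsing approach precisely because plain enumeration is exponential in the number of edges:

```python
from itertools import product

def reach_probability(edges, sources, targets):
    """Exact probability that some target is reachable from some source
    when each directed edge exists independently with probability p.
    Brute-force over all edge subsets (fine only for toy networks)."""
    edge_list = list(edges.items())
    total = 0.0
    for present in product([False, True], repeat=len(edge_list)):
        prob = 1.0
        adj = {}
        for ((u, v), p), on in zip(edge_list, present):
            prob *= p if on else (1.0 - p)
            if on:
                adj.setdefault(u, []).append(v)
        # Depth-first search from the sources in this realization.
        seen, stack = set(sources), list(sources)
        while stack:
            u = stack.pop()
            for v in adj.get(u, []):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        if seen & set(targets):
            total += prob
    return total

# Hypothetical toy TRN: signal source S, intermediates A/B, target TF T.
edges = {("S", "A"): 0.9, ("A", "T"): 0.8, ("S", "B"): 0.5, ("B", "T"): 0.5}
print(round(reach_probability(edges, {"S"}, {"T"}), 4))
```

For these two edge-disjoint paths the answer is 1 - (1 - 0.9·0.8)(1 - 0.5·0.5) = 0.79, which the enumeration reproduces; the paper's contribution is getting the same quantity without touching all 2^|E| realizations.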
Gray matter alterations in chronic pain: A network-oriented meta-analytic approach
Cauda, Franco; Palermo, Sara; Costa, Tommaso; Torta, Riccardo; Duca, Sergio; Vercelli, Ugo; Geminiani, Giuliano; Torta, Diana M.E.
2014-01-01
Several studies have attempted to characterize morphological brain changes due to chronic pain. Although it has repeatedly been suggested that longstanding pain induces gray matter modifications, there is still some controversy surrounding the direction of the change (increase or decrease in gray matter) and the role of psychological and psychiatric comorbidities. In this study, we propose a novel, network-oriented, meta-analytic approach to characterize morphological changes in chronic pain. We used network decomposition to investigate whether different kinds of chronic pain are associated with a common or specific set of altered networks. Representational similarity techniques, network decomposition and model-based clustering were employed: i) to verify the presence of a core set of brain areas commonly modified by chronic pain; ii) to investigate the involvement of these areas from a large-scale network perspective; iii) to study the relationship between the altered networks; and iv) to find out whether chronic pain targets clusters of areas. Our results showed that chronic pain causes both core and pathology-specific gray matter alterations in large-scale networks. Common alterations were observed in the prefrontal regions, in the anterior insula, cingulate cortex, basal ganglia, thalamus, periaqueductal gray, post- and pre-central gyri and inferior parietal lobule. We observed that the salience and attentional networks were targeted in a very similar way by different chronic pain pathologies. Conversely, alterations in the sensorimotor and attention circuits were differentially targeted by chronic pain pathologies. Moreover, model-based clustering revealed that chronic pain, in line with some neurodegenerative diseases, selectively targets some large-scale brain networks. Altogether these findings indicate that chronic pain can be better conceived and studied from a network perspective. PMID:24936419
Bauer, Thomas R; Adler, Rima L; Hickstein, Dennis D
2009-01-01
Genetic mutations involving the cellular components of the hematopoietic system--red blood cells, white blood cells, and platelets--manifest clinically as anemia, infection, and bleeding. Although gene targeting has recapitulated many of these diseases in mice, these murine homologues are limited as translational models by their small size and brief life span as well as the fact that mutations induced by gene targeting do not always faithfully reflect the clinical manifestations of such mutations in humans. Many of these limitations can be overcome by identifying large animals with genetic diseases of the hematopoietic system corresponding to their human disease counterparts. In this article, we describe human diseases of the cellular components of the hematopoietic system that have counterparts in large animal species, in most cases carrying mutations in the same gene (CD18 in leukocyte adhesion deficiency) or genes in interacting proteins (DNA cross-link repair 1C protein and protein kinase, DNA-activated catalytic polypeptide in radiation-sensitive severe combined immunodeficiency). Furthermore, we describe the potential of these animal models to serve as disease-specific preclinical models for testing the efficacy and safety of clinical interventions such as hematopoietic stem cell transplantation or gene therapy before their use in humans with the corresponding disease.
FDTD method for laser absorption in metals for large scale problems.
Deng, Chun; Ki, Hyungson
2013-10-21
The FDTD method has been successfully used for many electromagnetic problems, but its application to laser material processing has been limited because even a several-millimeter domain requires a prohibitively large number of grid points. In this article, we present a novel FDTD method for simulating large-scale laser beam absorption problems, especially for metals, by enlarging the laser wavelength while maintaining the material's reflection characteristics. For validation purposes, the proposed method has been tested with in-house FDTD codes to simulate p-, s-, and circularly polarized 1.06 μm irradiation on Fe and Sn targets, and the simulation results are in good agreement with theoretical predictions.
NASA Astrophysics Data System (ADS)
Yen, H.; White, M. J.; Arnold, J. G.; Keitzer, S. C.; Johnson, M. V. V.; Atwood, J. D.; Daggupati, P.; Herbert, M. E.; Sowa, S. P.; Ludsin, S.; Robertson, D. M.; Srinivasan, R.; Rewa, C. A.
2016-12-01
Owing to substantial improvements in computer technology, large-scale watershed modeling has become practically feasible for conducting detailed investigations of hydrologic, sediment, and nutrient processes. In the Western Lake Erie Basin (WLEB), water quality issues caused by anthropogenic activities are not just interesting research subjects but also have implications for human health and welfare, as well as ecological integrity, resistance, and resilience. In this study, the Soil and Water Assessment Tool (SWAT) and the finest-resolution stream network, NHDPlus, were implemented on the WLEB to examine the interactions between achievable conservation scenarios and the corresponding additional projected costs. During the calibration/validation processes, both hard (temporal) and soft (non-temporal) data were used to ensure the modeling outputs are coherent with actual watershed behavior. The results showed that widespread adoption of conservation practices intended to provide erosion control could deliver average reductions of sediment and nutrients without additional nutrient management changes. On the other hand, responses of nitrate (NO3) and dissolved inorganic phosphorus (DIP) dynamics may differ from responses of total nitrogen and total phosphorus dynamics under the same conservation practice. Model results also implied that fewer financial resources are required to achieve conservation goals if the goal is to achieve reductions in targeted watershed outputs (e.g., NO3 or DIP) rather than aggregated outputs (e.g., total nitrogen or total phosphorus). In addition, it was found that the model's capacity to simulate seasonal effects and responses to changing conservation adoption on a seasonal basis could provide a useful index to help reduce additional costs through temporal targeting of conservation practices. Scientists, engineers, and stakeholders can use the work performed in this study as essential information in future policy-making processes.
Habitat degradation and fishing effects on the size structure of coral reef fish communities.
Wilson, S K; Fisher, R; Pratchett, M S; Graham, N A J; Dulvy, N K; Turner, R A; Cakacaka, A; Polunin, N V C
2010-03-01
Overfishing and habitat degradation through climate change pose the greatest threats to sustainability of marine resources on coral reefs. We examined how changes in fishing pressure and benthic habitat composition influenced the size spectra of island-scale reef fish communities in Lau, Fiji. Between 2000 and 2006 fishing pressure declined in the Lau Islands due to declining human populations and reduced demand for fresh fish. At the same time, coral cover declined and fine-scale architectural complexity eroded due to coral bleaching and outbreaks of crown-of-thorns starfish, Acanthaster planci. We examined the size distribution of reef fish communities using size spectra analysis, the linearized relationship between abundance and body size class. Spatial variation in fishing pressure accounted for 31% of the variation in the slope of the size spectra in 2000, higher fishing pressure being associated with a steeper slope, which is indicative of fewer large-bodied fish and/or more small-bodied fish. Conversely, in 2006 spatial variation in habitat explained 53% of the variation in the size spectra slopes, and the relationship with fishing pressure was much weaker (approximately 12% of variation) than in 2000. Reduced cover of corals and lower structural complexity was associated with less steep size spectra slopes, primarily due to reduced abundance of fish < 20 cm. Habitat degradation will compound effects of fishing on coral reefs as increased fishing reduces large-bodied target species, while habitat loss results in fewer small-bodied juveniles and prey that replenish stocks and provide dietary resources for predatory target species. Effective management of reef resources therefore depends on both reducing fishing pressure and maintaining processes that encourage rapid recovery of coral habitat.
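Size spectra analysis as described above reduces to fitting a line to log-transformed abundance against log body-size class; the slope is the statistic compared across islands and years. A minimal sketch with made-up survey counts (illustrative values only, not data from the study):

```python
import numpy as np

# Hypothetical counts of reef fish per body-size class (midpoints in cm).
size_class_mid = np.array([7.5, 12.5, 17.5, 22.5, 27.5, 32.5])  # cm
abundance = np.array([420.0, 180.0, 90.0, 40.0, 18.0, 8.0])     # fish per survey

# Size spectra analysis: regress log(abundance) on log(body-size class).
# A steeper (more negative) slope indicates fewer large-bodied fish
# and/or more small-bodied fish.
slope, intercept = np.polyfit(np.log10(size_class_mid), np.log10(abundance), 1)
print(f"size-spectrum slope: {slope:.2f}")
```

In the study, spatial variation in these fitted slopes is then regressed against fishing pressure and habitat variables to partition their explanatory power.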
Kushniruk, A; Kaipio, J; Nieminen, M; Hyppönen, H; Lääveri, T; Nohr, C; Kanstrup, A M; Berg Christiansen, M; Kuo, M-H; Borycki, E
2014-08-15
The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small scale qualitative studies involving usability testing of systems to larger scale national level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. It was found that regional and national usability studies can complement smaller scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data is collected on a large scale through use of widely distributed questionnaires and websites designed to monitor user perceptions of usability. As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large scale data on the usability of specific IT products is needed in order to complement smaller scale studies of specific systems.
A Magnetic Bead-Integrated Chip for the Large Scale Manufacture of Normalized esiRNAs
Wang, Zhao; Huang, Huang; Zhang, Hanshuo; Sun, Changhong; Hao, Yang; Yang, Junyu; Fan, Yu; Xi, Jianzhong Jeff
2012-01-01
The chemically-synthesized siRNA duplex has become a powerful and widely used tool for RNAi loss-of-function studies, but suffers from high off-target effects. Recently, endoribonuclease-prepared siRNA (esiRNA) has been shown to be an attractive alternative due to its lower off-target effect and cost effectiveness. However, the current manufacturing method for esiRNA is complicated, particularly with regard to purification and normalization on a large scale. In this study, we present a magnetic bead-integrated chip that can immobilize amplification or transcription products on beads and accomplish transcription, digestion, normalization and purification in a robust and convenient manner. This chip is equipped to manufacture ready-to-use esiRNAs on a large scale. Silencing specificity and efficiency of these esiRNAs were validated at the transcriptional, translational and functional levels. Manufacture of several normalized esiRNAs in a single well, including those silencing PARP1 and BRCA1, was successfully achieved, and the esiRNAs were subsequently utilized to effectively investigate their synergistic effect on cell viability. A small esiRNA library targeting 68 tyrosine kinase genes was constructed for a loss-of-function study, and four genes were identified as regulating the migration capability of HeLa cells. We believe that this approach provides a more robust and cost-effective choice for manufacturing esiRNAs than current approaches, and therefore these heterogeneous RNA strands may have utility in most intensive and extensive applications. PMID:22761791
2011-01-01
Background Since the classic Hopkins and Groom druggable genome review in 2002, there have been a number of publications updating both the hypothetical and successful human drug target statistics. However, listings of research targets that define the area between these two extremes are sparse because of the challenges of collating published information at the necessary scale. We have addressed this by interrogating databases, populated by expert curation, of bioactivity data extracted from patents and journal papers over the last 30 years. Results From a subset of just over 27,000 documents we have extracted a set of compound-to-target relationships for biochemical in vitro binding-type assay data for 1,736 human proteins and 1,654 gene identifiers. These are linked to 1,671,951 compound records derived from 823,179 unique chemical structures. The distribution showed a compounds-per-target average of 964 with a maximum of 42,869 (Factor Xa). The list includes non-targets, failed targets and cross-screening targets. The top-278 most actively pursued targets cover 90% of the compounds. We further investigated target ranking by determining the number of molecular frameworks and scaffolds. These were compared to the compound counts as alternative measures of chemical diversity on a per-target basis. Conclusions The compounds-per-protein listing generated in this work (provided as a supplementary file) represents the major proportion of the human drug target landscape defined by published data. We supplemented the simple ranking by the number of compounds assayed with additional rankings by molecular topology. These showed significant differences and provide complementary assessments of chemical tractability. PMID:21569515
Tatro, Erick T; Scott, Erick R; Nguyen, Timothy B; Salaria, Shahid; Banerjee, Sugato; Moore, David J; Masliah, Eliezer; Achim, Cristian L; Everall, Ian P
2010-04-26
HIV infection disturbs the central nervous system (CNS) through inflammation and glial activation. Evidence suggests roles for microRNA (miRNA) in host defense and neuronal homeostasis, though little is known about miRNAs' role in HIV CNS infection. MiRNAs are non-coding RNAs that regulate gene translation through post-transcriptional mechanisms. Messenger-RNA profiling alone is insufficient to elucidate the dynamic dance of molecular expression of the genome. We sought to clarify RNA alterations in the frontal cortex (FC) of HIV-infected individuals and those concurrently infected and diagnosed with major depressive disorder (MDD). This report is the first published study of large-scale miRNA profiling from human HIV-infected FC. The goals of this study were to: 1. Identify changes in miRNA expression that occurred in the frontal cortex (FC) of HIV individuals, 2. Determine whether miRNA expression profiles of the FC could differentiate HIV from HIV/MDD, and 3. Adapt a method to meaningfully integrate gene expression data and miRNA expression data in clinical samples. We isolated RNA from the FC (n = 3) of three separate groups (uninfected controls, HIV, and HIV/MDD) and then pooled the RNA within each group for use in large-scale miRNA profiling. RNA from HIV and HIV/MDD patients (n = 4 per group) were also used for non-pooled mRNA analysis on Affymetrix U133 Plus 2.0 arrays. We then utilized a method for integrating the two datasets in a Target Bias Analysis. We found miRNAs of three types: A) Those with many dysregulated mRNA targets of less stringent statistical significance, B) Fewer dysregulated target-genes of highly stringent statistical significance, and C) unclear bias. In HIV/MDD, more miRNAs were downregulated than in HIV alone. Specific miRNA families at targeted chromosomal loci were dysregulated. The dysregulated miRNAs clustered on Chromosomes 14, 17, 19, and X. 
A small subset of dysregulated genes had many 3' untranslated region (3'UTR) target-sites for dysregulated miRNAs. We provide evidence that certain miRNAs serve as key elements in gene regulatory networks in HIV-infected FC and may be implicated in neurobehavioral disorder. Finally, our data indicates that some genes may serve as hubs of miRNA activity.
Public Health Crisis in War and Conflict - Health Security in Aggregate.
Quinn, John; Zelený, Tomáš; Subramaniam, Rammika; Bencko, Vladimír
2017-03-01
Public health status of populations is multifactorial and, among other factors, is linked to war and conflict. Public health crises can erupt when states go to war or are invaded; health security may be reduced for affected populations. This study reviews, in aggregate, multiple indices of human security, human development and legitimacy of the state in order to describe a predictable global health portrait. A paradigm shift from large global powers to non-state actors and proxies that exert regional influence through scaled conflict presents major global health challenges for policy makers. Small-scale conflict with large-scale violence threatens health security for at-risk populations. The paper concludes that health security is directly proportional to state security. Copyright© by the National Institute of Public Health, Prague 2017
ERIC Educational Resources Information Center
Crane, Earl Newell
2013-01-01
The research problem that inspired this effort is the challenge of managing the security of systems in large-scale heterogeneous networked environments. Human intervention is slow and limited: humans operate at much slower speeds than networked computer communications and there are few humans associated with each network. Enabling each node in the…
Shvedova, Anna A.; Yanamala, Naveena; Kisin, Elena R.; Khailullin, Timur O.; Birch, M. Eileen; Fatkhutdinova, Liliya M.
2016-01-01
Background As the application of carbon nanotubes (CNT) in consumer products continues to rise, studies have expanded to determine the associated risks of exposure to human and environmental health. In particular, several lines of evidence indicate that exposure to multi-walled carbon nanotubes (MWCNT) could pose a carcinogenic risk similar to that of asbestos fibers. However, to date the potential markers of MWCNT exposure have not been explored in humans. Methods In the present study, global mRNA and ncRNA expression profiles in the blood of exposed workers, having direct contact with MWCNT aerosol for at least 6 months (n = 8), were compared with expression profiles of non-exposed (n = 7) workers (e.g., professional and/or technical staff) from the same manufacturing facility. Results Significant changes in the ncRNA and mRNA expression profiles were observed between exposed and non-exposed worker groups. An integrative analysis of ncRNA-mRNA correlations was performed to identify target genes, functional relationships, and regulatory networks in MWCNT-exposed workers. The coordinated changes in ncRNA and mRNA expression profiles revealed a set of miRNAs and their target genes with roles in cell cycle regulation/progression/control, apoptosis and proliferation. Further, the identified pathways and signaling networks also revealed the potential of MWCNT to trigger pulmonary and cardiovascular effects as well as carcinogenic outcomes in humans, similar to those previously described in rodents exposed to MWCNTs. Conclusion This study is the first to investigate aberrant changes in mRNA and ncRNA expression profiles in the blood of humans exposed to MWCNT. The significant changes in the expression of several miRNAs and mRNAs, as well as in their regulatory networks, are important for gaining molecular insights into MWCNT-induced toxicity and pathogenesis in humans. 
Further large-scale prospective studies are necessary to validate the potential applicability of such changes in mRNAs and miRNAs as prognostic markers of MWCNT exposures in humans. PMID:26930275
From drug to protein: using yeast genetics for high-throughput target discovery.
Armour, Christopher D; Lum, Pek Yee
2005-02-01
The budding yeast Saccharomyces cerevisiae has long been an effective eukaryotic model system for understanding basic cellular processes. The genetic tractability and ease of manipulation in the laboratory make yeast well suited for large-scale chemical and genetic screens. Several recent studies describing the use of yeast genetics for high-throughput drug target identification are discussed in this review.
Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency, and experimental noise must be appropriately accounted for to control for false positives.
Duvvuri, Subrahmanyam; McKeon, Beverley
2017-03-13
Phase relations between specific scales in a turbulent boundary layer are studied here by highlighting the associated nonlinear scale interactions in the flow. This is achieved through an experimental technique that allows for targeted forcing of the flow through the use of a dynamic wall perturbation. Two distinct large-scale modes with well-defined spatial and temporal wavenumbers were simultaneously forced in the boundary layer, and the resulting nonlinear response from their direct interactions was isolated from the turbulence signal for the study. This approach advances the traditional studies of large- and small-scale interactions in wall turbulence by focusing on the direct interactions between scales with triadic wavenumber consistency. The results are discussed in the context of modelling high Reynolds number wall turbulence. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
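The triadic mechanism described in this abstract can be illustrated numerically: a quadratic nonlinearity acting on two forced modes redistributes energy to their sum and difference wavenumbers. The sketch below is illustrative only; the frequencies, amplitudes, and grid size are arbitrary choices for the demo, not values from the experiment.

```python
# Illustrative sketch (not the authors' method): a quadratic nonlinearity
# acting on two forced modes at frequencies f1 and f2 generates energy at
# the triadic combinations f1 + f2 and f2 - f1.
import math

N = 1024                      # number of samples in the record
f1, f2 = 50, 80               # forced mode frequencies (cycles per record)

# Two large-scale input modes, and a quadratic (u^2) nonlinear response.
u = [math.cos(2 * math.pi * f1 * n / N) + math.cos(2 * math.pi * f2 * n / N)
     for n in range(N)]
response = [x * x for x in u]

def dft_power(signal, f):
    """Normalized power of the DFT of `signal` at integer frequency f."""
    re = sum(s * math.cos(2 * math.pi * f * n / N) for n, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * n / N) for n, s in enumerate(signal))
    return (re * re + im * im) / N**2

# The quadratic response carries peaks at the triad frequencies
# f1 + f2 = 130 and f2 - f1 = 30, which are absent from the input.
for f in (f2 - f1, f1 + f2):
    assert dft_power(response, f) > 100 * dft_power(u, f)
```

Expanding cos(a)·cos(b) into cos(a+b) and cos(a−b) shows why only scales with triadic wavenumber consistency receive energy from the direct interaction.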
To reduce the global burden of human schistosomiasis, use ‘old fashioned’ snail control
Sokolow, Susanne H.; Wood, Chelsea L.; Jones, Isabel J.; Lafferty, Kevin D.; Kuris, Armand; Hsieh, Michael H.; De Leo, Giulio A.
2018-01-01
Control strategies to reduce human schistosomiasis have evolved from ‘snail picking’ campaigns, a century ago, to modern wide-scale human treatment campaigns, or preventive chemotherapy. Unfortunately, despite the rise in preventive chemotherapy campaigns, just as many people suffer from schistosomiasis today as they did 50 years ago. Snail control can complement preventive chemotherapy by reducing the risk of transmission from snails to humans. Here, we present ideas for modernizing and scaling up snail control, including spatiotemporal targeting, environmental diagnostics, better molluscicides, new technologies (e.g., gene drive), and ‘outside the box’ strategies such as natural enemies, traps, and repellants. We conclude that, to achieve the World Health Assembly’s stated goal to eliminate schistosomiasis, it is time to give snail control another look.
Targeted Capture and High-Throughput Sequencing Using Molecular Inversion Probes (MIPs).
Cantsilieris, Stuart; Stessman, Holly A; Shendure, Jay; Eichler, Evan E
2017-01-01
Molecular inversion probes (MIPs) in combination with massively parallel DNA sequencing represent a versatile, yet economical tool for targeted sequencing of genomic DNA. Several thousand genomic targets can be selectively captured using long oligonucleotides containing unique targeting arms and universal linkers. The ability to append sequencing adaptors and sample-specific barcodes allows large-scale pooling and subsequent high-throughput sequencing at relatively low cost per sample. Here, we describe a "wet bench" protocol detailing the capture and subsequent sequencing of >2000 genomic targets from 192 samples, representative of a single lane on the Illumina HiSeq 2000 platform.
Human seizures couple across spatial scales through travelling wave dynamics
NASA Astrophysics Data System (ADS)
Martinet, L.-E.; Fiddyment, G.; Madsen, J. R.; Eskandar, E. N.; Truccolo, W.; Eden, U. T.; Cash, S. S.; Kramer, M. A.
2017-04-01
Epilepsy--the propensity toward recurrent, unprovoked seizures--is a devastating disease affecting 65 million people worldwide. Understanding and treating this disease remains a challenge, as seizures manifest through mechanisms and features that span spatial and temporal scales. Here we address this challenge through the analysis and modelling of human brain voltage activity recorded simultaneously across microscopic and macroscopic spatial scales. We show that during seizure large-scale neural populations spanning centimetres of cortex coordinate with small neural groups spanning cortical columns, and provide evidence that rapidly propagating waves of activity underlie this increased inter-scale coupling. We develop a corresponding computational model to propose specific mechanisms--namely, the effects of an increased extracellular potassium concentration diffusing in space--that support the observed spatiotemporal dynamics. Understanding the multi-scale, spatiotemporal dynamics of human seizures--and connecting these dynamics to specific biological mechanisms--promises new insights to treat this devastating disease.
Integrating visual learning within a model-based ATR system
NASA Astrophysics Data System (ADS)
Carlotto, Mark; Nebrich, Mark
2017-05-01
Automatic target recognition (ATR) systems, like human photo-interpreters, rely on a variety of visual information for detecting, classifying, and identifying manmade objects in aerial imagery. We describe the integration of a visual learning component into the Image Data Conditioner (IDC) for target/clutter and other visual classification tasks. The component is based on an implementation of a model of the visual cortex developed by Serre, Wolf, and Poggio. Visual learning in an ATR context requires the ability to recognize objects independent of location, scale, and rotation. Our method uses IDC to extract, rotate, and scale image chips at candidate target locations. A bootstrap learning method effectively extends the operation of the classifier beyond the training set and provides a measure of confidence. We show how the classifier can be used to learn other features that are difficult to compute from imagery such as target direction, and to assess the performance of the visual learning process itself.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aleksandrova, I. V.; Koresheva, E. R., E-mail: elena.koresheva@gmail.com; Krokhin, O. N.
2016-12-15
In inertial fusion energy research, considerable attention has recently been focused on low-cost fabrication of a large number of targets by developing a specialized layering module of repeatable operation. The targets must be free-standing, or unmounted. Therefore, the development of a target factory for inertial confinement fusion (ICF) is based on methods that can ensure a cost-effective target production with high repeatability. Minimization of the amount of tritium (i.e., minimization of time and space at all production stages) is a necessary condition as well. Additionally, the cryogenic hydrogen fuel inside the targets must have a structure (ultrafine layers—the grain size should be scaled back to the nanometer range) that supports the fuel layer survivability under target injection and transport through the reactor chamber. To meet the above requirements, significant progress has been made at the Lebedev Physical Institute (LPI) in the technology developed on the basis of rapid fuel layering inside moving free-standing targets (FST), also referred to as the FST layering method. Owing to the research carried out at LPI, unique experience has been gained in the development of the FST-layering module for target fabrication with an ultrafine fuel layer, including a reactor-scale target design. This experience can be used for the development of the next-generation FST-layering module for construction of a prototype of a target factory for power laser facilities and inertial fusion power plants.
A Multiscale Survival Process for Modeling Human Activity Patterns.
Zhang, Tianyang; Cui, Peng; Song, Chaoming; Zhu, Wenwu; Yang, Shiqiang
2016-01-01
Human activity plays a central role in understanding large-scale social dynamics. It is well documented that individual activity patterns follow bursty dynamics characterized by heavy-tailed interevent time distributions. Here we study a large-scale online chatting dataset consisting of 5,549,570 users, finding that individual activity patterns vary with timescale, whereas existing models only approximate empirical observations within a limited timescale. We propose a novel approach that models the intensity rate at which an individual triggers an activity. We demonstrate that the model precisely captures the corresponding human dynamics across multiple timescales spanning five orders of magnitude. Our model also allows extracting the population heterogeneity of activity patterns, characterized by a set of individual-specific ingredients. Integrating our approach with social interactions leads to a wide range of implications.
Wearable Wide-Range Strain Sensors Based on Ionic Liquids and Monitoring of Human Activities
Zhang, Shao-Hui; Wang, Feng-Xia; Li, Jia-Jia; Peng, Hong-Dan; Yan, Jing-Hui; Pan, Ge-Bo
2017-01-01
Wearable sensors for detection of human activities have encouraged the development of highly elastic sensors. In particular, to capture subtle and large-scale body motion, stretchable and wide-range strain sensors are highly desired, but remain a challenge. Herein, a highly stretchable and transparent strain sensor based on ionic liquids and an elastic polymer has been developed. The as-obtained sensor exhibits impressive stretchability over a wide strain range (from 0.1% to 400%), good bending properties, and high sensitivity, with a gauge factor reaching 7.9. Importantly, the sensors show excellent biological compatibility and succeed in monitoring diverse human activities, ranging from complex large-scale multidimensional motions to subtle signals, including wrist, finger and elbow joint bending, finger touch, breath, speech, swallowing behavior and pulse wave. PMID:29135928
Kaufmann, Markus; Schuffenhauer, Ansgar; Fruh, Isabelle; Klein, Jessica; Thiemeyer, Anke; Rigo, Pierre; Gomez-Mancilla, Baltazar; Heidinger-Millot, Valerie; Bouwmeester, Tewis; Schopfer, Ulrich; Mueller, Matthias; Fodor, Barna D; Cobos-Correa, Amanda
2015-10-01
Fragile X syndrome (FXS) is the most common form of inherited mental retardation, and in most cases it is caused by epigenetic silencing of the Fmr1 gene. Today, no specific therapy exists for FXS, and current treatments are directed only at improving behavioral symptoms. Neuronal progenitors derived from FXS patient induced pluripotent stem cells (iPSCs) represent a unique model to study the disease and develop assays for large-scale drug discovery screens, since they retain the silenced Fmr1 gene within the disease context. We have established a high-content imaging assay to run a large-scale phenotypic screen aimed at identifying compounds that reactivate the silenced Fmr1 gene. A set of 50,000 compounds was tested, including modulators of several epigenetic targets. We describe an integrated drug discovery model comprising iPSC generation, culture scale-up, quality control, and screening with a very sensitive high-content imaging assay assisted by single-cell image analysis and multiparametric data analysis based on machine learning algorithms. The screening identified several compounds that induced a weak expression of fragile X mental retardation protein (FMRP) and thus sets the basis for further large-scale screens to find candidate drugs or targets tackling the underlying mechanism of FXS with potential for therapeutic intervention. © 2015 Society for Laboratory Automation and Screening.
Choi, Seunghee; Coon, Joshua J.; Goggans, Matthew Scott; Kreisman, Thomas F.; Silver, Daniel M.; Nesson, Michael H.
2016-01-01
Many of the materials that are challenging for large animals to cut or puncture are also cut and punctured by much smaller organisms that are limited to much smaller forces. Small organisms can overcome their force limitations by using sharper tools, but one drawback may be an increased susceptibility to fracture. We use simple contact mechanics models to estimate how much smaller the diameter of the tips or edges of tools such as teeth, claws and cutting blades must be in smaller organisms in order for them to puncture or cut the same materials as larger organisms. In order to produce the same maximum stress when maximum force scales as the square of body length, the diameter of the tool region that is in contact with the target material must scale isometrically for punch-like tools (e.g. scorpion stings) on thick targets, and for crushing tools (e.g. molars). For punch-like tools on thin targets, and for cutting blades on thick targets, the tip or edge diameters must be even smaller than expected from isometry in smaller animals. The diameters of a small sample of unworn punch-like tools from a large range of animal sizes are consistent with the model, scaling isometrically or more steeply (positively allometric). In addition, we find that the force required to puncture a thin target using real biological tools scales linearly with tip diameter, as predicted by the model. We argue that, for smaller tools, the minimum energy to fracture the tool will be a greater fraction of the minimum energy required to puncture the target, making fracture more likely. Finally, energy stored in tool bending, relative to the energy to fracture the tool, increases rapidly with the aspect ratio (length/width), and we expect that smaller organisms often have to employ higher aspect ratio tools in order to puncture or cut to the required depth with available force. The extra stored energy in higher aspect ratio tools is likely to increase the probability of fracture. 
We discuss some of the implications of the suggested scaling rules and possible adaptations to compensate for fracture sensitivity in smaller organisms. PMID:27274804
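The isometric-scaling argument for punch-like tools can be written out directly. The sketch below is a toy calculation with made-up constants (not the authors' data or model): it assumes peak contact stress scales as force over tip area, and maximum force as the square of body length.

```python
# Minimal sketch of the isometric-scaling argument: for a punch-like
# tool, peak contact stress ~ F / d^2. With maximum force F ~ L^2
# (muscle cross-section), keeping stress constant across body sizes L
# requires tip diameter d ~ L, i.e. isometric scaling.
import math

def punch_stress(force, tip_diameter):
    """Nominal contact stress of a flat punch: F / (pi d^2 / 4)."""
    return force / (math.pi * tip_diameter**2 / 4)

def required_tip_diameter(body_length, stress_target, k_force=1.0):
    """Tip diameter needed to reach `stress_target` when F = k * L^2."""
    force = k_force * body_length**2
    return math.sqrt(4 * force / (math.pi * stress_target))

# Diameter scales linearly with body length (isometry): d(2L) = 2 d(L).
d1 = required_tip_diameter(1.0, stress_target=100.0)
d2 = required_tip_diameter(2.0, stress_target=100.0)
assert abs(d2 / d1 - 2.0) < 1e-9
```

For thin targets or cutting blades the abstract notes the required scaling is even steeper than this isometric baseline.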
Kennedy, Jacob J.; Abbatiello, Susan E.; Kim, Kyunggon; Yan, Ping; Whiteaker, Jeffrey R.; Lin, Chenwei; Kim, Jun Seok; Zhang, Yuzheng; Wang, Xianlong; Ivey, Richard G.; Zhao, Lei; Min, Hophil; Lee, Youngju; Yu, Myeong-Hee; Yang, Eun Gyeong; Lee, Cheolju; Wang, Pei; Rodriguez, Henry; Kim, Youngsoo; Carr, Steven A.; Paulovich, Amanda G.
2014-01-01
The successful application of MRM in biological specimens raises the exciting possibility that assays can be configured to measure all human proteins, resulting in an assay resource that would promote advances in biomedical research. We report the results of a pilot study designed to test the feasibility of a large-scale, international effort in MRM assay generation. We have configured, validated across three laboratories, and made publicly available as a resource to the community 645 novel MRM assays representing 319 proteins expressed in human breast cancer. Assays were multiplexed in groups of >150 peptides and deployed to quantify endogenous analyte in a panel of breast cancer-related cell lines. Median assay precision was 5.4%, with high inter-laboratory correlation (R2 >0.96). Peptide measurements in breast cancer cell lines were able to discriminate amongst molecular subtypes and identify genome-driven changes in the cancer proteome. These results establish the feasibility of a scaled, international effort. PMID:24317253
2001-05-01
[Fragmented record; only table-of-contents residue survives. Recoverable content: primary isolates could retain gp120 in an oligomer; a large-scale purification scheme was developed using lentil lectin affinity and size-exclusion chromatography; section headings cover Western blot analysis, large-scale protein expression and purification, metabolic labeling, large-scale antigen preparation and analysis, and binding of cleaved, soluble, crosslinked primary-isolate HIV-1 Env.]
Comprehensive prediction of drug-protein interactions and side effects for the human proteome
Zhou, Hongyi; Gao, Mu; Skolnick, Jeffrey
2015-01-01
Identifying unexpected drug-protein interactions is crucial for drug repurposing. We develop a comprehensive proteome scale approach that predicts human protein targets and side effects of drugs. For drug-protein interaction prediction, FINDSITEcomb, whose average precision is ~30% and recall ~27%, is employed. For side effect prediction, a new method is developed with a precision of ~57% and a recall of ~24%. Our predictions show that drugs are quite promiscuous, with the average (median) number of human targets per drug of 329 (38), while a given protein interacts with 57 drugs. The result implies that drug side effects are inevitable and existing drugs may be useful for repurposing, with only ~1,000 human proteins likely causing serious side effects. A killing index derived from serious side effects has a strong correlation with FDA approved drugs being withdrawn. Therefore, it provides a pre-filter for new drug development. The methodology is free to the academic community on the DR. PRODIS (DRugome, PROteome, and DISeasome) webserver at http://cssb.biology.gatech.edu/dr.prodis/. DR. PRODIS provides protein targets of drugs, drugs for a given protein target, associated diseases and side effects of drugs, as well as an interface for the virtual target screening of new compounds. PMID:26057345
Humans and seasonal climate variability threaten large-bodied coral reef fish with small ranges.
Mellin, C; Mouillot, D; Kulbicki, M; McClanahan, T R; Vigliola, L; Bradshaw, C J A; Brainard, R E; Chabanet, P; Edgar, G J; Fordham, D A; Friedlander, A M; Parravicini, V; Sequeira, A M M; Stuart-Smith, R D; Wantiez, L; Caley, M J
2016-02-03
Coral reefs are among the most species-rich and threatened ecosystems on Earth, yet the extent to which human stressors determine species occurrences, compared with biogeography or environmental conditions, remains largely unknown. With ever-increasing human-mediated disturbances on these ecosystems, an important question is not only how many species can inhabit local communities, but also which biological traits determine species that can persist (or not) above particular disturbance thresholds. Here we show that human pressure and seasonal climate variability are disproportionately and negatively associated with the occurrence of large-bodied and geographically small-ranging fishes within local coral reef communities. These species are 67% less likely to occur where human impact and temperature seasonality exceed critical thresholds, such as in the marine biodiversity hotspot: the Coral Triangle. Our results identify the most sensitive species and critical thresholds of human and climatic stressors, providing opportunity for targeted conservation intervention to prevent local extinctions.
Social sciences in Puget Sound recovery
Katharine F. Wellman; Kelly Biedenweg; Kathleen Wolf
2014-01-01
Advancing the recovery of large-scale ecosystems, such as the Puget Sound in Washington State, requires improved knowledge of the interdependencies between nature and humans in that basin region. As Biedenweg et al. (this issue) illustrate, human wellbeing and human behavior do not occur independently of the biophysical environment. Natural environments contribute to...
Wang, Lu-Yong; Fasulo, D
2006-01-01
Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., the Fisher exact test) have been used to single out non-associated SNPs. However, disease-susceptible SNPs may have small marginal effects in the population and are unlikely to be retained after univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of disease. The more recent random forest method provides a more robust way to screen SNPs at the scale of thousands. However, for larger-scale data, such as Affymetrix Human Mapping 100K GeneChip data, a faster method is required to screen SNPs in whole-genome, large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for the more complex computational modeling task that follows.
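As a rough illustration of boosting-based screening (a toy sketch, not the authors' algorithm; the synthetic data generator and all parameters here are invented for the demo), AdaBoost with one-SNP decision stumps can rank SNPs by their accumulated stump weight:

```python
# Toy sketch of boosting-based SNP screening (illustrative only).
# AdaBoost with one-SNP decision stumps runs on synthetic genotype
# data; SNPs are then ranked by total stump weight (alpha).
import math, random

random.seed(0)
N_SNPS, N_SAMPLES, ROUNDS = 30, 400, 20

# Synthetic data: SNP 0 carries signal, the rest are noise.
X = [[random.randint(0, 1) for _ in range(N_SNPS)] for _ in range(N_SAMPLES)]
y = [1 if row[0] == 1 and random.random() < 0.9 else -1 for row in X]

w = [1.0 / N_SAMPLES] * N_SAMPLES        # sample weights
importance = [0.0] * N_SNPS              # accumulated stump weights

for _ in range(ROUNDS):
    # Pick the stump (SNP j, polarity s) with the lowest weighted error.
    err, j, s = min(
        (sum(wi for xi, yi, wi in zip(X, y, w)
             if (1 if xi[j] == 1 else -1) * s != yi), j, s)
        for j in range(N_SNPS) for s in (1, -1))
    err = max(err, 1e-10)
    alpha = 0.5 * math.log((1 - err) / err)
    importance[j] += alpha
    # Reweight: boost the misclassified samples.
    w = [wi * math.exp(-alpha * yi * (1 if xi[j] == 1 else -1) * s)
         for xi, yi, wi in zip(X, y, w)]
    total = sum(w)
    w = [wi / total for wi in w]

# The causal SNP reliably accumulates a large importance weight.
assert importance[0] > 0.5
```

In a real screen, the importance ranking would limit the candidate SNP set passed on to heavier modeling, which is the role the boosting step plays in the abstract.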
Ivarsson, Ylva; Arnold, Roland; McLaughlin, Megan; Nim, Satra; Joshi, Rakesh; Ray, Debashish; Liu, Bernard; Teyra, Joan; Pawson, Tony; Moffat, Jason; Li, Shawn Shun-Cheng; Sidhu, Sachdev S; Kim, Philip M
2014-02-18
The human proteome contains a plethora of short linear motifs (SLiMs) that serve as binding interfaces for modular protein domains. Such interactions are crucial for signaling and other cellular processes, but are difficult to detect because of their low to moderate affinities. Here we developed a dedicated approach, proteomic peptide-phage display (ProP-PD), to identify domain-SLiM interactions. Specifically, we generated phage libraries containing all human and viral C-terminal peptides using custom oligonucleotide microarrays. With these libraries we screened the nine PSD-95/Dlg/ZO-1 (PDZ) domains of human Densin-180, Erbin, Scribble, and Disks large homolog 1 for peptide ligands. We identified several known and putative interactions potentially relevant to cellular signaling pathways and confirmed interactions between full-length Scribble and the target proteins β-PIX, plakophilin-4, and guanylate cyclase soluble subunit α-2 using colocalization and coimmunoprecipitation experiments. The affinities of recombinant Scribble PDZ domains and the synthetic peptides representing the C termini of these proteins were in the 1- to 40-μM range. Furthermore, we identified several well-established host-virus protein-protein interactions, and confirmed that PDZ domains of Scribble interact with the C terminus of Tax-1 of human T-cell leukemia virus with micromolar affinity. Previously unknown putative viral protein ligands for the PDZ domains of Scribble and Erbin were also identified. Thus, we demonstrate that our ProP-PD libraries are useful tools for probing PDZ domain interactions. The method can be extended to interrogate all potential eukaryotic, bacterial, and viral SLiMs and we suggest it will be a highly valuable approach for studying cellular and pathogen-host protein-protein interactions.
The co-evolution of social institutions, demography, and large-scale human cooperation.
Powers, Simon T; Lehmann, Laurent
2013-11-01
Human cooperation is typically coordinated by institutions, which determine the outcome structure of the social interactions individuals engage in. Explaining the Neolithic transition from small- to large-scale societies involves understanding how these institutions co-evolve with demography. We study this using a demographically explicit model of institution formation in a patch-structured population. Each patch supports both social and asocial niches. Social individuals create an institution, at a cost to themselves, by negotiating how much of the costly public good provided by cooperators is invested into sanctioning defectors. The remainder of their public good is invested in technology that increases carrying capacity, such as irrigation systems. We show that social individuals can invade a population of asocials, and form institutions that support high levels of cooperation. We then demonstrate conditions where the co-evolution of cooperation, institutions, and demographic carrying capacity creates a transition from small- to large-scale social groups. © 2013 John Wiley & Sons Ltd/CNRS.
Introduction: demography and cultural macroevolution.
Steele, James; Shennan, Stephen
2009-04-01
The papers in this special issue of Human Biology, which derive from a conference sponsored by the Arts and Humanities Research Council (AHRC) Center for the Evolution of Cultural Diversity, lay some of the foundations for an empirical macroevolutionary analysis of cultural dynamics. Our premise here is that cultural dynamics-including the stability of traditions and the rate of origination of new variants-are influenced by independently occurring demographic processes (population size, structure, and distribution as these vary over time as a result of changes in rates of fertility, mortality, and migration). The contributors focus on three sets of problems relevant to empirical studies of cultural macroevolution: large-scale reconstruction of past population dynamics from archaeological and genetic data; juxtaposition of models and evidence of cultural dynamics using large-scale archaeological and historical data sets; and juxtaposition of models and evidence of cultural dynamics from large-scale linguistic data sets. In this introduction we outline some of the theoretical and methodological issues and briefly summarize the individual contributions.
Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.
Ernst, Jason; Kellis, Manolis
2015-04-01
With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.
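The imputation idea (predicting an unobserved signal track from correlated observed tracks) can be sketched in miniature. ChromImpute itself uses an ensemble of regression trees; for brevity, the stand-in predictor below is a correlation-weighted average of observed tracks, which is an assumption of this demo only, not the paper's method.

```python
# Toy sketch of signal-track imputation from correlated tracks.
# Stand-in predictor: correlation-weighted average (NOT ChromImpute's
# regression-tree ensemble; a simplification for illustration).
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    denom = va * vb
    return cov / denom if denom else 0.0

def impute(observed_tracks, target_train, train_idx, predict_idx):
    """Impute target values at predict_idx from tracks observed everywhere.

    Correlations are estimated where the target was measured (train_idx),
    then used to weight each observed track's contribution elsewhere.
    """
    weights = [max(0.0, pearson([t[i] for i in train_idx], target_train))
               for t in observed_tracks]
    total = sum(weights) or 1.0
    return [sum(wt * t[i] for wt, t in zip(weights, observed_tracks)) / total
            for i in predict_idx]
```

A track that correlates strongly with the target where both are observed dominates the prediction at unobserved positions, which is the correlation-exploiting intuition the abstract describes.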
Expansion of Human Mesenchymal Stem Cells in a Microcarrier Bioreactor.
Tsai, Ang-Chen; Ma, Teng
2016-01-01
Human mesenchymal stem cells (hMSCs) are considered a primary candidate in cell therapy owing to their self-renewability, high differentiation capability, and secretion of trophic factors. In clinical applications, a large quantity of therapeutically competent hMSCs is required, which cannot be produced in conventional petri dish culture. Bioreactors are scalable and have the capacity to meet this production demand. Microcarrier suspension culture in stirred-tank bioreactors is the most widely used method to expand anchorage-dependent cells at large scale: stirred-tank bioreactors can be scaled up, and microcarriers provide a high surface-to-volume ratio. As a result, spinner flask bioreactors with microcarriers have been commonly used in large-scale expansion of adherent cells. This chapter describes a detailed culture protocol for hMSC expansion in a 125 mL spinner flask using Cytodex I microcarriers, including procedures for cell seeding, expansion, metabolic sampling, and quantification and visualization using microculture tetrazolium (MTT) reagent.
NASA Astrophysics Data System (ADS)
Singh, Sarabjeet; Schneider, David J.; Myers, Christopher R.
2014-03-01
Branching processes have served as a model for chemical reactions, biological growth processes, and contagion (of disease, information, or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this work we focus on coupled branching processes as a model of infectious diseases spreading from one population to another. An exceedingly important example of such coupled outbreaks is zoonotic infections that spill over from animal populations to humans. We derive several statistical quantities characterizing the first spillover event from animals to humans, including the probability of spillover, the first passage time distribution for human infection, and disease prevalence in the animal population at spillover. Large stochastic fluctuations in those quantities can make inference of the state of the system at the time of spillover difficult. Focusing on outbreaks in the human population, we then characterize the critical threshold for a large outbreak, the distribution of outbreak sizes, and associated scaling laws. These all show a strong dependence on the basic reproduction number in the animal population and indicate the existence of a novel multicritical point with altered scaling behavior. The coupling of animal and human infection dynamics has crucial implications, most importantly allowing for the possibility of large human outbreaks even when human-to-human transmission is subcritical.
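A Monte Carlo sketch of one such coupled process (a toy model in the spirit of the abstract, not the authors' derivation; all parameters are arbitrary) estimates the probability that a subcritical animal outbreak produces at least one human spillover:

```python
# Illustrative Monte Carlo sketch of a coupled branching process: an
# animal outbreak evolves as a Galton-Watson process with Poisson
# offspring (mean r0_animal), and each animal case independently seeds
# a human infection with probability p_spill. Toy parameters only.
import math, random

random.seed(1)

def poisson(lam):
    """Knuth's algorithm for a Poisson draw (stdlib only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def spillover_occurs(r0_animal=0.8, p_spill=0.05, max_cases=10_000):
    """Run one animal outbreak; return True if any case spills to humans."""
    active, total = 1, 0
    while active and total < max_cases:
        active = sum(poisson(r0_animal) for _ in range(active))
        total += active
        # Each new animal case independently tries to infect a human.
        if any(random.random() < p_spill for _ in range(active)):
            return True
    return False

runs = 2000
p_hat = sum(spillover_occurs() for _ in range(runs)) / runs
# With a subcritical animal outbreak (R0 < 1), spillover is possible
# but far from certain.
assert 0.0 < p_hat < 1.0
```

Repeating this with the human side modeled as its own branching process would expose the coupling effect the abstract emphasizes: large human outbreaks become possible even when human-to-human transmission is subcritical.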
Smith, Bruce D.
2011-01-01
Niche construction efforts by small-scale human societies that involve ‘wild’ species of plants and animals are organized into a set of six general categories based on the shared characteristics of the target species and similar patterns of human management and manipulation: (i) general modification of vegetation communities, (ii) broadcast sowing of wild annuals, (iii) transplantation of perennial fruit-bearing species, (iv) in-place encouragement of economically important perennials, (v) transplantation and in-place encouragement of perennial root crops, and (vi) landscape modification to increase prey abundance in specific locations. Case study examples, mostly drawn from North America, are presented for each of the six general categories of human niche construction. These empirically documented categories of ecosystem engineering form the basis for a predictive model that outlines potential general principles and commonalities in how small-scale human societies worldwide have modified and manipulated their ‘natural’ landscapes throughout the Holocene. PMID:21320898
Active Guidance of a Handheld Micromanipulator using Visual Servoing.
Becker, Brian C; Voros, Sandrine; Maclachlan, Robert A; Hager, Gregory D; Riviere, Cameron N
2009-05-12
In microsurgery, a surgeon often deals with anatomical structures of sizes that are close to the limit of human hand accuracy. Robotic assistants can help push beyond the current state of practice by integrating imaging and robot-assisted tools. This paper demonstrates control of a handheld tremor-reduction micromanipulator with visual servo techniques, aiding the operator by providing three behaviors: snap-to, motion-scaling, and standoff-regulation. A stereo camera setup viewing the workspace under high magnification tracks the tip of the micromanipulator and the desired target object being manipulated. Individual behaviors activate in task-specific situations when the micromanipulator tip is in the vicinity of the target. We show that the snap-to behavior can reach and maintain a position at a target with an accuracy of 17.5 ± 0.4 μm root-mean-squared error (RMSE) distance between the tip and target. Scaling the operator's motions and preventing unwanted contact with non-target objects also provide a larger margin of safety.
Identifying and modeling the structural discontinuities of human interactions
NASA Astrophysics Data System (ADS)
Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo
2017-04-01
The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same lines, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls, both mobile and landline, and in either case uncover a systematic decrease of communication induced by borders, which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model, and predict social activities and to plan the development of infrastructures across multiple scales.
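The border-damping idea can be illustrated with a hedged sketch: a standard gravity interaction model whose cross-border flows are scaled down by a multiplicative factor. All names and parameter values here are illustrative assumptions, not the paper's fitted model:

```python
def gravity_flow(pop_i, pop_j, distance, *, gamma=2.0, k=1.0):
    """Classic gravity model: flow proportional to the product of the
    two populations divided by distance**gamma."""
    return k * pop_i * pop_j / distance ** gamma

def damped_flow(pop_i, pop_j, distance, same_region, *, damping=0.5, **kw):
    """Multiply cross-border (different-region) flows by a damping
    factor in (0, 1]; within-region flows are left unchanged."""
    base = gravity_flow(pop_i, pop_j, distance, **kw)
    return base if same_region else damping * base

within = damped_flow(1e5, 2e5, 50.0, True)
across = damped_flow(1e5, 2e5, 50.0, False)
print(within, across)  # cross-border flow is half the within-region flow
```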
Large-scale absence of sharks on reefs in the greater-Caribbean: a footprint of human pressures.
Ward-Paige, Christine A; Mora, Camilo; Lotze, Heike K; Pattengill-Semmens, Christy; McClenachan, Loren; Arias-Castro, Ery; Myers, Ransom A
2010-08-05
In recent decades, large pelagic and coastal shark populations have declined dramatically with increased fishing; however, the status of sharks in other systems such as coral reefs remains largely unassessed despite a long history of exploitation. Here we explore the contemporary distribution and sighting frequency of sharks on reefs in the greater-Caribbean and assess the possible role of human pressures in observed patterns. We analyzed 76,340 underwater surveys carried out by trained volunteer divers between 1993 and 2008. Surveys were grouped within one-km² cells, which allowed us to determine the contemporary geographical distribution and sighting frequency of sharks. Sighting frequency was calculated as the ratio of surveys with sharks to the total number of surveys in each cell. We compared sighting frequency to the number of people in the cell vicinity and used population viability analyses to assess the effects of exploitation on population trends. Sharks, with the exception of nurse sharks, occurred mainly in areas with very low human population or strong fishing regulations and marine conservation. Population viability analysis suggests that exploitation alone could explain the large-scale absence; however, this pattern is likely to be exacerbated by additional anthropogenic stressors, such as pollution and habitat degradation, that also correlate with human population. Human pressures in coastal zones have led to the broad-scale absence of sharks on reefs in the greater-Caribbean. Preventing further loss of sharks requires urgent management measures to curb fishing mortality and to mitigate other anthropogenic stressors to protect sites where sharks still exist. The fact that sharks still occur in some densely populated areas where strong fishing regulations are in place indicates the possibility of success and encourages the implementation of conservation measures.
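The per-cell sighting frequency described above (surveys with sharks divided by total surveys in each cell) can be sketched as follows; the function name and sample records are hypothetical:

```python
from collections import defaultdict

def sighting_frequency(surveys):
    """surveys: iterable of (cell_id, shark_seen) pairs.
    Returns {cell_id: surveys_with_sharks / total_surveys_in_cell}."""
    seen = defaultdict(int)
    total = defaultdict(int)
    for cell, shark_seen in surveys:
        total[cell] += 1
        seen[cell] += bool(shark_seen)
    return {cell: seen[cell] / total[cell] for cell in total}

# Toy records: three surveys in cell "A" (one with a shark), one in "B".
freq = sighting_frequency([("A", True), ("A", False), ("A", False), ("B", True)])
print(freq)  # cell A: 1/3, cell B: 1.0
```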
PHOBOS Exploration using Two Small Solar Electric Propulsion Spacecraft
NASA Technical Reports Server (NTRS)
Lang, Jared J.; Baker, John D.; Castillo-Rogez, Julie C.; McElrath, Timothy P.; Piacentine, Jamie S.; Snyder, J. Steve
2012-01-01
Primitive bodies are exciting targets for exploration as they provide clues to early Solar System conditions and dynamical evolution. The two moons of Mars are particularly interesting because of their proximity to an astrobiological target. However, after four decades of Mars exploration, their origin and nature remain enigmatic. In addition, when considering the long-term objectives of the flexible path for potential human exploration of Mars, Phobos and Deimos present exciting intermediate opportunities without the complication and expense of landing on and ascending from the surface. As interest in these targets for the next frontier of human exploration grows, characterization missions designed specifically to examine surface properties, landing environments, and surface mapping prior to human exploration are becoming increasingly important. A precursor mission concept of this sort has been developed using two identical spacecraft built from low-cost, flight-proven, and certified off-the-shelf components and utilizing Solar Electric Propulsion (SEP) to orbit both targets as secondary payloads launched aboard any NASA or GTO launch. This precursor mission has the potential to address both strategic knowledge gaps in precursor measurements and decadal science, including soil physical properties at the global and local (human) scale and the search for in situ resources.
Identifying Attenuating Mutations: Tools for a New Vaccine Design against Flaviviruses.
Khou, Cécile; Pardigon, Nathalie
2017-01-01
Emerging Flaviviruses pose an increasing threat to global human health. To date, human vaccines against yellow fever virus (YFV), Japanese encephalitis virus (JEV), dengue virus (DV), and tick-borne encephalitis virus (TBEV) exist. However, there is no human vaccine against other Flaviviruses such as Zika virus (ZIKV) and West Nile virus (WNV). In order to restrict their spread and to protect populations against the diseases they induce, vaccines against these emerging viruses must be designed. Obtaining new live attenuated Flavivirus vaccines using molecular biology methods is now possible. Molecular infectious clones of the parental viruses are relatively easy to generate. Key mutations present in live attenuated vaccines or mutations known to have a key role in the Flavivirus life cycle and/or interactions with their hosts can be identified by sequencing, and are then inserted in infectious clones by site-directed mutagenesis. More recently, the use of chimeric viruses and large-scale reencoding and introduction of microRNA target sequences have also been tested. Indeed, a combination of these methods will help in designing new generations of vaccines against emerging and reemerging Flaviviruses. © 2017 S. Karger AG, Basel.
Cheng, Chialin; Fass, Daniel M; Folz-Donahue, Kat; MacDonald, Marcy E; Haggarty, Stephen J
2017-01-11
Reprogramming of human somatic cells into induced pluripotent stem (iPS) cells has greatly expanded the set of research tools available to investigate the molecular and cellular mechanisms underlying central nervous system (CNS) disorders. Realizing the promise of iPS cell technology for the identification of novel therapeutic targets and for high-throughput drug screening requires implementation of methods for the large-scale production of defined CNS cell types. Here we describe a protocol for generating stable, highly expandable, iPS cell-derived CNS neural progenitor cells (NPC) using multi-dimensional fluorescence activated cell sorting (FACS) to purify NPC defined by cell surface markers. In addition, we describe a rapid, efficient, and reproducible method for generating excitatory cortical-like neurons from these NPC through inducible expression of the pro-neural transcription factor Neurogenin 2 (iNgn2-NPC). Finally, we describe methodology for the use of iNgn2-NPC for probing human neuroplasticity and mechanisms underlying CNS disorders using high-content, single-cell-level automated microscopy assays. © 2017 by John Wiley & Sons, Inc.
Mercury as a Global Pollutant: Sources, Pathways, and Effects
2013-01-01
Mercury (Hg) is a global pollutant that affects human and ecosystem health. We synthesize understanding of sources, atmosphere-land-ocean Hg dynamics and health effects, and consider the implications of Hg-control policies. Primary anthropogenic Hg emissions greatly exceed natural geogenic sources, resulting in increases in Hg reservoirs and subsequent secondary Hg emissions that facilitate its global distribution. The ultimate fate of emitted Hg is primarily recalcitrant soil pools and deep ocean waters and sediments. Transfers of Hg emissions to largely unavailable reservoirs occur over the time scale of centuries, and are primarily mediated through atmospheric exchanges of wet/dry deposition and evasion from vegetation, soil organic matter and ocean surfaces. A key link between inorganic Hg inputs and exposure of humans and wildlife is the net production of methylmercury, which occurs mainly in reducing zones in freshwater, terrestrial, and coastal environments, and the subsurface ocean. Elevated human exposure to methylmercury primarily results from consumption of estuarine and marine fish. Developing fetuses are most at risk from this neurotoxin but health effects of highly exposed populations and wildlife are also a concern. Integration of Hg science with national and international policy efforts is needed to target efforts and evaluate efficacy. PMID:23590191
Polidori, G; Marreiro, A; Pron, H; Lestriez, P; Boyer, F C; Quinart, H; Tourbah, A; Taïar, R
2016-11-01
This article establishes the basics of a theoretical model for the constitutive law that describes the skin temperature and thermolysis heat losses undergone by a subject during a session of whole-body cryotherapy (WBC). This study focuses on the few minutes during which the human body is subjected to a thermal shock. The relationship between skin temperature and thermolysis heat losses during this period has not yet been studied in the context of the whole human body. The analytical approach here is based on the hypothesis that the skin thermal shock during a WBC session can be thermally modelled as the sum of a radiative and a free convective heat transfer function. The validation of this scientific approach and the derivation of temporal evolution laws, both for skin temperature and for dissipated thermal power during the thermal shock, open many avenues for large-scale studies aimed at proposing individualized cryotherapy protocols as well as protocols intended for target populations. Furthermore, this study quantitatively shows the substantial imbalance between human metabolism and thermolysis during WBC, the explanation of which remains an open question. Copyright © 2016 Elsevier Ltd. All rights reserved.
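A minimal sketch of the stated hypothesis, modelling the skin heat-loss flux as the sum of a radiative and a free convective term; the emissivity and convection coefficient are illustrative assumptions, not the article's fitted values:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m^-2·K^-4

def thermolysis_flux(t_skin_c, t_air_c, emissivity=0.98, h_conv=5.0):
    """Skin heat-loss flux (W/m^2) as radiative + free convective terms:
    q = eps*sigma*(Ts^4 - Ta^4) + h*(Ts - Ta), temperatures in kelvin.
    emissivity and h_conv are illustrative, not fitted, values."""
    ts, ta = t_skin_c + 273.15, t_air_c + 273.15
    radiative = emissivity * SIGMA * (ts ** 4 - ta ** 4)
    convective = h_conv * (ts - ta)
    return radiative + convective

# WBC-like conditions: warm skin exposed to a -110 °C chamber.
print(thermolysis_flux(30.0, -110.0))  # on the order of 1 kW/m^2 here
```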
Delatour, Vincent; Lalere, Beatrice; Saint-Albin, Karène; Peignaux, Maryline; Hattchouel, Jean-Marc; Dumont, Gilles; De Graeve, Jacques; Vaslin-Reimann, Sophie; Gillery, Philippe
2012-11-20
The reliability of biological tests is a major issue for patient care in terms of public health that involves high economic stakes. Reference methods, as well as regular external quality assessment schemes (EQAS), are needed to monitor the analytical performance of field methods. However, control material commutability is a major concern when assessing method accuracy. To overcome material non-commutability, we investigated the possibility of using lyophilized serum samples together with a limited number of frozen serum samples to assign matrix-corrected target values, taking the example of glucose assays. Trueness of the current glucose assays was first measured against a primary reference method using frozen human sera. Methods using hexokinase and glucose oxidase with spectroreflectometric detection proved very accurate, with bias ranging between -2.2% and +2.3%. Bias of methods using glucose oxidase with spectrophotometric detection was +4.5%. Matrix-related bias of the lyophilized materials was then determined and ranged from +2.5% to -14.4%. Matrix-corrected target values were assigned and used to assess trueness of 22 sub-peer groups. We demonstrated that matrix-corrected target values can be a valuable tool to assess field method accuracy in large-scale surveys where commutable materials are not available in sufficient amount with acceptable costs. Copyright © 2012 Elsevier B.V. All rights reserved.
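The matrix-correction logic can be sketched as follows, assuming the matrix-related bias is expressed relative to a commutable frozen serum measured by the same field method; function names and numbers are hypothetical, not the study's values:

```python
def matrix_bias(lyophilized_result, frozen_result):
    """Relative matrix-related bias of a lyophilized material versus a
    commutable frozen serum, for a given field method."""
    return (lyophilized_result - frozen_result) / frozen_result

def corrected_target(reference_value, bias):
    """Matrix-corrected target value: shift the reference-method value
    by the method's matrix-related bias."""
    return reference_value * (1.0 + bias)

# Toy numbers: a field method reads 5.20 mmol/L on lyophilized material
# versus 5.00 mmol/L on frozen serum (reference value 5.00 mmol/L).
bias = matrix_bias(lyophilized_result=5.20, frozen_result=5.00)
target = corrected_target(5.00, bias)
print(round(target, 2))  # 5.2
```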
lncRNATargets: A platform for lncRNA target prediction based on nucleic acid thermodynamics.
Hu, Ruifeng; Sun, Xiaobo
2016-08-01
Many studies have shown that long noncoding RNAs (lncRNAs) perform various functions in critical biological processes. Advanced experimental and computational technologies allow access to more information on lncRNAs. Determining the functions and action mechanisms of these RNAs on a large scale is urgently needed. We provide lncRNATargets, a web-based platform for lncRNA target prediction based on nucleic acid thermodynamics. The nearest-neighbor (NN) model was used to calculate binding free energy. The NN model for nucleic acids assumes that the identity and orientation of neighboring base pairs determine the stability of a given base pair. lncRNATargets features the following options: setting of a specific temperature, allowing use not only for humans but also for other animals or plants; high-throughput processing of all lncRNAs without RNA size limitation, which is superior to any other existing tool; and a web-based, user-friendly interface with colored result displays that allows easy access for non-expert users and provides a better understanding of the results. This technique provides accurate calculation of the binding free energy of lncRNA-target dimers to predict whether the two structures bind well together. lncRNATargets provides high-accuracy calculations, and this user-friendly program is available for free at http://www.herbbol.org:8001/lrt/ .
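A hedged sketch of the nearest-neighbor free-energy summation on which such predictions rest: duplex stability is approximated as an initiation term plus the sum of dinucleotide-step parameters. The parameter subset below follows published DNA NN tables (e.g. SantaLucia, ΔG° at 37 °C), but the initiation term and the code are illustrative, not lncRNATargets' actual implementation:

```python
# Illustrative nearest-neighbor free-energy parameters (kcal/mol) for a
# few DNA dinucleotide steps; real tables are larger and temperature-
# dependent, and RNA-RNA duplexes use different parameters.
NN_DG = {"AA": -1.00, "AT": -0.88, "TA": -0.58,
         "CG": -2.17, "GC": -2.24, "GG": -1.84}

def duplex_free_energy(seq, init=1.96):
    """Sum NN parameters over adjacent base-pair steps plus an
    initiation term; assumes a perfectly complementary duplex."""
    dg = init
    for i in range(len(seq) - 1):
        dg += NN_DG[seq[i:i + 2]]
    return dg

print(duplex_free_energy("GCGG"))  # ≈ -4.29 kcal/mol for this toy duplex
```

More negative values indicate a more stable lncRNA-target dimer, i.e. a better predicted binding.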
Sunaga, Noriaki; Shames, David S.; Girard, Luc; Peyton, Michael; Larsen, Jill E.; Imai, Hisao; Soh, Junichi; Sato, Mitsuo; Yanagitani, Noriko; Kaira, Kyoichi; Xie, Yang; Gazdar, Adi F.; Mori, Masatomo; Minna, John D.
2011-01-01
Oncogenic KRAS is found in >25% of lung adenocarcinomas, the major histologic subtype of non-small cell lung cancer (NSCLC), and is an important target for drug development. To this end, we generated four NSCLC lines with stable knockdown selective for oncogenic KRAS. As expected, stable knockdown of oncogenic KRAS led to inhibition of in vitro and in vivo tumor growth in the KRAS mutant NSCLC cells, but not in NSCLC cells that have wild-type KRAS (but mutant NRAS). Surprisingly, we did not see large-scale induction of cell death and the growth inhibitory effect was not complete. To further understand the ability of NSCLCs to grow despite selective removal of mutant KRAS expression, we performed microarray expression profiling of NSCLC cell lines with or without mutant KRAS knockdown and isogenic human bronchial epithelial cell lines (HBECs) with and without oncogenic KRAS. We found that while the MAPK pathway is significantly down-regulated after mutant KRAS knockdown, these NSCLCs showed increased levels of phospho-STAT3 and phospho-EGFR, and variable changes in phospho-Akt. In addition, mutant KRAS knockdown sensitized the NSCLCs to p38 and EGFR inhibitors. Our findings suggest that targeting oncogenic KRAS by itself will not be sufficient treatment but may offer possibilities of combining anti-KRAS strategies with other targeted drugs. PMID:21306997
Control factors and scale analysis of annual river water, sediments and carbon transport in China.
Song, Chunlin; Wang, Genxu; Sun, Xiangyang; Chang, Ruiying; Mao, Tianxu
2016-05-11
In the context of dramatic human disturbance of river systems, the processes that control the transport of water, sediment, and carbon from river basins to coastal seas are not completely understood. Here we performed a quantitative synthesis for 121 sites across China to find the control factors of annual river exports (Rc: runoff coefficient; TSSC: total suspended sediment concentration; TSSL: total suspended sediment loads; TOCL: total organic carbon loads) at different spatial scales. The results indicated that human activities such as dam construction and vegetation restoration might have a greater influence than climate on the transport of river sediment and carbon, although climate was a major driver of Rc. Multiple-spatial-scale analyses indicated that Rc increased from the small to the medium scale by 20% and then decreased at the sizeable scale by 20%. TSSC decreased from the small to the sizeable scale but increased from the sizeable to the large scale; however, TSSL significantly decreased from small (768 g·m⁻²·a⁻¹) to medium spatial scale basins (258 g·m⁻²·a⁻¹), and TOCL decreased from the medium to the large scale. Our results will improve the understanding of water, sediment, and carbon transport processes and contribute to better water and land resource management strategies at different spatial scales.
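The runoff coefficient Rc used above is simply annual runoff expressed as a fraction of annual precipitation over the basin; a trivial sketch (names and numbers are illustrative):

```python
def runoff_coefficient(runoff_mm, precipitation_mm):
    """Annual runoff coefficient Rc: the fraction of annual
    precipitation that leaves the basin as river runoff."""
    if precipitation_mm <= 0:
        raise ValueError("precipitation must be positive")
    return runoff_mm / precipitation_mm

# Illustrative basin: 320 mm of runoff from 800 mm of precipitation.
print(runoff_coefficient(320.0, 800.0))  # 0.4
```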
Revisiting the Basic Reproductive Number for Malaria and Its Implications for Malaria Control
Smith, David L; McKenzie, F. Ellis; Snow, Robert W; Hay, Simon I
2007-01-01
The prospects for the success of malaria control depend, in part, on the basic reproductive number for malaria, R0. Here, we estimate R0 in a novel way for 121 African populations, and thereby increase the number of R0 estimates for malaria by an order of magnitude. The estimates range from around one to more than 3,000. We also consider malaria transmission and control in finite human populations, of size H. We show that classic formulas approximate the expected number of mosquitoes that could trace infection back to one mosquito after one parasite generation, Z0(H), but they overestimate the expected number of infected humans per infected human, R0(H). Heterogeneous biting increases R0 and, as we show, Z0(H), but we also show that it sometimes reduces R0(H); those who are bitten most both infect many vectors and absorb infectious bites. The large range of R0 estimates strongly supports the long-held notion that malaria control presents variable challenges across its transmission spectrum. In populations where R0 is highest, malaria control will require multiple, integrated methods that target those who are bitten most. Therefore, strategic planning for malaria control should consider R0, the spatial scale of transmission, human population density, and heterogeneous biting. PMID:17311470
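One of the classic formulas alluded to is the Ross-Macdonald expression for R0 in vector-borne disease models; a hedged sketch with illustrative parameter values (these are not estimates from the paper):

```python
import math

def ross_macdonald_r0(m, a, b, c, g, n, r):
    """Classic Ross-Macdonald basic reproductive number:
      m  mosquitoes per human
      a  human-biting rate (bites per mosquito per day)
      b  mosquito-to-human transmission efficiency
      c  human-to-mosquito transmission efficiency
      g  mosquito death rate (1/day)
      n  extrinsic incubation period (days)
      r  human recovery rate (1/day)
    R0 = m * a^2 * b * c * exp(-g*n) / (r * g)."""
    return (m * a ** 2 * b * c * math.exp(-g * n)) / (r * g)

# Illustrative parameter values only.
print(ross_macdonald_r0(m=10, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01))  # ≈ 82.8
```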
An analytical approach to separate climate and human contributions to basin streamflow variability
NASA Astrophysics Data System (ADS)
Li, Changbin; Wang, Liuming; Wanrui, Wang; Qi, Jiaguo; Linshan, Yang; Zhang, Yuan; Lei, Wu; Cui, Xia; Wang, Peng
2018-04-01
Climate variability and anthropogenic regulation are two interwoven factors in the ecohydrologic system across large basins. Understanding the roles that these two factors play under various hydrologic conditions is of great significance for basin hydrology and sustainable water utilization. In this study, we present an analytical approach, based on coupling the water balance method with the Budyko hypothesis, to derive effectiveness coefficients (ECs) of climate change as a way to disentangle the contributions of climate change and human activities to the variability of river discharge under different hydro-transitional situations. The climate-dominated streamflow change (ΔQc) from the EC approach was compared with that deduced from the elasticity method and the sensitivity index. The results suggest that the EC approach is valid and applicable for hydrologic studies at the large-basin scale. Analyses of various scenarios revealed that the contributions of climate change and human activities to river discharge variation differed among the regions of the study area. Over the past several decades, climate change dominated hydro-transitions from dry to wet, while human activities played the key role in the reduction of streamflow during wet-to-dry periods. The remarkable decline of discharge upstream was mainly due to human interventions, although climate contributed more to the runoff increase during dry periods in the semi-arid downstream. The derived effectiveness coefficients indicated contribution ratios of 49% for climate and 51% for human activities at the basin scale from 1956 to 2015. This simple, mathematically derived approach, together with the case example of temporal segmentation and spatial zoning, could help in understanding the variation of river discharge in more detail at the large-basin scale against the background of climate change and human regulation.
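The elasticity method that the EC approach is compared against can be sketched as follows: the climate-driven part of a streamflow change is estimated from precipitation and potential-evaporation elasticities, and the human part is taken as the residual. Symbols and numbers are illustrative, not the study's values:

```python
def separate_contributions(q_pre, q_post, dp_ratio, de0_ratio, eps_p, eps_e0):
    """Elasticity-style separation of a streamflow change:
      dQ_climate = (eps_p * dP/P + eps_e0 * dE0/E0) * Q_pre
      dQ_human   = (Q_post - Q_pre) - dQ_climate
    eps_p / eps_e0 are precipitation / potential-evaporation
    elasticities. Returns (climate_share, human_share) of the change."""
    dq_total = q_post - q_pre
    dq_climate = (eps_p * dp_ratio + eps_e0 * de0_ratio) * q_pre
    dq_human = dq_total - dq_climate
    return dq_climate / dq_total, dq_human / dq_total

# Illustrative numbers: flow fell from 100 to 80 m^3/s while
# precipitation dropped 5% and potential evaporation rose 2%.
climate_share, human_share = separate_contributions(
    100.0, 80.0, -0.05, 0.02, eps_p=2.0, eps_e0=-1.0)
print(climate_share, human_share)  # ≈ 0.6 and 0.4
```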
ACT-Vision: active collaborative tracking for multiple PTZ cameras
NASA Astrophysics Data System (ADS)
Broaddus, Christopher; Germano, Thomas; Vandervalk, Nicholas; Divakaran, Ajay; Wu, Shunguang; Sawhney, Harpreet
2009-04-01
We describe a novel scalable approach for the management of a large number of Pan-Tilt-Zoom (PTZ) cameras deployed outdoors for persistent tracking of humans and vehicles, without resorting to the large fields of view of associated static cameras. Our system, Active Collaborative Tracking - Vision (ACT-Vision), is essentially a real-time operating system that can control hundreds of PTZ cameras to ensure uninterrupted tracking of target objects while maintaining image quality and coverage of all targets using a minimal number of sensors. The system ensures the visibility of targets between PTZ cameras by using criteria such as distance from sensor and occlusion.
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-07-08
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests and lacking support for ranking of primers, for TaqMan probes, and for simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341,963,135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
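Typical stringent single-primer filtering constraints include GC content and melting temperature; a hedged sketch using the simple Wallace rule for Tm (the thresholds and test sequence are illustrative, not MRPrimerW's actual filters):

```python
def gc_content(primer):
    """Fraction of G/C bases in the primer."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer):
    """Wallace-rule melting temperature, 2*(A+T) + 4*(G+C) in degC;
    a common first-pass estimate for short oligos."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def passes_filter(primer, gc_range=(0.40, 0.60), tm_range=(50, 65)):
    """Single-primer filter on GC content and melting temperature;
    thresholds here are illustrative defaults."""
    return (gc_range[0] <= gc_content(primer) <= gc_range[1]
            and tm_range[0] <= wallace_tm(primer) <= tm_range[1])

print(passes_filter("ATGCGTACGTTAGCCTAGGA"))  # True (50% GC, Tm 60 by Wallace rule)
```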
Kuppusamy, Kavitha T.; Jones, Daniel C.; Sperber, Henrik; Madan, Anup; Fischer, Karin A.; Rodriguez, Marita L.; Pabon, Lil; Zhu, Wei-Zhong; Tulloch, Nathaniel L.; Yang, Xiulan; Sniadecki, Nathan J.; Laflamme, Michael A.; Murry, Charles E.; Ruohola-Baker, Hannele
2015-01-01
In metazoans, transition from fetal to adult heart is accompanied by a switch in energy metabolism-glycolysis to fatty acid oxidation. The molecular factors regulating this metabolic switch remain largely unexplored. We first demonstrate that the molecular signatures in 1-year (y) matured human embryonic stem cell-derived cardiomyocytes (hESC-CMs) are similar to those seen in in vivo-derived mature cardiac tissues, thus making them an excellent model to study human cardiac maturation. We further show that let-7 is the most highly up-regulated microRNA (miRNA) family during in vitro human cardiac maturation. Gain- and loss-of-function analyses of let-7g in hESC-CMs demonstrate it is both required and sufficient for maturation, but not for early differentiation of CMs. Overexpression of let-7 family members in hESC-CMs enhances cell size, sarcomere length, force of contraction, and respiratory capacity. Interestingly, large-scale expression data, target analysis, and metabolic flux assays suggest this let-7–driven CM maturation could be a result of down-regulation of the phosphoinositide 3 kinase (PI3K)/AKT protein kinase/insulin pathway and an up-regulation of fatty acid metabolism. These results indicate let-7 is an important mediator in augmenting metabolic energetics in maturing CMs. Promoting maturation of hESC-CMs with let-7 overexpression will be highly significant for basic and applied research. PMID:25964336
NASA Astrophysics Data System (ADS)
Cozzoli, Francesco; Smolders, Sven; Eelkema, Menno; Ysebaert, Tom; Escaravage, Vincent; Temmerman, Stijn; Meire, Patrick; Herman, Peter M. J.; Bouma, Tjeerd J.
2017-01-01
Natural coastal hydrodynamics and morphology worldwide are altered by human interventions such as embankments, shipping, and dredging, which may have consequences for ecosystem functionality. Ensuring long-term ecological sustainability requires the capability to predict long-term, large-scale ecological effects of altered hydromorphology. As empirical data sets at relevant scales are missing, there is a need to integrate ecological modeling with physical modeling. This paper presents a case study showing the long-term, large-scale macrozoobenthic community response to two contrasting human alterations of the hydromorphological habitat: deepening of estuarine channels to enhance navigability (Westerschelde) versus realization of a storm surge barrier to enhance coastal safety (Oosterschelde). A multidisciplinary integration of empirical data and modeling of estuarine morphology, hydrodynamics, and benthic ecology was used to reconstruct the hydrological evolution and the resulting long-term (50-year), large-scale ecological trends for both estuaries. Our model indicated that the hydrodynamic alterations following the deepening of the Westerschelde had negative implications for benthic life, while the realization of the Oosterschelde storm surge barrier elicited mixed, habitat-dependent responses that also included unexpected improvements in environmental quality. Our analysis illustrates long-term trends in the natural community caused by opposing management strategies. The divergent human pressures on the Oosterschelde and Westerschelde are examples of what could happen in the near future to many coastal ecosystems worldwide. The comparative analysis of the two basins is a valuable source of information for understanding (and communicating) the future ecological consequences of human coastal development.
NASA Astrophysics Data System (ADS)
Brito, Joel; Freney, Evelyn; Colomb, Aurelie; Dupuy, Régis; Duplissy, Jonathan; Denjean, Cyrielle; Dominutti, Pamela; Batenburg, Anneke; Haslett, Sophie; Schulz, Christiane; Bourrianne, Thierry; Burnet, Frederic; Borbon, Agnès; Schneider, Johannes; Borrmann, Stephan; Coe, Hugh; Sellegri, Karine; Flamant, Cyrille; Knippertz, Peter; Schwarzenboeck, Alfons
2017-04-01
As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, airborne campaigns were designed to measure a large range of atmospheric constituents, focusing on improving our current understanding of the effect of anthropogenic emissions on regional climate. The targeted region, southern West Africa, currently holds a population of over 340 million people and is predicted by the United Nations to reach about 800 million by 2050. The climate in the region is characterized by a large-scale atmospheric circulation system that controls precipitation over a land area of about 6 million km², directly impacting the water resources, agriculture, and power generation of hundreds of millions of people. Besides its large natural variability, the West African monsoon system is also expected to be significantly affected by global and regional climate change, with large uncertainties on the role of local pollution. An important aspect of assessing the impact of human activities on the local climate is therefore understanding aerosol sources and properties. The presented study details results of the DACCIWA measurement campaign using the French ATR42 research aircraft, which, in combination with the German Falcon 20 and British Twin Otter aircraft, aimed to characterize the physico-chemical properties of aerosols in the region using a suite of aerosol measurement techniques (e.g., C-TOF AMS, APITOF, SMPS) and supporting information from simultaneous trace gas measurements (e.g., PTRMS). This large dataset has been used to assess how anthropogenic emissions (NOx, SO2, SO4) impact the formation of biogenic secondary organic aerosol, in particular through the formation of isoprene epoxydiols (IEPOX). The recently collected data will certainly help in understanding the coupling between human activities and regional climate in a sensitive, highly populated area.
Biological Concerns on the Selection of Animal Models for Teratogenic Testing.
Alves-Pimenta, Sofia; Colaço, Bruno; Oliveira, Paula A; Venâncio, Carlos
2018-01-01
During pregnancy, the fetus can be exposed to a variety of chemicals that may induce abortion and malformations. Given the number of new substances coming onto the market every year, there is high demand for a rapid, reliable, and cost-effective method to detect potential toxicity. Different species have been used as animal models for teratogen screening, most of them sharing similar developmental processes with humans. However, the application of embryology knowledge to teratology is hampered by the complexity of reproductive processes. The present chapter outlines the essential developmental periods in different models and highlights the similarities and differences between species, the advantages and disadvantages of each group, and specific sensitivities for teratogenic tests. These models can be organized into the following categories: (1) invertebrate species such as Caenorhabditis elegans and Drosophila melanogaster, which have become ideal for screening simple mechanisms in the early periods of the reproductive cycle, allowing for rapid results and raising minor ethical concerns; (2) vertebrate nonmammalian species such as Xenopus laevis and Danio rerio, important models to assess teratogenic potential in later development with fewer ethical requirements; and (3) the mammalian species Mus musculus, Rattus norvegicus, and Oryctolagus cuniculus, phylogenetically closer to humans and essential to assess complex specialized processes that occur later in development. Rules for developmental toxicology tests require the use of mammalian species. However, ethical concerns and costs limit their use in large-scale screening. By contrast, invertebrate and vertebrate nonmammalian species are increasingly used as alternative animal models, as these organisms combine fewer ethical requirements, low costs, and culture conditions compatible with large-scale screening.
In contrast to in vitro techniques, their main advantage is to allow high-throughput screening in a whole-animal context that does not depend on the prior identification of a target. In this chapter, the biological development of the animals most used in teratogenic tests is addressed with the aims of maximizing translation to humans, reducing the number of animals used, and shortening the time to market for new drugs.
Decadal opportunities for space architects
NASA Astrophysics Data System (ADS)
Sherwood, Brent
2012-12-01
A significant challenge for the new field of space architecture is the dearth of project opportunities. Yet every year more young professionals express interest to enter the field. This paper derives projections that bound the number, type, and range of global development opportunities that may be reasonably expected over the next few decades for human space flight (HSF) systems so those interested in the field can benchmark their goals. Four categories of HSF activity are described: human Exploration of solar system bodies; human Servicing of space-based assets; large-scale development of space Resources; and Breakout of self-sustaining human societies into the solar system. A progressive sequence of capabilities for each category starts with its earliest feasible missions and leads toward its full expression. The four sequences are compared in scale, distance from Earth, and readiness. Scenarios hybridize the most synergistic features from the four sequences for comparison to status quo, government-funded HSF program plans. Finally qualitative, decadal, order-of-magnitude estimates are derived for system development needs, and hence opportunities for space architects. Government investment towards human planetary exploration is the weakest generator of space architecture work. Conversely, the strongest generator is a combination of three market drivers: (1) commercial passenger travel in low Earth orbit; (2) in parallel, government extension of HSF capability to GEO; both followed by (3) scale-up demonstration of end-to-end solar power satellites in GEO. The rich end of this scale affords space architecture opportunities which are more diverse, complex, large-scale, and sociologically challenging than traditional exploration vehicle cabins and habitats.
Egas-Bejar, Daniela; Anderson, Pete M; Agarwal, Rishi; Corrales-Medina, Fernando; Devarajan, Eswaran; Huh, Winston W; Brown, Robert E; Subbiah, Vivek
2014-03-12
The survival of patients with advanced osteosarcoma is poor, with limited therapeutic options. There is an urgent need for new targeted therapies based on biomarkers. Recently, theranostic molecular profiling services for cancer patients offered by CLIA-certified commercial companies, as well as in-house profiling in academic medical centers, have expanded exponentially. We evaluated molecular profiles of patients with advanced osteosarcoma whose tumor tissue had been analyzed by one of the following methods: (1) 182-gene next-generation exome sequencing (Foundation Medicine, Boston, MA); (2) immunohistochemistry (IHC)/PCR-based panel (CARIS Target Now, Irving, TX); (3) comparative genome hybridization (Oncopath, San Antonio, TX); (4) single-gene PCR assays and PTEN IHC (MDACC CLIA); (5) UT Houston morphoproteomics (Houston, TX). The most common actionable aberrations occur in the PI3K/PTEN/mTOR pathway. No patterns in genomic alterations beyond the above were readily identifiable, suggesting both high molecular diversity in osteosarcoma and the need for more analyses to define distinct subgroups of osteosarcoma defined by genomic alterations. Based on our preliminary observations, we hypothesize that the biology of aggressive, metastatic osteosarcoma at the molecular level is similar to human fingerprints, in that no two tumors are identical. Further large-scale analyses of osteosarcoma samples are warranted to test this hypothesis.
Expediting SRM assay development for large-scale targeted proteomics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Shi, Tujin; Brown, Joseph N.
2014-08-22
Due to their high sensitivity and specificity, targeted proteomics measurements, e.g. selected reaction monitoring (SRM), are becoming increasingly popular for biological and translational applications. Selection of optimal transitions and optimization of collision energy (CE) are important assay development steps for achieving sensitive detection and accurate quantification; however, these steps can be labor-intensive, especially for large-scale applications. Herein, we explored several options for accelerating SRM assay development, evaluated in the context of a relatively large set of 215 synthetic peptide targets. We first showed that HCD fragmentation is very similar to CID in triple quadrupole (QQQ) instrumentation, and that by selecting the top six y fragment ions from HCD spectra, >86% of the top transitions optimized by direct infusion on a QQQ instrument are covered. We also demonstrated that the CE calculated by existing prediction tools was less accurate for +3 precursors, and that a significant increase in intensity for transitions could be obtained using a new CE prediction equation constructed from the present experimental data. Overall, our study illustrates the feasibility of expediting the development of large numbers of high-sensitivity SRM assays through automation of transition selection and accurate prediction of optimal CE to improve both SRM throughput and measurement quality.
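CE prediction equations of the kind the abstract describes are typically linear in precursor m/z, with separate coefficients per charge state. A minimal sketch, with purely illustrative placeholder coefficients (not the equation fitted in the study):

```python
# Illustrative linear collision-energy (CE) model: CE = slope * (m/z) + intercept,
# with charge-state-specific coefficients. The numbers below are hypothetical
# placeholders to show the shape of such an equation, not fitted values.
COEFFS = {2: (0.034, 3.3), 3: (0.044, 5.5)}

def predict_ce(mz: float, charge: int) -> float:
    """Predict collision energy for a precursor of given m/z and charge."""
    slope, intercept = COEFFS[charge]
    return slope * mz + intercept

print(round(predict_ce(500.0, 2), 1))  # 2+ precursor at m/z 500 -> 20.3
print(round(predict_ce(500.0, 3), 1))  # 3+ precursors use their own fit -> 27.5
```

The abstract's point is exactly that the +3 coefficients must be fitted separately; a single generic equation underestimates optimal CE for triply charged precursors.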
TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics
Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-01-01
Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
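The core idea of cross-run retention-time (RT) alignment can be sketched far more simply than TRIC's graph-based approach: peptides confidently identified in two runs serve as anchors, and other peaks are mapped between runs by interpolating between those anchors. This toy version uses piecewise-linear interpolation and is an assumption-laden simplification, not TRIC's actual algorithm:

```python
# Minimal sketch of anchor-based retention-time alignment between two
# LC-MS/MS runs. Shared "anchor" peptides give (rt_target, rt_reference)
# pairs; other retention times are corrected by piecewise-linear
# interpolation between anchors (TRIC itself uses a non-linear,
# graph-based strategy across many runs).
from bisect import bisect_right

def make_rt_map(anchors):
    """anchors: sorted (rt_in_target_run, rt_in_reference_run) pairs."""
    xs = [a[0] for a in anchors]
    ys = [a[1] for a in anchors]
    def to_reference(rt):
        i = bisect_right(xs, rt)
        i = min(max(i, 1), len(xs) - 1)  # clamp to an interior segment
        x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
        return y0 + (rt - x0) * (y1 - y0) / (x1 - x0)
    return to_reference

anchors = [(10.0, 12.0), (20.0, 21.0), (40.0, 39.0)]
align = make_rt_map(anchors)
print(align(15.0))  # a peak between the first two anchors -> 16.5
```

The non-linearity matters: a single global shift or linear fit cannot capture the chromatographic drift that varies along the gradient, which is why segment-wise (or spline/LOESS) correction is used in practice.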
Scaling identity connects human mobility and social interactions.
Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D; Barabási, Albert-László; Wang, Dashun
2016-06-28
Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality.
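The "critical exponents" in question are power-law exponents of spatial cost functions, e.g. P(d) ~ d^(-alpha). As a hedged illustration of how such an exponent is estimated, here is a least-squares fit in log-log space on synthetic data (the paper's exponents come from mobile-phone records, not from a toy fit like this):

```python
# Sketch: estimate the exponent alpha of a scaling law P(d) ~ d**(-alpha)
# by ordinary least squares on log-transformed data. Synthetic input only.
import math

def fit_exponent(distances, probs):
    xs = [math.log(d) for d in distances]
    ys = [math.log(p) for p in probs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # the estimated alpha

ds = [1, 2, 4, 8, 16]
ps = [d ** -1.5 for d in ds]  # exact power law with alpha = 1.5
print(round(fit_exponent(ds, ps), 3))
```

Real distance distributions are noisy and binned, so maximum-likelihood estimators are usually preferred over naive log-log regression; the sketch only conveys what the exponent means.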
Scaling identity connects human mobility and social interactions
Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D.; Barabási, Albert-László; Wang, Dashun
2016-01-01
Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality. PMID:27274050
International law poses problems for negative emissions research
NASA Astrophysics Data System (ADS)
Brent, Kerryn; McGee, Jeffrey; McDonald, Jan; Rohling, Eelco J.
2018-06-01
New international governance arrangements that manage environmental risk and potential conflicts of interests are needed to facilitate negative emissions research that is essential to achieving the large-scale CO2 removal implied by the Paris Agreement targets.
Mao, Mei; Zhou, Binbin; Tang, Xianghu; Chen, Cheng; Ge, Meihong; Li, Pan; Huang, Xingjiu; Yang, Liangbao; Liu, Jinhuai
2018-03-15
Liquid interfacial self-assembly of metal nanoparticles holds great promise for its various applications, such as in tunable optical devices, plasmonics, sensors, and catalysis. However, the construction of large-area, ordered, anisotropic nanoparticle monolayers and the acquisition of self-assembled interface films are still significant challenges. Herein, a rapid, validated method to fabricate large-scale, close-packed nanomaterials at the cyclohexane/water interface is reported, in which hydrophilic cetyltrimethylammonium bromide-coated nanoparticles and gold nanorods (AuNRs) self-assemble into densely packed 2D arrays by regulating the surface ligand and a suitable inducer. Decorating AuNRs with polyvinylpyrrolidone not only extensively decreases the charge of the AuNRs, but also diminishes repulsive forces. More importantly, a general, facile, novel technique to transfer an interfacial monolayer through a designed in situ reaction cell linked to a microfluidic chip is revealed. The self-assembled nanofilm can then automatically settle on the substrate and be directly detected in the reaction cell in situ by means of a portable Raman spectrometer. Moreover, a close-packed monolayer of self-assembled AuNRs provides massive, efficient hotspots to create great surface-enhanced Raman scattering (SERS) enhancement, which provides high sensitivity and reproducibility as the SERS-active substrate. Furthermore, this strategy was exploited to detect drug molecules in human urine, with cyclohexane-extracted targets acting as the oil phase to form an oil/water interface. A portable Raman spectrometer was employed to detect methamphetamine down to 100 ppb levels in human urine, exhibiting excellent practicability. As a universal platform, handy tool, and fast pretreatment method with a good capability for drug detection in biological systems, this technique shows great promise for rapid, credible, on-the-spot drug detection.
[Genome editing of industrial microorganism].
Zhu, Linjiang; Li, Qi
2015-03-01
Genome editing is defined as the highly effective and precise modification of a cellular genome on a large scale. In recent years, such genome-editing methods have been rapidly developed for industrial strain improvement. These quickly evolving methods have thoroughly changed the old, inefficient mode of genetic modification: "one modification, one selection marker, one target site". Highly effective genome-editing approaches have been developed, including simultaneous modification of multiple genes; efficient insertion, replacement, and deletion of target genes at the genome scale; and cut-and-paste of large DNA fragments. These new tools for microbial genome editing will certainly be applied widely, increasing the efficiency of industrial strain improvement and promoting both the transformation of the traditional fermentation industry and the rapid development of novel industrial biotechnology, such as the production of biofuels and biomaterials. This review summarizes the technological principles of these genome-editing methods and their applications, which can benefit the engineering and construction of industrial microorganisms.
Inferring personal economic status from social network location
NASA Astrophysics Data System (ADS)
Luo, Shaojun; Morone, Flaviano; Sarraute, Carlos; Travizano, Matías; Makse, Hernán A.
2017-05-01
It is commonly believed that patterns of social ties affect individuals' economic status. Here we translate this concept into an operational definition at the network level, which allows us to infer the economic well-being of individuals through a measure of their location and influence in the social network. We analyse two large-scale sources: telecommunications and financial data of a whole country's population. Our results show that an individual's location, measured as the optimal collective influence to the structural integrity of the social network, is highly correlated with personal economic status. The observed social network patterns of influence mimic the patterns of economic inequality. For pragmatic use and validation, we carry out a marketing campaign that shows a threefold increase in response rate by targeting individuals identified by our social network metrics as compared to random targeting. Our strategy can also be useful in maximizing the effects of large-scale economic stimulus policies.
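The "collective influence" (CI) measure referenced here scores a node by its own degree and the degrees on the boundary of a ball around it, CI_l(i) = (k_i - 1) * sum over boundary nodes j of (k_j - 1). A minimal sketch with radius l = 1, where the boundary is simply the neighbours (a simplified proxy, not the production algorithm used on country-scale data):

```python
# Toy collective-influence score with ball radius l = 1:
# CI_1(i) = (k_i - 1) * sum_{j in neighbours(i)} (k_j - 1).
def collective_influence(adj, node):
    k = {v: len(ns) for v, ns in adj.items()}  # node degrees
    return (k[node] - 1) * sum(k[j] - 1 for j in adj[node])

# Small undirected graph: node 0 bridges a triangle and a pendant chain.
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [3]}
scores = {v: collective_influence(adj, v) for v in adj}
print(max(scores, key=scores.get))  # the bridging hub ranks highest -> 0
```

Ranking customers by such a structural score, rather than by degree alone, is what made the targeted marketing campaign outperform random targeting in the study.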
Inferring personal economic status from social network location.
Luo, Shaojun; Morone, Flaviano; Sarraute, Carlos; Travizano, Matías; Makse, Hernán A
2017-05-16
It is commonly believed that patterns of social ties affect individuals' economic status. Here we translate this concept into an operational definition at the network level, which allows us to infer the economic well-being of individuals through a measure of their location and influence in the social network. We analyse two large-scale sources: telecommunications and financial data of a whole country's population. Our results show that an individual's location, measured as the optimal collective influence to the structural integrity of the social network, is highly correlated with personal economic status. The observed social network patterns of influence mimic the patterns of economic inequality. For pragmatic use and validation, we carry out a marketing campaign that shows a threefold increase in response rate by targeting individuals identified by our social network metrics as compared to random targeting. Our strategy can also be useful in maximizing the effects of large-scale economic stimulus policies.
Teaching Real Science with a Microcomputer.
ERIC Educational Resources Information Center
Naiman, Adeline
1983-01-01
Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology,""Health Awareness…
Construction and Application of a Refined Hospital Management Chain.
Yi, Lihua; Hao, Aimin; Hu, Minmin; Huang, Pei; Yuan, Huikang; Xing, Ming
2015-05-01
Gaining large-scale success was quite common for hospitals in China during the later period of industrialization. Today, Chinese hospital management faces such problems as service inefficiency, high human-resource costs, and a low rate of capital use. This study analyzes the refined management chain of the Wuxi No. 2 People's Hospital. It consists of six gears, namely organizational structure, clinical practice, outpatient service, medical technology, nursing care, and logistics, used to achieve maximum scale and benefits. The gears are based on flat management system targets, a chief of medical staff, centralized outpatient service, intensified medical examinations, vertical nursing management, and socialized logistics. The hospital took innovative measures: the "one doctor-one patient-one clinic" model was well accepted; the "one dispensary" scheme shortened waiting times by 20 min; the "168" rear-service hotline made patients' lives easier; and a red wrist ribbon for seriously ill patients was implemented to prioritize medical treatment. The core concepts of refined hospital management are optimizing flow processes, reducing waste, improving efficiency, saving costs, and, most important, taking good care of patients.
Improving measurement technology for the design of sustainable cities
NASA Astrophysics Data System (ADS)
Pardyjak, Eric R.; Stoll, Rob
2017-09-01
This review identifies and discusses measurement technology gaps that are currently preventing major science leaps from being realized in the study of urban environmental transport processes. These scientific advances are necessary to better understand the links between atmospheric transport processes in the urban environment, human activities, and potential management strategies. We propose that with various improved and targeted measurements, it will be possible to provide technically sound guidance to policy and decision makers for the design of sustainable cities. This review focuses on full-scale in situ and remotely sensed measurements of atmospheric winds, temperature, and humidity in cities and links measurements to current modeling and simulation needs. A key conclusion of this review is that there is a need for urban-specific measurement techniques including measurements of highly-resolved three-dimensional fields at sampling frequencies high enough to capture small-scale turbulence processes yet also capable of covering spatial extents large enough to simultaneously capture key features of urban heterogeneity and boundary layer processes while also supporting the validation of current and emerging modeling capabilities.
Lifestyle precision medicine: the next generation in type 2 diabetes prevention?
Mutie, Pascal M; Giordano, Giuseppe N; Franks, Paul W
2017-09-22
The driving force behind the current global type 2 diabetes epidemic is insulin resistance in overweight and obese individuals. Dietary factors, physical inactivity, and sedentary behaviors are the major modifiable risk factors for obesity. Nevertheless, many overweight/obese people do not develop diabetes and lifestyle interventions focused on weight loss and diabetes prevention are often ineffective. Traditionally, chronically elevated blood glucose concentrations have been the hallmark of diabetes; however, many individuals will either remain 'prediabetic' or regress to normoglycemia. Thus, there is a growing need for innovative strategies to tackle diabetes at scale. The emergence of biomarker technologies has allowed more targeted therapeutic strategies for diabetes prevention (precision medicine), though largely confined to pharmacotherapy. Unlike most drugs, lifestyle interventions often have systemic health-enhancing effects. Thus, the pursuance of lifestyle precision medicine in diabetes seems rational. Herein, we review the literature on lifestyle interventions and diabetes prevention, describing the biological systems that can be characterized at scale in human populations, linking them to lifestyle in diabetes, and consider some of the challenges impeding the clinical translation of lifestyle precision medicine.
Library Resources for Bac End Sequencing. Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pieter J. de Jong
2000-10-01
Studies directed towards the specific aims outlined for this research award are summarized. The RPCI-11 human BAC library has been expanded by the addition of 6.9-fold genomic coverage. This segment has been generated from an MboI partial digest of the same anonymous donor DNA used for the rest of the library. A new cloning vector, pTARBAC1, has been constructed and used in the construction of RPCI-11 segment 5. This new cloning vector provides a new strategy for identifying targeted genomic regions and will greatly facilitate large-scale analysis for positional cloning. A new male C57BL/6J mouse BAC library, RPCI-23, has been constructed; it contains 576 plates (approximately 210,000 clones) and represents approximately 11-fold coverage of the mouse genome.
Hacker, David L; Bertschinger, Martin; Baldi, Lucia; Wurm, Florian M
2004-10-27
Human embryonic kidney 293 (HEK293) cells, a widely used host for large-scale transient expression of recombinant proteins, are transformed with the adenovirus E1A and E1B genes. Because the E1A proteins function as transcriptional activators or repressors, they may have a positive or negative effect on transient transgene expression in this cell line. Suspension cultures of HEK293 EBNA (HEK293E) cells were co-transfected with a reporter plasmid expressing the GFP gene and a plasmid expressing a short hairpin RNA (shRNA) targeting the E1A mRNAs for degradation by RNA interference (RNAi). The presence of the shRNA in HEK293E cells reduced the steady state level of E1A mRNA up to 75% and increased transient GFP expression from either the elongation factor-1alpha (EF-1alpha) promoter or the human cytomegalovirus (HCMV) immediate early promoter up to twofold. E1A mRNA depletion also resulted in a twofold increase in transient expression of a recombinant IgG in both small- and large-scale suspension cultures when the IgG light and heavy chain genes were controlled by the EF-1alpha promoter. Finally, transient IgG expression was enhanced 2.5-fold when the anti-E1A shRNA was expressed from the same vector as the IgG light chain gene. These results demonstrated that E1A has a negative effect on transient gene expression in HEK293E cells, and they established that RNAi can be used to enhance recombinant protein expression in mammalian cells.
Serological approaches for the diagnosis of schistosomiasis - A review.
Hinz, Rebecca; Schwarz, Norbert G; Hahn, Andreas; Frickmann, Hagen
2017-02-01
Schistosomiasis is a common disease in endemic areas of Sub-Saharan Africa, South America and Asia. It is rare in Europe, mainly imported from endemic countries due to travelling or human migration. Available methods for the diagnosis of schistosomiasis comprise microscopic, molecular and serological approaches, with the latter detecting antigens or antibodies associated with Schistosoma spp. infection. The serological approach is a valuable screening tool in low-endemicity settings and for travel medicine, though the interpretation of any diagnostic results requires knowledge of test characteristics and a patient's history. Specific antibody detection by most currently used assays is only possible in a relatively late stage of infection and does not allow for the differentiation of acute from previous infections for therapeutic control or the discrimination between persisting infection and re-infection. Throughout the last decades, new target antigens have been identified, and assays with improved performance and suitability for use in the field have been developed. For numerous assays, large-scale studies are still required to reliably characterise assay characteristics alone and in association with other available methods for the diagnosis of schistosomiasis. Apart from S. mansoni, S. haematobium and S. japonicum, for which most available tests were developed, other species of Schistosoma that occur less frequently need to be taken into account. This narrative review describes and critically discusses the results of published studies on the evaluation of serological assays that detect antibodies against different Schistosoma species of humans. It provides insights into the diagnostic performance and an overview of available assays and their suitability for large-scale use or individual diagnosis, and thus sets the scene for serological diagnosis of schistosomiasis and the interpretation of results.
Patterson, Kelcey G.; Dixon Pittaro, Jennifer L.; Bastedo, Peter S.; Hess, David A.; Haeryfar, S. M. Mansour; McCormick, John K.
2014-01-01
Superantigens (SAgs) are microbial toxins that cross-link T cell receptors with major histocompatibility class II (MHC-II) molecules leading to the activation of large numbers of T cells. Herein, we describe the development and preclinical testing of a novel tumor-targeted SAg (TTS) therapeutic built using the streptococcal pyrogenic exotoxin C (SpeC) SAg and targeting cancer cells expressing the 5T4 tumor-associated antigen (TAA). To inhibit potentially harmful widespread immune cell activation, a SpeC mutation within the high-affinity MHC-II binding interface was generated (SpeCD203A) that demonstrated a pronounced reduction in mitogenic activity, yet this mutant could still induce immune cell-mediated cancer cell death in vitro. To target 5T4+ cancer cells, we engineered a humanized single chain variable fragment (scFv) antibody to recognize 5T4 (scFv5T4). Specific targeting of scFv5T4 was verified. SpeCD203A fused to scFv5T4 maintained the ability to activate and induce immune cell-mediated cytotoxicity of colorectal cancer cells. Using a xenograft model of established human colon cancer, we demonstrated that the SpeC-based TTS was able to control the growth and spread of large tumors in vivo. This required both TAA targeting by scFv5T4 and functional SAg activity. These studies lay the foundation for the development of streptococcal SAgs as ‘next-generation’ TTSs for cancer immunotherapy. PMID:24736661
War, space, and the evolution of Old World complex societies.
Turchin, Peter; Currie, Thomas E; Turner, Edward A L; Gavrilets, Sergey
2013-10-08
How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies-primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita.
War, space, and the evolution of Old World complex societies
Turchin, Peter; Currie, Thomas E.; Turner, Edward A. L.; Gavrilets, Sergey
2013-01-01
How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies—primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita. PMID:24062433
A new way to protect privacy in large-scale genome-wide association studies.
Kamm, Liina; Bogdanov, Dan; Laur, Sven; Vilo, Jaak
2013-04-01
Increased availability of various genotyping techniques has initiated a race for finding genetic markers that can be used in diagnostics and personalized medicine. Although many genetic risk factors are known, key causes of common diseases with complex heritage patterns are still unknown. Identification of such complex traits requires a targeted study over a large collection of data. Ideally, such studies bring together data from many biobanks. However, data aggregation on such a large scale raises many privacy issues. We show how to conduct such studies without violating privacy of individual donors and without leaking the data to third parties. The presented solution has provable security guarantees. Supplementary data are available at Bioinformatics online.
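One standard building block behind this kind of privacy-preserving multi-biobank analysis is additive secret sharing: each data holder splits its local value into random shares so that no single compute party ever sees a raw value, yet the shares still sum to the true aggregate. A hedged toy sketch (the parameters are hypothetical, and the paper's actual protocol carries provable security guarantees that this illustration does not):

```python
# Additive secret sharing over a prime modulus: each biobank splits a
# local count into n random shares; any n-1 shares reveal nothing, but
# all shares together reconstruct the value, so sums can be computed
# without exposing individual inputs.
import random

P = 2_147_483_647  # a Mersenne prime used as the share modulus (illustrative)

def share(value, n_parties, rng):
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)  # last share makes the sum work
    return shares

rng = random.Random(42)
counts = [17, 5, 30]  # allele counts held privately by three biobanks
per_party = [share(c, 3, rng) for c in counts]
# Each compute party locally sums the shares it received; the final
# aggregate is recovered only when the partial sums are recombined.
party_sums = [sum(col) % P for col in zip(*per_party)]
print(sum(party_sums) % P)  # -> 52, with no single count ever exposed
```

Real deployments layer authenticated channels and malicious-security checks on top of this arithmetic; the sketch only shows why aggregation can proceed without any party holding the plaintext data.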
The loss-of-allele assay for ES cell screening and mouse genotyping.
Frendewey, David; Chernomorsky, Rostislav; Esau, Lakeisha; Om, Jinsop; Xue, Yingzi; Murphy, Andrew J; Yancopoulos, George D; Valenzuela, David M
2010-01-01
Targeting vectors used to create directed mutations in mouse embryonic stem (ES) cells consist, in their simplest form, of a gene for drug selection flanked by mouse genomic sequences, the so-called homology arms that promote site-directed homologous recombination between the vector and the target gene. The VelociGene method for the creation of targeted mutations in ES cells employs targeting vectors, called BACVecs, that are based on bacterial artificial chromosomes. Compared with conventional short targeting vectors, BACVecs provide two major advantages: (1) their much larger homology arms promote high targeting efficiencies without the need for isogenicity or negative selection strategies; and (2) they enable deletions and insertions of up to 100 kb in a single targeting event, making possible definitive, gene-ablating null alleles and other large-scale genomic modifications. Because of their large arm sizes, however, BACVecs do not permit screening by conventional assays, such as long-range PCR or Southern blotting, that link the inserted targeting vector to the targeted locus. To exploit the advantages of BACVecs for gene targeting, we inverted the conventional screening logic in developing the loss-of-allele (LOA) assay, which quantifies the number of copies of the native locus to which the mutation was directed. In a correctly targeted ES cell clone, the LOA assay detects one of the two native alleles (for genes not on the X or Y chromosome), the other allele being disrupted by the targeted modification. We apply the same principle in reverse as a gain-of-allele assay to quantify the copy number of the inserted targeting vector. The LOA assay reveals a correctly targeted clone as having lost one copy of the native target gene and gained one copy of the drug resistance gene or other inserted marker. The combination of these quantitative assays makes LOA genotyping unequivocal and amenable to automated scoring.
We use the quantitative polymerase chain reaction (qPCR) as our method of allele quantification, but any method that can reliably distinguish the difference between one and two copies of the target gene can be used to develop an LOA assay. We have designed qPCR LOA assays for deletions, insertions, point mutations, domain swaps, conditional, and humanized alleles and have used the insert assays to quantify the copy number of random insertion BAC transgenics. Because of its quantitative precision, specificity, and compatibility with high throughput robotic operations, the LOA assay eliminates bottlenecks in ES cell screening and mouse genotyping and facilitates maximal speed and throughput for knockout mouse production. Copyright (c) 2010 Elsevier Inc. All rights reserved.
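The LOA/gain-of-allele decision logic described above reduces to comparing two copy-number estimates per clone. The sketch below is an illustrative reconstruction, not the VelociGene production pipeline; the function name, tolerance, and class labels are hypothetical:

```python
def call_genotype(native_copies, insert_copies, tol=0.3):
    """Classify an ES cell clone from qPCR copy-number estimates.

    A correctly targeted clone loses one of two native alleles (LOA)
    and gains one copy of the inserted selection cassette (gain-of-allele).
    `tol` is an illustrative tolerance around the integer copy numbers.
    """
    def near(x, n):
        return abs(x - n) <= tol

    if near(native_copies, 1) and near(insert_copies, 1):
        return "targeted"          # one native allele lost, one cassette gained
    if near(native_copies, 2) and near(insert_copies, 0):
        return "wild-type"         # vector never integrated
    if near(native_copies, 2) and insert_copies >= 1 - tol:
        return "random-insertion"  # cassette present but target locus intact
    return "ambiguous"             # retest the clone
```

The combination matters: the native-locus count alone cannot distinguish a wild-type clone from one carrying a random insertion, which is why the quantitative insert assay is run alongside the LOA assay.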
Family and Human Development across Cultures: A View from the Other Side.
ERIC Educational Resources Information Center
Kagitcibasi, Cigdem
Using a contextual-developmental-functional approach, this book seeks to discover the functional links between family dynamics and socialization within varying sociocultural contexts to human development, and to integrate theory and application in large-scale interventions promoting human well-being and societal development in the Majority World.…
Automated Essay Scoring versus Human Scoring: A Correlational Study
ERIC Educational Resources Information Center
Wang, Jinhao; Brown, Michelle Stallone
2008-01-01
The purpose of the current study was to analyze the relationship between automated essay scoring (AES) and human scoring in order to determine the validity and usefulness of AES for large-scale placement tests. Specifically, a correlational research design was used to examine the correlations between AES performance and human raters' performance.…
Murphy, Patricia; Kabir, Md Humayun; Srivastava, Tarini; Mason, Michele E.; Dewi, Chitra U.; Lim, Seakcheng; Yang, Andrian; Djordjevic, Djordje; Killingsworth, Murray C.; Ho, Joshua W. K.; Harman, David G.
2018-01-01
Cataracts cause vision loss and blindness by impairing the ability of the ocular lens to focus light onto the retina. Various cataract risk factors have been identified, including drug treatments, age, smoking and diabetes. However, the molecular events responsible for these different forms of cataract are ill-defined, and the advent of modern cataract surgery in the 1960s virtually eliminated access to human lenses for research. Here, we demonstrate large-scale production of light-focusing human micro-lenses from spheroidal masses of human lens epithelial cells purified from differentiating pluripotent stem cells. The purified lens cells and micro-lenses display similar morphology, cellular arrangement, mRNA expression and protein expression to human lens cells and lenses. Exposing the micro-lenses to the emergent cystic fibrosis drug Vx-770 reduces micro-lens transparency and focusing ability. These human micro-lenses provide a powerful and large-scale platform for defining molecular disease mechanisms caused by cataract risk factors, for anti-cataract drug screening and for clinically relevant toxicity assays. PMID:29217756
Murphy, Patricia; Kabir, Md Humayun; Srivastava, Tarini; Mason, Michele E; Dewi, Chitra U; Lim, Seakcheng; Yang, Andrian; Djordjevic, Djordje; Killingsworth, Murray C; Ho, Joshua W K; Harman, David G; O'Connor, Michael D
2018-01-09
Cataracts cause vision loss and blindness by impairing the ability of the ocular lens to focus light onto the retina. Various cataract risk factors have been identified, including drug treatments, age, smoking and diabetes. However, the molecular events responsible for these different forms of cataract are ill-defined, and the advent of modern cataract surgery in the 1960s virtually eliminated access to human lenses for research. Here, we demonstrate large-scale production of light-focusing human micro-lenses from spheroidal masses of human lens epithelial cells purified from differentiating pluripotent stem cells. The purified lens cells and micro-lenses display similar morphology, cellular arrangement, mRNA expression and protein expression to human lens cells and lenses. Exposing the micro-lenses to the emergent cystic fibrosis drug Vx-770 reduces micro-lens transparency and focusing ability. These human micro-lenses provide a powerful and large-scale platform for defining molecular disease mechanisms caused by cataract risk factors, for anti-cataract drug screening and for clinically relevant toxicity assays. © 2018. Published by The Company of Biologists Ltd.
Letzel, Thomas; Bayer, Anne; Schulz, Wolfgang; Heermann, Alexandra; Lucke, Thomas; Greco, Giorgia; Grosse, Sylvia; Schüssler, Walter; Sengl, Manfred; Letzel, Marion
2015-10-01
A large number of anthropogenic trace contaminants such as pharmaceuticals, their human metabolites and further transformation products (TPs) enter wastewater treatment plants on a daily basis. A mixture of known, expected, and unknown molecules are discharged into the receiving aquatic environment because only partial elimination occurs for many of these chemicals during physical, biological and chemical treatment processes. In this study, an array of LC-MS methods from three collaborating laboratories was applied to detect and identify anthropogenic trace contaminants and their TPs in different waters. Starting with theoretical predictions of TPs, an efficient workflow using the combination of target, suspected-target and non-target strategies for the identification of these TPs in the environment was developed. These techniques and strategies were applied to study anti-hypertensive drugs from the sartan group (i.e., candesartan, eprosartan, irbesartan, olmesartan, and valsartan). Degradation experiments were performed in lab-scale wastewater treatment plants, and a screening workflow including an inter-laboratory approach was used for the identification of transformation products in the effluent samples. Subsequently, newly identified compounds were successfully analyzed in effluents of real wastewater treatment plants and river waters. Copyright © 2015 Elsevier Ltd. All rights reserved.
(New hosts and vectors for genome cloning)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The main goal of our project remains the development of new bacterial hosts and vectors for the stable propagation of human DNA clones in E. coli. During the past six months of our current budget period, we have (1) continued to develop new hosts that permit the stable maintenance of unstable features of human DNA, and (2) developed a series of vectors for (a) cloning large DNA inserts, (b) assessing the frequency of human sequences that are lethal to the growth of E. coli, and (c) assessing the stability of human sequences cloned in M13 for large-scale sequencing projects.
[New hosts and vectors for genome cloning]. Progress report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The main goal of our project remains the development of new bacterial hosts and vectors for the stable propagation of human DNA clones in E. coli. During the past six months of our current budget period, we have (1) continued to develop new hosts that permit the stable maintenance of unstable features of human DNA, and (2) developed a series of vectors for (a) cloning large DNA inserts, (b) assessing the frequency of human sequences that are lethal to the growth of E. coli, and (c) assessing the stability of human sequences cloned in M13 for large-scale sequencing projects.
Collins, Jeffrey M; Hunter, Mary; Gordon, Wanda; Kempker, Russell R; Blumberg, Henry M; Ray, Susan M
2018-06-01
Following large declines in tuberculosis transmission in the United States, large-scale screening programs targeting low-risk healthcare workers are increasingly a source of false-positive results. We report a large cluster of presumed false-positive tuberculin skin test results in healthcare workers following a change to 50-dose vials of Tubersol tuberculin. Infect Control Hosp Epidemiol 2018;39:750-752.
Rodríguez-Gómez, Francisco; Romero-Gil, Verónica; Arroyo-López, Francisco N; Roldán-Reyes, Juan C; Torres-Gallardo, Rosa; Bautista-Gallego, Joaquín; García-García, Pedro; Garrido-Fernández, Antonio
2017-01-01
This work studies the inoculation conditions for allowing the survival/predominance of a potential probiotic strain (Lactobacillus pentosus TOMC-LAB2) when used as a starter culture in large-scale fermentations of green Spanish-style olives. The study was performed in two successive seasons (2011/2012 and 2012/2013), using about 150 tons of olives. Inoculation immediately after brining (to prevent wild initial microbiota growth) followed by re-inoculation 24 h later (to improve competitiveness) was essential for inoculum predominance. Processing early in the season (September) showed a favorable effect on fermentation and strain predominance on olives (particularly when using acidified brines containing 25 L HCl/vessel) but caused the disappearance of the target strain from both brines and olives during the storage phase. On the contrary, processing in October slightly reduced the target strain predominance on olives (70-90%) but allowed longer survival. The type of inoculum used (laboratory vs. industry pre-adapted) never had significant effects. Thus, this investigation discloses key issues for the survival and predominance of starter cultures in large-scale industrial fermentations of green Spanish-style olives. Results can be of interest for producing probiotic table olives and open new research challenges on the causes of inoculum vanishing during the storage phase.
Frequent global transmission of H1N1pdm09 influenza viruses from humans to swine, 2009-2011
USDA-ARS?s Scientific Manuscript database
Using a large-scale phylogenetic approach we identify at least 52 human-to-swine transmission events of pandemic A/H1N1/09 influenza virus. These results highlight the global frequency of swine exposure to human influenza viruses and the permeability of the human-swine species barrier, even followin...
A critical assessment of boron target compounds for boron neutron capture therapy.
Hawthorne, M Frederick; Lee, Mark W
2003-01-01
Boron neutron capture therapy (BNCT) has undergone dramatic developments since its inception by Locher in 1936 and the development of nuclear energy during World War II. The ensuing Cold War spawned the entirely new field of polyhedral borane chemistry, rapid advances in nuclear reactor technology and a corresponding increase in the number of reactors potentially available for BNCT. This effort has been largely oriented toward the eradication of glioblastoma multiforme (GBM) and melanoma with reduced interest in other types of malignancies. The design and synthesis of boron-10 target compounds needed for BNCT was not channeled to those types of compounds specifically required for GBM or melanoma. Consequently, a number of potentially useful boron agents are known which have not been biologically evaluated beyond a cursory examination, and only three boron-10 enriched target species are approved for human use following their Investigational New Drug classification by the US Food and Drug Administration: BSH, BPA and GB-10. All ongoing clinical trials with GBM and melanoma are necessarily conducted with one of these three species and most often with BPA. The further development of BNCT is presently stalled by the absence of strong support for advanced compound evaluation and compound discovery driven by recent advances in biology and chemistry. A rigorous demonstration of BNCT efficacy surpassing that of currently available protocols has yet to be achieved. This article discusses the past history of compound development, contemporary problems such as compound classification and those problems which impede future advances. The latter include means for biological evaluation of new (and existing) boron target candidates at all stages of their development and the large-scale synthesis of boron target species for clinical trials and beyond.
The future of BNCT is bright if latitude is given to the choice of clinical disease to be treated and if a recognized study demonstrating improved efficacy is completed. Eventually, BNCT in some form will be commercialized.
Hontelez, Jan A. C.; Bakker, Roel; Blok, David J.; Cai, Rui; Houweling, Tanja A. J.; Kulik, Margarete C.; Lenk, Edeltraud J.; Luyendijk, Marianne; Matthijsse, Suzette M.; Redekop, William K.; Wagenaar, Inge; Jacobson, Julie; Nagelkerke, Nico J. D.; Richardus, Jan H.
2016-01-01
Background The London Declaration (2012) was formulated to support and focus the control and elimination of ten neglected tropical diseases (NTDs), with targets for 2020 as formulated by the WHO Roadmap. Five NTDs (lymphatic filariasis, onchocerciasis, schistosomiasis, soil-transmitted helminths and trachoma) are to be controlled by preventive chemotherapy (PCT), and four (Chagas’ disease, human African trypanosomiasis, leprosy and visceral leishmaniasis) by innovative and intensified disease management (IDM). Guinea worm, virtually eradicated, is not considered here. We aim to estimate the global health impact of meeting these targets in terms of averted morbidity, mortality, and disability adjusted life years (DALYs). Methods The Global Burden of Disease (GBD) 2010 study provides prevalence and burden estimates for all nine NTDs in 1990 and 2010, by country, age and sex, which were taken as the basis for our calculations. Estimates for other years were obtained by interpolating between 1990 (or the start-year of large-scale control efforts) and 2010, and further extrapolating until 2030, such that the 2020 targets were met. The NTD disease manifestations considered in the GBD study were analyzed as either reversible or irreversible. Health impacts were assessed by comparing the results of achieving the targets with the counterfactual, construed as the health burden had the 1990 (or 2010 if higher) situation continued unabated. Principal Findings/Conclusions Our calculations show that meeting the targets will lead to about 600 million averted DALYs in the period 2011–2030, nearly equally distributed between PCT and IDM-NTDs, with the health gain amongst PCT-NTDs mostly (96%) due to averted disability and amongst IDM-NTDs largely (95%) from averted mortality. These health gains include about 150 million averted irreversible disease manifestations (e.g. blindness) and 5 million averted deaths.
Control of soil-transmitted helminths accounts for one third of all averted DALYs. We conclude that the projected health impact of the London Declaration justifies the required efforts. PMID:26890362
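The interpolate-then-extrapolate bookkeeping described in the Methods can be reduced to a small calculation: project the burden linearly toward the target, hold the counterfactual burden constant, and sum the difference. The sketch below is a deliberate simplification assuming one aggregate burden number per disease and a linear trend; the actual study works per country, age, and sex, and distinguishes reversible from irreversible manifestations:

```python
def averted_dalys(burden_1990, burden_2010, target_2030=0.0):
    """Sum averted annual burden over 2011-2030.

    Counterfactual: the higher of the 1990/2010 burden continues unabated.
    Projection: linear trajectory from the 2010 burden to the 2030 target.
    All quantities are in the same (arbitrary) DALY units.
    """
    counterfactual = max(burden_1990, burden_2010)
    total = 0.0
    for year in range(2011, 2031):
        frac = (year - 2010) / 20.0  # fraction of the 2010->2030 path covered
        projected = burden_2010 + frac * (target_2030 - burden_2010)
        total += counterfactual - projected
    return total
```

For a disease with burden 10 units in 1990 and 8 in 2010, eliminated by 2030, the counterfactual accumulates 200 units over the period while the projection accumulates 76, so 124 units are averted.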
Norman, Paul J.; Norberg, Steven J.; Guethlein, Lisbeth A.; Nemat-Gorgani, Neda; Royce, Thomas; Wroblewski, Emily E.; Dunn, Tamsen; Mann, Tobias; Alicata, Claudia; Hollenbach, Jill A.; Chang, Weihua; Shults Won, Melissa; Gunderson, Kevin L.; Abi-Rached, Laurent; Ronaghi, Mostafa; Parham, Peter
2017-01-01
The most polymorphic part of the human genome, the MHC, encodes over 160 proteins of diverse function. Half of them, including the HLA class I and II genes, are directly involved in immune responses. Consequently, the MHC region strongly associates with numerous diseases and clinical therapies. Notoriously, the MHC region has been intractable to high-throughput analysis at complete sequence resolution, and current reference haplotypes are inadequate for large-scale studies. To address these challenges, we developed a method that specifically captures and sequences the 4.8-Mbp MHC region from genomic DNA. For 95 MHC homozygous cell lines we assembled, de novo, a set of high-fidelity contigs and a sequence scaffold, representing a mean 98% of the target region. Included are six alternative MHC reference sequences of the human genome that we completed and refined. Characterization of the sequence and structural diversity of the MHC region shows the approach accurately determines the sequences of the highly polymorphic HLA class I and HLA class II genes and the complex structural diversity of complement factor C4A/C4B. It has also uncovered extensive and unexpected diversity in other MHC genes; an example is MUC22, which encodes a lung mucin and exhibits more coding sequence alleles than any HLA class I or II gene studied here. More than 60% of the coding sequence alleles analyzed were previously uncharacterized. We have created a substantial database of robust reference MHC haplotype sequences that will enable future population scale studies of this complicated and clinically important region of the human genome. PMID:28360230
de Magalhães, João Pedro; Matsuda, Alex
2012-03-01
Modern humans originated in Africa before migrating across the world with founder effects and adaptations to new environments contributing to their present phenotypic diversity. Determining the genetic basis of differences between populations may provide clues about our evolutionary history and may have clinical implications. Herein, we develop a method to detect genes and biological processes in which populations most differ by calculating the genetic distance between modern populations and a hypothetical ancestral population. We apply our method to large-scale single nucleotide polymorphism (SNP) data from human populations of African, European and Asian origin. As expected, ancestral alleles were more conserved in the African populations and we found evidence of high divergence in genes previously suggested as targets of selection related to skin pigmentation, immune response, senses and dietary adaptations. Our genome-wide scan also reveals novel candidates for contributing to population-specific traits. These include genes related to neuronal development and behavior that may have been influenced by cultural processes. Moreover, in the African populations, we found a high divergence in genes related to UV protection and to the male reproductive system. Taken together, these results confirm and expand previous findings, providing new clues about the evolution and genetics of human phenotypic diversity. © 2011 The Authors Annals of Human Genetics © 2011 Blackwell Publishing Ltd/University College London.
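The core of the approach above is a per-gene divergence score between a modern population and a hypothetical ancestral population. A minimal sketch of that idea, using mean squared allele-frequency difference across a gene's SNPs, is shown below; this is an illustration of the general principle, not the authors' exact distance statistic:

```python
def population_divergence(pop_freqs, ancestral_freqs):
    """Mean squared allele-frequency difference between a modern
    population and a hypothetical ancestral population, across the
    SNPs assigned to one gene. Higher values suggest stronger
    divergence (drift or selection) at that gene."""
    assert len(pop_freqs) == len(ancestral_freqs)
    return sum((p - a) ** 2
               for p, a in zip(pop_freqs, ancestral_freqs)) / len(pop_freqs)

def rank_genes(gene_to_freqs, ancestral):
    """Rank genes by divergence, most diverged first.
    `gene_to_freqs` maps gene -> list of SNP frequencies (hypothetical layout)."""
    return sorted(gene_to_freqs,
                  key=lambda g: population_divergence(gene_to_freqs[g],
                                                      ancestral[g]),
                  reverse=True)
```

A genome-wide scan then inspects the top-ranked genes for enriched biological processes, as in the skin-pigmentation and immune-response examples reported above.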
Risk for large-scale fires in boreal forests of Finland under changing climate
NASA Astrophysics Data System (ADS)
Lehtonen, I.; Venäläinen, A.; Kämäräinen, M.; Peltola, H.; Gregow, H.
2015-08-01
The target of this work was to assess the impact of projected climate change on the number of large forest fires (over 10 ha fires) and burned area in Finland. For this purpose, we utilized a strong relationship between fire occurrence and the Canadian fire weather index (FWI) during 1996-2014. We used daily data from five global climate models under representative concentration pathway RCP4.5 and RCP8.5 scenarios. The model data were statistically downscaled onto a high-resolution grid using the quantile-mapping method before performing the analysis. Our results suggest that the number of large forest fires may double or even triple during the present century. This would increase the risk that some of the fires could develop into real conflagrations, which have become almost extinct in Finland due to active and efficient fire suppression. Our results also reveal substantial inter-model variability in the rate of the projected increase in forest-fire danger. We moreover showed that the majority of large fires occur within a relatively short period in May and June due to human activities, and that FWI correlates more poorly with fire activity during this time of year than later in summer, when lightning is a more important cause of fires.
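Quantile mapping, the statistical downscaling step mentioned above, matches empirical quantiles of model output to quantiles of observations. The following is a bare-bones sketch of the idea; operational implementations typically fit or interpolate quantiles per grid cell and calendar window rather than using raw sample ranks:

```python
def quantile_map(value, model_sample, obs_sample):
    """Empirical quantile mapping: find the value's quantile within the
    climate-model sample, then return the observation at that same
    quantile, correcting the model's distributional bias."""
    model_sorted = sorted(model_sample)
    obs_sorted = sorted(obs_sample)
    # empirical quantile of `value` within the model distribution
    rank = sum(1 for m in model_sorted if m < value)
    q = rank / len(model_sorted)
    # same quantile of the observed distribution (clamped to the sample)
    idx = min(int(q * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]
```

For example, a model value sitting at the median of the model sample is mapped to the median of the observed sample, whatever its raw magnitude.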
NASA Astrophysics Data System (ADS)
Gerlitz, Lars; Gafurov, Abror; Apel, Heiko; Unger-Sayesteh, Katy; Vorogushyn, Sergiy; Merz, Bruno
2016-04-01
Statistical climate forecast applications typically utilize a small set of large-scale SST or climate indices, such as ENSO, PDO or AMO, as predictor variables. If the predictive skill of these large-scale modes is insufficient, specific predictor variables such as customized SST patterns are frequently included. Hence, statistically based climate forecast models are either based on a fixed number of climate indices (and thus might not consider important predictor variables) or are highly site-specific and barely transferable to other regions. With the aim of developing an operational seasonal forecast model that is easily transferable to any region in the world, we present a generic data mining approach which automatically selects potential predictors from gridded SST observations and reanalysis-derived large-scale atmospheric circulation patterns and generates robust statistical relationships with subsequent precipitation anomalies for user-selected target regions. Potential predictor variables are derived by means of a cellwise correlation analysis of precipitation anomalies with gridded global climate variables under consideration of varying lead times. Significantly correlated grid cells are subsequently aggregated to predictor regions by means of a variability-based cluster analysis. Finally, for every month and lead time, an individual random-forest-based forecast model is automatically calibrated and evaluated by means of the previously generated predictor variables. The model is exemplarily applied and evaluated for selected headwater catchments in Central and South Asia. Particularly for the winter and spring precipitation (which is associated with westerly disturbances in the entire target domain), the model shows solid results with correlation coefficients up to 0.7, although the variability of precipitation rates is highly underestimated.
Likewise, a certain skill of the model could be detected for the monsoonal precipitation amounts in the South Asian target areas. The skill of the model for the dry summer season in Central Asia and the transition seasons over South Asia is found to be low. A sensitivity analysis by means of well-known climate indices reveals the major large-scale controlling mechanisms for the seasonal precipitation climate of each target area. For the Central Asian target areas, both the El Nino Southern Oscillation and the North Atlantic Oscillation are identified as important controlling factors for precipitation totals during the moist spring season. Drought conditions are found to be triggered by a warm ENSO phase in combination with a positive phase of the NAO. For the monsoonal summer precipitation amounts over Southern Asia, the model suggests a distinct negative response to El Nino events.
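The cellwise correlation screening at the heart of the predictor-selection workflow can be sketched as below. The threshold, data layout, and function names are illustrative assumptions; the actual workflow additionally applies significance testing, varying lead times, and a variability-based clustering of the selected cells:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5  # assumes non-constant series

def select_predictor_cells(precip_anom, sst_grid, threshold=0.6):
    """Keep grid cells whose SST series correlates strongly (either
    sign) with the target-region precipitation anomalies.
    `sst_grid` maps cell id -> time series (hypothetical layout)."""
    return [cell for cell, series in sst_grid.items()
            if abs(pearson(precip_anom, series)) >= threshold]
```

The surviving cells would then be merged into predictor regions and fed, per month and lead time, into a random forest regression model.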
The CAMELS data set: catchment attributes and meteorology for large-sample studies
NASA Astrophysics Data System (ADS)
Addor, Nans; Newman, Andrew J.; Mizukami, Naoki; Clark, Martyn P.
2017-10-01
We present a new data set of attributes for 671 catchments in the contiguous United States (CONUS) minimally impacted by human activities. This complements the daily time series of meteorological forcing and streamflow provided by Newman et al. (2015b). To produce this extension, we synthesized diverse and complementary data sets to describe six main classes of attributes at the catchment scale: topography, climate, streamflow, land cover, soil, and geology. The spatial variations among basins over the CONUS are discussed and compared using a series of maps. The large number of catchments, combined with the diversity of the attributes we extracted, makes this new data set well suited for large-sample studies and comparative hydrology. In comparison to the similar Model Parameter Estimation Experiment (MOPEX) data set, this data set relies on more recent data, it covers a wider range of attributes, and its catchments are more evenly distributed across the CONUS. This study also involves assessments of the limitations of the source data sets used to compute catchment attributes, as well as detailed descriptions of how the attributes were computed. The hydrometeorological time series provided by Newman et al. (2015b, https://doi.org/10.5065/D6MW2F4D) together with the catchment attributes introduced in this paper (https://doi.org/10.5065/D6G73C3Q) constitute the freely available CAMELS data set, which stands for Catchment Attributes and MEteorology for Large-sample Studies.
Universality of accelerating change
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Shlesinger, Michael F.
2018-03-01
On large time scales the progress of human technology follows an exponential growth trend that is termed accelerating change. The exponential growth trend is commonly considered to be the amalgamated effect of consecutive technology revolutions - where the progress carried in by each technology revolution follows an S-curve, and where the aging of each technology revolution drives humanity to push for the next technology revolution. Thus, as a collective, mankind is the 'intelligent designer' of accelerating change. In this paper we establish that the exponential growth trend - and only this trend - emerges universally, on large time scales, from systems that combine together two elements: randomness and amalgamation. Hence, the universal generation of accelerating change can be attained by systems with no 'intelligent designer'.
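The amalgamation picture above can be illustrated numerically: summing logistic S-curves with staggered onsets and growing heights produces a smoothly rising aggregate. The onset spacing, geometric heights, and steepness range below are assumptions of this sketch, not parameters from the paper:

```python
import math
import random

def technology_level(t, revolutions):
    """Aggregate progress at time t: each technology revolution
    contributes one logistic S-curve with its own onset time t0,
    asymptotic height h, and steepness k."""
    return sum(h / (1.0 + math.exp(-k * (t - t0)))
               for t0, h, k in revolutions)

random.seed(1)  # reproducible steepness draws
# Illustrative assumptions: onsets every 10 time units, heights
# doubling per revolution, random steepness per revolution.
revs = [(10 * i, 2.0 ** i, random.uniform(0.5, 1.5)) for i in range(8)]
levels = [technology_level(t, revs) for t in range(80)]
```

Each individual curve saturates, but because each successive revolution contributes a larger increment, the sum keeps climbing, which is the qualitative signature of accelerating change.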
Scale dependent behavioral responses to human development by a large predator, the puma.
Wilmers, Christopher C; Wang, Yiwei; Nickel, Barry; Houghtaling, Paul; Shakeri, Yasaman; Allen, Maximilian L; Kermish-Wells, Joe; Yovovich, Veronica; Williams, Terrie
2013-01-01
The spatial scale at which organisms respond to human activity can affect both ecological function and conservation planning. Yet little is known regarding the spatial scale at which distinct behaviors related to reproduction and survival are impacted by human interference. Here we provide a novel approach to estimating the spatial scale at which a top predator, the puma (Puma concolor), responds to human development when it is moving, feeding, communicating, and denning. We find that reproductive behaviors (communication and denning) require at least a 4× larger buffer from human development than non-reproductive behaviors (movement and feeding). In addition, pumas give a wider berth to types of human development that provide a more consistent source of human interference (neighborhoods) than they do to those in which human presence is more intermittent (arterial roads with speeds >35 mph). Neighborhoods were a deterrent to pumas regardless of behavior, while arterial roads only deterred pumas when they were communicating and denning. Female pumas were less deterred by human development than males, but they showed larger variation in their responses overall. Our behaviorally explicit approach to modeling animal response to human activity can be used as a novel tool to assess habitat quality, identify wildlife corridors, and mitigate human-wildlife conflict.
A Parallel Finite Set Statistical Simulator for Multi-Target Detection and Tracking
NASA Astrophysics Data System (ADS)
Hussein, I.; MacMillan, R.
2014-09-01
Finite Set Statistics (FISST) is a powerful Bayesian inference tool for the joint detection, classification and tracking of multi-target environments. FISST is capable of handling phenomena such as clutter, misdetections, and target birth and decay. Implicit within the approach are solutions to the data association and target label-tracking problems. Finally, FISST provides generalized information measures that can be used for sensor allocation across different types of tasks such as: searching for new targets, and classification and tracking of known targets. These FISST capabilities have been demonstrated on several small-scale illustrative examples. However, for implementation in a large-scale system as in the Space Situational Awareness problem, these capabilities require a lot of computational power. In this paper, we implement FISST in a parallel environment for the joint detection and tracking of multi-target systems. In this implementation, false alarms and misdetections will be modeled. Target birth and decay will not be modeled in the present paper. We will demonstrate the success of the method for as many targets as we possibly can in a desktop parallel environment. Performance measures will include: number of targets in the simulation, certainty of detected target tracks, computational time as a function of clutter returns and number of targets, among other factors.
Vugmeyster, Yulia; Rohde, Cynthia; Perreault, Mylene; Gimeno, Ruth E; Singh, Pratap
2013-01-01
TAM-163, an agonist monoclonal antibody targeting tyrosine receptor kinase-B (TrkB), is currently being investigated as a potential body weight modulatory agent in humans. To support the selection of the dose range for the first-in-human (FIH) trial of TAM-163, we conducted a mechanistic analysis of the pharmacokinetic (PK) and pharmacodynamic (PD) data (e.g., body weight gain) obtained in lean cynomolgus and obese rhesus monkeys following single doses ranging from 0.3 to 60 mg/kg. A target-mediated drug disposition (TMDD) model was used to describe the observed nonlinear PK, and an Emax approach was used to describe the observed dose-dependent PD effect. The TMDD model development was supported by the experimental determination of the binding affinity constant (9.4 nM) and the internalization rate of the drug-target complex (2.08 h-1). These mechanistic analyses enabled linking of exposure, target (TrkB) coverage, and pharmacological activity (e.g., PD) in monkeys, and indicated that ≥ 38% target coverage (time-average) was required to achieve significant body weight gain in monkeys. Based on the scaling of the TMDD model from monkeys to humans, and assuming a similar relationship between target coverage and pharmacological activity in monkeys and humans, subcutaneous (SC) doses of 1 and 15 mg/kg in humans were projected to be the minimally and the fully pharmacologically active doses, respectively. Based on the minimal anticipated biological effect level (MABEL) approach for starting dose selection, a dose of 0.05 mg/kg (3 mg for a 60 kg human) SC was recommended as the starting dose for FIH trials, because at this dose level <10% target coverage was projected at Cmax (and all other time points). This study illustrates a rational mechanistic approach for the selection of the FIH dose range for a therapeutic protein with a complex mode of action.
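The concentration-to-coverage link used in this kind of analysis follows from simple binding equilibrium: coverage = C / (C + Kd). The sketch below plugs in the Kd reported above; note this quasi-equilibrium formula is a simplification, since the full TMDD model also tracks target turnover and complex internalization, and the thresholds are applied here at a single concentration rather than as time averages:

```python
KD_NM = 9.4  # TrkB binding affinity constant reported in the study (nM)

def target_coverage(conc_nm, kd_nm=KD_NM):
    """Fraction of target bound at free drug concentration C (nM):
    coverage = C / (C + Kd), the quasi-equilibrium approximation."""
    return conc_nm / (conc_nm + kd_nm)

def meets_activity_threshold(conc_nm, threshold=0.38):
    """The study links >= 38% time-average coverage to significant
    body weight gain; here applied to a single concentration."""
    return target_coverage(conc_nm) >= threshold

def below_mabel_margin(conc_nm, margin=0.10):
    """MABEL-style check: <10% coverage projected at Cmax."""
    return target_coverage(conc_nm) < margin
```

At a free concentration equal to Kd (9.4 nM), exactly half the target is bound, comfortably above the 38% activity threshold; at 1 nM, coverage is under 10%, consistent with a MABEL starting dose.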
Understanding metropolitan patterns of daily encounters.
Sun, Lijun; Axhausen, Kay W; Lee, Der-Horng; Huang, Xianfeng
2013-08-20
Understanding of the mechanisms driving our daily face-to-face encounters is still limited; the field lacks large-scale datasets describing both individual behaviors and their collective interactions. However, here, with the help of travel smart card data, we uncover such encounter mechanisms and structures by constructing a time-resolved in-vehicle social encounter network on public buses in a city (about 5 million residents). Using a population scale dataset, we find physical encounters display reproducible temporal patterns, indicating that repeated encounters are regular and identical. On an individual scale, we find that collective regularities dominate distinct encounters' bounded nature. An individual's encounter capability is rooted in his/her daily behavioral regularity, explaining the emergence of "familiar strangers" in daily life. Strikingly, we find individuals with repeated encounters are not grouped into small communities, but become strongly connected over time, resulting in a large, but imperceptible, small-world contact network or "structure of co-presence" across the whole metropolitan area. Revealing the encounter pattern and identifying this large-scale contact network are crucial to understanding the dynamics in patterns of social acquaintances, collective human behaviors, and--particularly--disclosing the impact of human behavior on various diffusion/spreading processes. PMID:23918373
Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles
NASA Technical Reports Server (NTRS)
Gradl, Paul; Brandsmeier, Will
2016-01-01
Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas to provide engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large-scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large-scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques, including large-scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed to evaluate these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300-series stainless steel, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating potential for significant cost reductions for large-scale nozzles and chambers. Hot-fire testing using these techniques is planned for the future.
K. Bruce Jones; Anne C. Neale; Timothy G. Wade; James D. Wickham; Chad L. Cross; Curtis M. Edmonds; Thomas R. Loveland; Maliha S. Nash; Kurt H. Riitters; Elizabeth R. Smith
2001-01-01
Spatially explicit identification of changes in ecological conditions over large areas is key to targeting and prioritizing areas for environmental protection and restoration by managers at watershed, basin, and regional scales. A critical limitation to this point has been the lack of methods for conducting such broad-scale assessments. Field-based methods have...
NASA Astrophysics Data System (ADS)
Ostermayr, T. M.; Gebhard, J.; Haffa, D.; Kiefer, D.; Kreuzer, C.; Allinger, K.; Bömer, C.; Braenzel, J.; Schnürer, M.; Cermak, I.; Schreiber, J.; Hilz, P.
2018-01-01
We report on a Paul-trap system with large access angles that allows positioning of fully isolated micrometer-scale particles with micrometer precision as targets in high-intensity laser-plasma interactions. This paper summarizes theoretical and experimental concepts of the apparatus as well as supporting measurements that were performed for the trapping process of single particles.
Huang, Wei-Chiao; Burnouf, Pierre-Alain; Su, Yu-Cheng; Chen, Bing-Mae; Chuang, Kuo-Hsiang; Lee, Chia-Wei; Wei, Pei-Kuen; Cheng, Tian-Lu; Roffler, Steve R
2016-01-26
Attachment of ligands to the surface of nanoparticles (NPs) is an attractive approach to target specific cells and increase intracellular delivery of nanocargos. To expedite investigation of targeted NPs, we engineered human cancer cells to express chimeric receptors that bind polyethylene glycol (PEG) and internalize stealth NPs in a fashion similar to ligand-targeted liposomes against epidermal growth factor receptor 1 or 2 (HER1 or HER2), which are validated targets for cancer therapy. Measurement of the rate of endocytosis and lysosomal accumulation of small (80-94 nm) or large (180-220 nm) flexible liposomes or more rigid lipid-coated mesoporous silica particles in human HT29 colon cancer and SKBR3 breast cancer cells that express chimeric receptors revealed that larger and more rigid NPs were internalized more slowly than smaller and more flexible NPs. An exception is when both the small and large liposomes underwent endocytosis via HER2. HER1 mediated faster and greater uptake of NPs into cells but retained NPs less well as compared to HER2. Lysosomal accumulation of NPs internalized via HER1 was unaffected by NP rigidity but was inversely related to NP size, whereas large rigid NPs internalized by HER2 displayed increased lysosomal accumulation. Our results provide insight into the effects of NP properties on receptor-mediated endocytosis and suggest that anti-PEG chimeric receptors may help accelerate investigation of targeted stealth NPs.
Ovchinnikov, Victor; Karplus, Martin
2012-07-26
The popular targeted molecular dynamics (TMD) method for generating transition paths in complex biomolecular systems is revisited. In a typical TMD transition path, the large-scale changes occur early and the small-scale changes tend to occur later. As a result, the order of events in the computed paths depends on the direction in which the simulations are performed. To identify the origin of this bias, and to propose a method in which the bias is absent, variants of TMD in the restraint formulation are introduced and applied to the complex open ↔ closed transition in the protein calmodulin. Due to the global best-fit rotation that is typically part of the TMD method, the simulated system is guided implicitly along the lowest-frequency normal modes, until the large spatial scales associated with these modes are near the target conformation. The remaining portion of the transition is described progressively by higher-frequency modes, which correspond to smaller-scale rearrangements. A straightforward modification of TMD that avoids the global best-fit rotation is the locally restrained TMD (LRTMD) method, in which the biasing potential is constructed from a number of TMD potentials, each acting on a small connected portion of the protein sequence. With a uniform distribution of these elements, transition paths that lack the length-scale bias are obtained. Trajectories generated by steered MD in dihedral angle space (DSMD), a method that avoids best-fit rotations altogether, also lack the length-scale bias. To examine the importance of the paths generated by TMD, LRTMD, and DSMD in the actual transition, we use the finite-temperature string method to compute the free energy profile associated with a transition tube around a path generated by each algorithm. The free energy barriers associated with the paths are comparable, suggesting that transitions can occur along each route with similar probabilities. 
This result indicates that a broad ensemble of paths needs to be calculated to obtain a full description of conformational changes in biomolecules. The breadth of the contributing ensemble suggests that energetic barriers for conformational transitions in proteins are offset by entropic contributions that arise from a large number of possible paths.
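In the restraint formulation the abstract refers to, the bias is a harmonic penalty on the RMSD to the target conformation; the sketch below is minimal and illustrative, omitting the global best-fit rotation whose role the paper analyzes, and using an arbitrary force constant:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two conformations, given as
    lists of (x, y, z) tuples. No best-fit rotation is applied here
    (the global alignment step is exactly what LRTMD/DSMD avoid)."""
    n = len(coords_a)
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                         for (ax, ay, az), (bx, by, bz)
                         in zip(coords_a, coords_b)) / n)

def tmd_restraint_energy(coords, target, rho, k=100.0):
    """Harmonic TMD biasing potential U = (k/2) * (RMSD - rho)^2, where
    rho is the (gradually decreased) reference distance to the target.
    The force constant k is an arbitrary illustrative value."""
    return 0.5 * k * (rmsd(coords, target) - rho) ** 2
```

Driving rho toward zero over the simulation pulls the system toward the target; the paper's point is that with a global best-fit rotation this progression implicitly follows low-frequency modes first.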
Coexistence between wildlife and humans at fine spatial scales.
Carter, Neil H; Shrestha, Binoj K; Karki, Jhamak B; Pradhan, Narendra Man Babu; Liu, Jianguo
2012-09-18
Many wildlife species face imminent extinction because of human impacts, and therefore, a prevailing belief is that some wildlife species, particularly large carnivores and ungulates, cannot coexist with people at fine spatial scales (i.e., cannot regularly use the exact same point locations). This belief provides rationale for various conservation programs, such as resettling human communities outside protected areas. However, quantitative information on the capacity and mechanisms for wildlife to coexist with humans at fine spatial scales is scarce. Such information is vital, because the world is becoming increasingly crowded. Here, we provide empirical information about the capacity and mechanisms for tigers (a globally endangered species) to coexist with humans at fine spatial scales inside and outside Nepal's Chitwan National Park, a flagship protected area for imperiled wildlife. Information obtained from field cameras in 2010 and 2011 indicated that human presence (i.e., people on foot and vehicles) was ubiquitous and abundant throughout the study site; however, tiger density was also high. Surprisingly, even at a fine spatial scale (i.e., camera locations), tigers spatially overlapped with people on foot and vehicles in both years. However, in both years, tigers offset their temporal activity patterns to be much less active during the day when human activity peaked. In addition to temporal displacement, tiger-human coexistence was likely enhanced by abundant tiger prey and low levels of tiger poaching. Incorporating fine-scale spatial and temporal activity patterns into conservation plans can help address a major global challenge-meeting human needs while sustaining wildlife.
Analysis of calibration accuracy of cameras with different target sizes for large field of view
NASA Astrophysics Data System (ADS)
Zhang, Jin; Chai, Zhiwen; Long, Changyu; Deng, Huaxia; Ma, Mengchao; Zhong, Xiang; Yu, Huan
2018-03-01
Visual measurement plays an increasingly important role in the fields of aerospace, shipbuilding, and machinery manufacturing, and camera calibration for a large field of view is a critical part of visual measurement. A large-scale calibration target is difficult to produce and its precision cannot be guaranteed, whereas a small target can be produced with high precision but yields only locally optimal solutions. Therefore, the most suitable ratio of target size to camera field of view must be studied to ensure the calibration precision required for a wide field of view. In this paper, cameras are calibrated with a series of checkerboard and circular calibration targets of different dimensions, with target-size-to-field-of-view ratios of 9%, 18%, 27%, 36%, 45%, 54%, 63%, 72%, 81%, and 90%. Each target is placed at different positions in the camera field to obtain camera parameters for each position. The distribution curves of the mean reprojection error of the reconstructed feature points are then analyzed for the different ratios. The experimental data demonstrate that calibration precision improves as the ratio of target size to field of view increases, and that the mean reprojection error changes only slightly once the ratio exceeds 45%.
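The evaluation metric used here, the mean reprojection error of the feature points, can be sketched directly; the corner coordinates below are invented for illustration (a real calibration would obtain detected and reprojected points from, e.g., OpenCV's corner detection and point-projection routines):

```python
import math

def reprojection_mean_error(detected, reprojected):
    """Mean Euclidean distance (in pixels) between detected feature
    points and their reprojections through the estimated camera model."""
    errs = [math.dist(d, r) for d, r in zip(detected, reprojected)]
    return sum(errs) / len(errs)

# Illustrative data: four detected corners, each reprojected with a
# fixed (0.3, 0.4) pixel offset, i.e. 0.5 px of error per point.
detected = [(100.0, 100.0), (200.0, 100.0), (100.0, 200.0), (200.0, 200.0)]
reprojected = [(x + 0.3, y + 0.4) for x, y in detected]
print(reprojection_mean_error(detected, reprojected))  # ≈ 0.5
```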
Sivalingam, Jaichandran; Lam, Alan Tin-Lun; Chen, Hong Yu; Yang, Bin Xia; Chen, Allen Kuan-Liang; Reuveny, Shaul; Loh, Yuin-Han; Oh, Steve Kah-Weng
2016-08-01
In vitro generation of red blood cells (RBCs) from human embryonic stem cells and human induced pluripotent stem cells appears to be a promising alternate approach to circumvent shortages in donor-derived blood supplies for clinical applications. Conventional methods for hematopoietic differentiation of human pluripotent stem cells (hPSC) rely on embryoid body (EB) formation and/or coculture with xenogeneic cell lines. However, most current methods for hPSC expansion and EB formation are not amenable for scale-up to levels required for large-scale RBC generation. Moreover, differentiation methods that rely on xenogenic cell lines would face obstacles for future clinical translation. In this study, we report the development of a serum-free and chemically defined microcarrier-based suspension culture platform for scalable hPSC expansion and EB formation. Improved survival and better quality EBs generated with the microcarrier-based method resulted in significantly improved mesoderm induction and, when combined with hematopoietic differentiation, resulted in at least a 6-fold improvement in hematopoietic precursor expansion, potentially culminating in a 80-fold improvement in the yield of RBC generation compared to a conventional EB-based differentiation method. In addition, we report efficient terminal maturation and generation of mature enucleated RBCs using a coculture system that comprised primary human mesenchymal stromal cells. The microcarrier-based platform could prove to be an appealing strategy for future scale-up of hPSC culture, EB generation, and large-scale generation of RBCs under defined and xeno-free conditions.
NASA Astrophysics Data System (ADS)
Austin, Kemen G.; González-Roglich, Mariano; Schaffer-Smith, Danica; Schwantes, Amanda M.; Swenson, Jennifer J.
2017-05-01
Deforestation continues across the tropics at alarming rates, with repercussions for ecosystem processes, carbon storage and long term sustainability. Taking advantage of recent fine-scale measurement of deforestation, this analysis aims to improve our understanding of the scale of deforestation drivers in the tropics. We examined trends in forest clearings of different sizes from 2000-2012 by country, region and development level. As tropical deforestation increased from approximately 6900 kha yr-1 in the first half of the study period, to >7900 kha yr-1 in the second half of the study period, >50% of this increase was attributable to the proliferation of medium and large clearings (>10 ha). This trend was most pronounced in Southeast Asia and in South America. Outside of Brazil >60% of the observed increase in deforestation in South America was due to an upsurge in medium- and large-scale clearings; Brazil had a divergent trend of decreasing deforestation, >90% of which was attributable to a reduction in medium and large clearings. The emerging prominence of large-scale drivers of forest loss in many regions and countries suggests the growing need for policy interventions which target industrial-scale agricultural commodity producers. The experience in Brazil suggests that there are promising policy solutions to mitigate large-scale deforestation, but that these policy initiatives do not adequately address small-scale drivers. By providing up-to-date and spatially explicit information on the scale of deforestation, and the trends in these patterns over time, this study contributes valuable information for monitoring, and designing effective interventions to address deforestation.
Nagaraj, Shivashankar H.; Gasser, Robin B.; Ranganathan, Shoba
2008-01-01
Background Parasitic nematodes of humans, other animals and plants continue to impose a significant public health and economic burden worldwide, due to the diseases they cause. Promising antiparasitic drug and vaccine candidates have been discovered from excreted or secreted (ES) proteins released from the parasite and exposed to the immune system of the host. Mining the entire expressed sequence tag (EST) data available from parasitic nematodes represents an approach to discover such ES targets. Methods and Findings In this study, we predicted, using EST2Secretome, a novel, high-throughput, computational workflow system, 4,710 ES proteins from 452,134 ESTs derived from 39 different species of nematodes, parasitic in animals (including humans) or plants. In total, 2,632, 786, and 1,292 ES proteins were predicted for animal-, human-, and plant-parasitic nematodes. Subsequently, we systematically analysed ES proteins using computational methods. Of these 4,710 proteins, 2,490 (52.8%) had orthologues in Caenorhabditis elegans, whereas 621 (13.8%) appeared to be novel, currently having no significant match to any molecule available in public databases. Of the C. elegans homologues, 267 had strong “loss-of-function” phenotypes by RNA interference (RNAi) in this nematode. We could functionally classify 1,948 (41.3%) sequences using the Gene Ontology (GO) terms, establish pathway associations for 573 (12.2%) sequences using Kyoto Encyclopaedia of Genes and Genomes (KEGG), and identify protein interaction partners for 1,774 (37.6%) molecules. We also mapped 758 (16.1%) proteins to protein domains including the nematode-specific protein family “transthyretin-like” and “chromadorea ALT,” considered as vaccine candidates against filariasis in humans. Conclusions We report the large-scale analysis of ES proteins inferred from EST data for a range of parasitic nematodes. 
This set of ES proteins provides an inventory of known and novel members of ES proteins as a foundation for studies focused on understanding the biology of parasitic nematodes and their interactions with their hosts, as well as for the development of novel drugs or vaccines for parasite intervention and control. PMID:18820748
Motor scaling by viewing distance of early visual motion signals during smooth pursuit
NASA Technical Reports Server (NTRS)
Zhou, Hui-Hui; Wei, Min; Angelaki, Dora E.
2002-01-01
The geometry of gaze stabilization during head translation requires eye movements to scale proportionally to the inverse of target distance. Such a scaling has indeed been demonstrated to exist for the translational vestibuloocular reflex (TVOR), as well as optic flow-selective translational visuomotor reflexes (e.g., ocular following, OFR). The similarities in this scaling by a neural estimate of target distance for both the TVOR and the OFR have been interpreted to suggest that the two reflexes share common premotor processing. Because the neural substrates of OFR are partly shared by those for the generation of pursuit eye movements, we wanted to know if the site of gain modulation for TVOR and OFR is also part of a major pathway for pursuit. Thus, in the present studies, we investigated in rhesus monkeys whether initial eye velocity and acceleration during the open-loop portion of step ramp pursuit scales with target distance. Specifically, with visual motion identical on the retina during tracking at different distances (12, 24, and 60 cm), we compared the first 80 ms of horizontal pursuit. We report that initial eye velocity and acceleration exhibits either no or a very small dependence on vergence angle that is at least an order of magnitude less than the corresponding dependence of the TVOR and OFR. The results suggest that the neural substrates for motor scaling by target distance remain largely distinct from the main pathway for pursuit.
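The inverse-distance scaling requirement stated at the start of the abstract follows from small-angle geometry: for a laterally translating head fixating a straight-ahead target, the required angular eye velocity is approximately head speed divided by target distance. The 0.1 m/s head speed below is an arbitrary illustrative value; the distances are the three used in the study:

```python
import math

def required_eye_velocity_deg(head_velocity_m_s, target_distance_m):
    """Angular eye velocity (deg/s) needed to keep a straight-ahead
    target fixated during lateral head translation; by small-angle
    geometry, rate ≈ head speed / target distance (radians/s)."""
    return math.degrees(head_velocity_m_s / target_distance_m)

# The three viewing distances used in the study (12, 24, 60 cm).
# The velocity demand doubles each time the viewing distance halves.
for d_cm in (12, 24, 60):
    print(d_cm, "cm:",
          round(required_eye_velocity_deg(0.1, d_cm / 100), 1), "deg/s")
```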
De Steur, Hans; Mehta, Saurabh; Gellynck, Xavier; Finkelstein, Julia L
2017-04-01
Genetic engineering has been successfully applied to increase micronutrient content in staple crops. Nutrition evidence is key to ensure scale-up and successful implementation. Unlike conventional plant breeding efforts, research on the efficacy or effectiveness of GM biofortified crops on nutritional status in human populations is lacking. This review reports on the potential role of GM biofortified crops in closing the micronutrient gap - increasing the dietary intake of micronutrients in human populations. To date, one clinical trial in the United States reported a high bio-conversion rate of β-carotene in Golden Rice, and potential effects of GM biofortified crop consumption on dietary intake and nutritional outcomes are promising. However, further research needs to confirm the ex ante assessments in target regions. Copyright © 2017. Published by Elsevier Ltd.
Drug Target Mining and Analysis of the Chinese Tree Shrew for Pharmacological Testing
Liu, Jie; Lee, Wen-hui; Zhang, Yun
2014-01-01
The discovery of new drugs requires the development of improved animal models for drug testing. The Chinese tree shrew is considered to be a realistic candidate model. To assess the potential of the Chinese tree shrew for pharmacological testing, we performed drug target prediction and analysis on genomic and transcriptomic scales. Using our pipeline, 3,482 proteins were predicted to be drug targets. Of these predicted targets, 446 and 1,049 proteins with the highest rank and total scores, respectively, included homologs of targets for cancer chemotherapy, depression, age-related decline and cardiovascular disease. Based on comparative analyses, more than half of the drug target proteins identified from the tree shrew genome were shown to be more similar to human targets than to their mouse counterparts. Target validation also demonstrated that the constitutive expression of the proteinase-activated receptors of tree shrew platelets is similar to that of human platelets but differs from that of mouse platelets. We developed an effective pipeline and search strategy for drug target prediction and the evaluation of model-based target identification for drug testing. This work provides useful information for future studies of the Chinese tree shrew as a source of novel targets for drug discovery research. PMID:25105297
Rapid inverse planning for pressure-driven drug infusions in the brain.
Rosenbluth, Kathryn H; Martin, Alastair J; Mittermeyer, Stephan; Eschermann, Jan; Dickinson, Peter J; Bankiewicz, Krystof S
2013-01-01
Infusing drugs directly into the brain is advantageous over oral or intravenous delivery for large molecules or for drugs requiring high local concentrations with low off-target exposure. However, surgeons manually planning the cannula position for drug delivery in the brain face a challenging three-dimensional visualization task. This study presents an intuitive inverse-planning technique to identify the optimal placement that maximizes coverage of the target structure while minimizing the potential for leakage outside the target. The technique was retrospectively validated using intraoperative magnetic resonance imaging of infusions into the striatum of non-human primates and into a tumor in a canine model, and applied prospectively to upcoming human clinical trials.
Validating Bayesian truth serum in large-scale online human experiments.
Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad
2017-01-01
Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers. PMID:28494000
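For reference, BTS scoring (in Prelec's original formulation, which this line of work applies) combines an information score, rewarding answers that are more common than collectively predicted, with a prediction score penalizing inaccurate predictions of the answer distribution. A minimal sketch, assuming each respondent supplies both an endorsement and a strictly positive predicted distribution:

```python
import math

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian truth serum scores.

    answers[r][k]     -- 1 if respondent r endorsed option k, else 0
    predictions[r][k] -- r's predicted population fraction for option k
                         (must be strictly positive for every option)
    Score = sum_k x_rk * log(xbar_k / ybar_k)           (information)
          + alpha * sum_k xbar_k * log(y_rk / xbar_k)   (prediction)
    where xbar_k is the arithmetic mean of endorsements and ybar_k the
    geometric mean of predictions. Options nobody endorsed are skipped.
    """
    n = len(answers)
    k_opts = len(answers[0])
    xbar = [sum(a[k] for a in answers) / n for k in range(k_opts)]
    ybar = [math.exp(sum(math.log(p[k]) for p in predictions) / n)
            for k in range(k_opts)]
    scores = []
    for r in range(n):
        info = sum(answers[r][k] * math.log(xbar[k] / ybar[k])
                   for k in range(k_opts) if xbar[k] > 0)
        pred = alpha * sum(xbar[k] * math.log(predictions[r][k] / xbar[k])
                           for k in range(k_opts) if xbar[k] > 0)
        scores.append(info + pred)
    return scores
```

With two respondents who both endorse option 0 but both predict a 50/50 split, the surprise bonus and the prediction penalty cancel exactly, so both scores are zero.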
2013-01-01
Background Bacteria and archaea develop immunity against invading genomes by incorporating pieces of the invaders' sequences, called spacers, into a clustered regularly interspaced short palindromic repeats (CRISPR) locus between repeats, forming arrays of repeat-spacer units. When spacers are expressed, they direct CRISPR-associated (Cas) proteins to silence complementary invading DNA. In order to characterize the invaders of human microbiomes, we use spacers from CRISPR arrays that we had previously assembled from shotgun metagenomic datasets, and identify contigs that contain these spacers' targets. Results We discover 95,000 contigs that are putative invasive mobile genetic elements, some targeted by hundreds of CRISPR spacers. We find that oral sites in healthy human populations have a much greater variety of mobile genetic elements than stool samples. Mobile genetic elements carry genes encoding diverse functions: only 7% of the mobile genetic elements are similar to known phages or plasmids, although a much greater proportion contain phage- or plasmid-related genes. A small number of contigs share similarity with known integrative and conjugative elements, providing the first examples of CRISPR defenses against this class of element. We provide detailed analyses of a few large mobile genetic elements of various types, and a relative abundance analysis of mobile genetic elements and putative hosts, exploring the dynamic activities of mobile genetic elements in human microbiomes. A joint analysis of mobile genetic elements and CRISPRs shows that protospacer-adjacent motifs drive their interaction network; however, some CRISPR-Cas systems target mobile genetic elements lacking motifs. Conclusions We identify a large collection of invasive mobile genetic elements in human microbiomes, an important resource for further study of the interaction between the CRISPR-Cas immune system and invaders. PMID:23628424
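The spacer-to-target matching step described above can be sketched as exact substring search with a check for an adjacent NGG-style PAM. This is a toy illustration only: the study's pipeline uses alignment-based matching and infers motifs from the data rather than assuming a fixed PAM.

```python
def find_protospacers(spacer, contig, pam="GG", pam_offset=1):
    """Return start positions where `spacer` occurs in `contig` and is
    followed by an NGG-style PAM (pam_offset=1 skips the 'N').
    Illustrative only; real pipelines use BLAST-like alignment and
    system-specific PAMs."""
    hits = []
    start = contig.find(spacer)
    while start != -1:
        pam_site = start + len(spacer) + pam_offset
        if contig[pam_site:pam_site + len(pam)] == pam:
            hits.append(start)
        start = contig.find(spacer, start + 1)
    return hits

# Spacer matches at index 3 and is followed by A-GG, an NGG PAM.
print(find_protospacers("CCCGTTT", "AAACCCGTTTAGGTT"))  # [3]
```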
NASA Technical Reports Server (NTRS)
Tri, Terry O.
1999-01-01
As a key component of its ground test bed capability, NASA's Advanced Life Support Program has been developing a large-scale advanced life support test facility capable of supporting long-duration evaluations of integrated bioregenerative life support systems with human test crews. This facility, targeted for evaluation of hypogravity-compatible life support systems to be developed for use on planetary surfaces such as Mars or the Moon, is called the Bioregenerative Planetary Life Support Systems Test Complex (BIO-Plex) and is currently under development at the Johnson Space Center. The test bed comprises a set of interconnected chambers with a sealed internal environment, outfitted with systems capable of supporting test crews of four individuals for periods exceeding one year. The advanced technology systems to be tested will consist of both biological and physicochemical components and will perform all required crew life support functions. This presentation provides a description of the proposed test "missions" to be supported by the BIO-Plex and the planned development strategy for the facility.
Lam, Max; Trampush, Joey W; Yu, Jin; Knowles, Emma; Davies, Gail; Liewald, David C; Starr, John M; Djurovic, Srdjan; Melle, Ingrid; Sundet, Kjetil; Christoforou, Andrea; Reinvang, Ivar; DeRosse, Pamela; Lundervold, Astri J; Steen, Vidar M; Espeseth, Thomas; Räikkönen, Katri; Widen, Elisabeth; Palotie, Aarno; Eriksson, Johan G; Giegling, Ina; Konte, Bettina; Roussos, Panos; Giakoumaki, Stella; Burdick, Katherine E; Payton, Antony; Ollier, William; Chiba-Falek, Ornit; Attix, Deborah K; Need, Anna C; Cirulli, Elizabeth T; Voineskos, Aristotle N; Stefanis, Nikos C; Avramopoulos, Dimitrios; Hatzimanolis, Alex; Arking, Dan E; Smyrnis, Nikolaos; Bilder, Robert M; Freimer, Nelson A; Cannon, Tyrone D; London, Edythe; Poldrack, Russell A; Sabb, Fred W; Congdon, Eliza; Conley, Emily Drabant; Scult, Matthew A; Dickinson, Dwight; Straub, Richard E; Donohoe, Gary; Morris, Derek; Corvin, Aiden; Gill, Michael; Hariri, Ahmad R; Weinberger, Daniel R; Pendleton, Neil; Bitsios, Panos; Rujescu, Dan; Lahti, Jari; Le Hellard, Stephanie; Keller, Matthew C; Andreassen, Ole A; Deary, Ian J; Glahn, David C; Malhotra, Anil K; Lencz, Todd
2017-11-28
Here, we present a large (n = 107,207) genome-wide association study (GWAS) of general cognitive ability ("g"), further enhanced by combining results with a large-scale GWAS of educational attainment. We identified 70 independent genomic loci associated with general cognitive ability. Results showed significant enrichment for genes causing Mendelian disorders with an intellectual disability phenotype. Competitive pathway analysis implicated the biological processes of neurogenesis and synaptic regulation, as well as the gene targets of two pharmacologic agents: cinnarizine, a T-type calcium channel blocker, and LY97241, a potassium channel inhibitor. Transcriptome-wide and epigenome-wide analysis revealed that the implicated loci were enriched for genes expressed across all brain regions (most strongly in the cerebellum). Enrichment was exclusive to genes expressed in neurons but not oligodendrocytes or astrocytes. Finally, we report genetic correlations between cognitive ability and disparate phenotypes including psychiatric disorders, several autoimmune disorders, longevity, and maternal age at first birth. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Stable isotope probing to study functional components of complex microbial ecosystems.
Mazard, Sophie; Schäfer, Hendrik
2014-01-01
This protocol presents a method of dissecting the DNA or RNA of key organisms involved in a specific biochemical process within a complex ecosystem. Stable isotope probing (SIP) allows the labelling and separation of nucleic acids from community members that are involved in important biochemical transformations, yet are often not the most numerically abundant members of a community. This pure culture-independent technique circumvents limitations of traditional microbial isolation techniques or data mining from large-scale whole-community metagenomic studies to tease out the identities and genomic repertoires of microorganisms participating in biological nutrient cycles. SIP experiments can be applied to virtually any ecosystem and biochemical pathway under investigation provided a suitable stable isotope substrate is available. This versatile methodology allows a wide range of analyses to be performed, from fatty-acid analyses, community structure and ecology studies, and targeted metagenomics involving nucleic acid sequencing. SIP experiments provide an effective alternative to large-scale whole-community metagenomic studies by specifically targeting the organisms or biochemical transformations of interest, thereby reducing the sequencing effort and time-consuming bioinformatics analyses of large datasets.
Quasi-Experimental Evaluation of the Effectiveness of a Large-Scale Readmission Reduction Program.
Jenq, Grace Y; Doyle, Margaret M; Belton, Beverly M; Herrin, Jeph; Horwitz, Leora I
2016-05-01
Feasibility, effectiveness, and sustainability of large-scale readmission reduction efforts are uncertain. The Greater New Haven Coalition for Safe Transitions and Readmission Reductions was funded by the Center for Medicare & Medicaid Services (CMS) to reduce readmissions among all discharged Medicare fee-for-service (FFS) patients. To evaluate whether overall Medicare FFS readmissions were reduced through an intervention applied to high-risk discharge patients. This quasi-experimental evaluation took place at an urban academic medical center. Target discharge patients were older than 64 years with Medicare FFS insurance, residing in nearby zip codes, and discharged alive to home or facility and not against medical advice or to hospice; control discharge patients were older than 54 years with the same zip codes and discharge disposition but without Medicare FFS insurance if older than 64 years. High-risk target discharge patients were selectively enrolled in the program. Personalized transitional care, including education, medication reconciliation, follow-up telephone calls, and linkage to community resources. We measured the 30-day unplanned same-hospital readmission rates in the baseline period (May 1, 2011, through April 30, 2012) and intervention period (October 1, 2012, through May 31, 2014). We enrolled 10,621 (58.3%) of 18,223 target discharge patients (73.9% of discharge patients screened as high risk) and included all target discharge patients in the analysis. The mean (SD) age of the target discharge patients was 79.7 (8.8) years. The adjusted readmission rate decreased from 21.5% to 19.5% in the target population and from 21.1% to 21.0% in the control population, a relative reduction of 9.3%. The number needed to treat to avoid 1 readmission was 50.
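The reported effect sizes are internally consistent, which a quick check using only the rates quoted above confirms (a sketch, not the authors' adjusted analysis):

```python
# Adjusted readmission rates for the target population, from the abstract.
baseline_rate = 0.215   # before the intervention
followup_rate = 0.195   # during the intervention

absolute_risk_reduction = baseline_rate - followup_rate
relative_reduction = absolute_risk_reduction / baseline_rate
number_needed_to_treat = 1 / absolute_risk_reduction

print(round(relative_reduction, 3))   # 0.093, i.e. the 9.3% relative reduction
print(round(number_needed_to_treat))  # 50, matching the reported NNT
```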
In a difference-in-differences analysis using a logistic regression model, the odds of readmission in the target population decreased significantly more than that of the control population in the intervention period (odds ratio, 0.90; 95% CI, 0.83-0.99; P = .03). In a comparative interrupted time series analysis of monthly adjusted admission rates, the rate in the target population changed by -3.09 (95% CI, -6.47 to 0.29; P = .07) relative to the control population, a similar but nonsignificant effect. This large-scale readmission reduction program reduced readmissions by 9.3% among the full population targeted by the CMS despite being delivered only to high-risk patients. However, it did not achieve the goal reduction set by the CMS.
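The difference-in-differences estimate can be approximated directly from the four quoted rates. This is a crude sketch on the odds scale, not the authors' adjusted patient-level logistic model, so it recovers a value close to, but not identical to, the reported odds ratio of 0.90.

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

def did_odds_ratio(target_pre, target_post, control_pre, control_post):
    """Difference-in-differences on the odds scale: the target group's
    pre-to-post odds ratio divided by the control group's."""
    target_or = odds(target_post) / odds(target_pre)
    control_or = odds(control_post) / odds(control_pre)
    return target_or / control_or

# Rates from the abstract: target 21.5% -> 19.5%, control 21.1% -> 21.0%.
print(round(did_odds_ratio(0.215, 0.195, 0.211, 0.210), 2))
# 0.89 from the crude rates; the adjusted model reported 0.90.
```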
Ianuzzi, Allyson; Pickar, Joel G; Khalsa, Partap S
2009-01-01
Quadruped animal models have been validated and used as biomechanical models for the lumbar spine. The biomechanics of the cat lumbar spine has not been well characterized, even though it is a common model used in neuromechanical studies. The aim was to compare the physiological ranges of motion and determine torque-limits for cat and human lumbar spine specimens during physiological motions. Biomechanics study. Cat and human lumbar spine specimens. Intervertebral angle (IVA), joint moment, yield point, torque-limit, and correlation coefficients. Cat (L2-sacrum) and human (T12-sacrum) lumbar spine specimens were mechanically tested to failure during displacement-controlled extension (E), lateral bending (LB), and axial rotation (AR). Single trials consisted of 10 cycles (10mm/s or 5 degrees /s) to a target displacement where the magnitude of the target displacement was increased for subsequent trials until failure occurred. Whole-lumbar stiffness, torque at yield point, and joint stiffness were determined. Scaling relationships were established using equations analogous to those that describe the load response of elliptically shaped beams. IVA magnitudes for cat and human lumbar spines were similar during physiological motions. Human whole-lumbar and joint stiffness magnitudes were significantly greater than those for cat spine specimens (p<.05). Torque-limits were also greater for humans compared with cats. Scaling relationships with high correlation (R(2) greater than 0.77) were established during LB and AR. The current study defined "physiological ranges of movement" for human and cat lumbar spine specimens during displacement-controlled testing; these ranges should be observed in future biomechanical studies conducted under displacement control.
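The abstract does not reproduce the scaling equations. For orientation only, the standard bending response of a beam with elliptical cross-section (semi-axes $a$ and $b$, elastic modulus $E$, curvature $\kappa$), on which such scaling arguments are typically based, is:

```latex
% Second moment of area of an elliptical section and moment-curvature law:
I = \frac{\pi a b^{3}}{4}, \qquad M = E I \,\kappa
% so, for geometrically similar loading, stiffness scales with section geometry:
\frac{k_{\mathrm{human}}}{k_{\mathrm{cat}}} \propto
\frac{a_{\mathrm{human}}\, b_{\mathrm{human}}^{3}}{a_{\mathrm{cat}}\, b_{\mathrm{cat}}^{3}}
```

The specific functional forms fitted in the study may differ; this is the textbook relation the phrase "load response of elliptically shaped beams" refers to.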
Large-scale model quality assessment for improving protein tertiary structure prediction.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2015-06-15
Sampling structural models and ranking them are the two major challenges of protein structure prediction. Traditional protein structure prediction methods generally use one or a few quality assessment (QA) methods to select the best-predicted models, which cannot consistently select relatively better models and rank a large number of models well. Here, we develop a novel large-scale model QA method in conjunction with model clustering to rank and select protein structural models. It applies 14 model QA methods to generate consensus model rankings, followed by model refinement based on model combination (i.e. averaging). Our experiment demonstrates that the large-scale model QA approach is more consistent and robust in selecting models of better quality than any individual QA method. Our method was blindly tested during the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM group. It was officially ranked third out of all 143 human and server predictors according to the total scores of the first models predicted for 78 CASP11 protein domains and second according to the total scores of the best of the five models predicted for these domains. MULTICOM's outstanding performance in the extremely competitive 2014 CASP11 experiment proves that our large-scale QA approach together with model clustering is a promising solution to one of the two major problems in protein structure modeling. The web server is available at: http://sysbio.rnet.missouri.edu/multicom_cluster/human/. © The Author 2015. Published by Oxford University Press.
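The core consensus step can be sketched as follows: average each model's score across the QA methods and rank by the mean. This is a minimal illustration assuming scores are already on a comparable scale; the method names and scores are hypothetical, and the actual MULTICOM pipeline additionally performs model clustering, combination, and refinement.

```python
def consensus_rank(scores_by_method):
    """Rank candidate models by their mean score across QA methods.

    scores_by_method: dict mapping QA-method name -> {model: score},
    with all methods assumed to score on a comparable 0-1 scale
    (min-max normalize each method first if they do not).
    """
    models = next(iter(scores_by_method.values())).keys()
    consensus = {
        m: sum(method[m] for method in scores_by_method.values())
           / len(scores_by_method)
        for m in models
    }
    return sorted(consensus, key=consensus.get, reverse=True)

# Hypothetical scores from three QA methods for three candidate models.
qa = {
    "qa1": {"model_a": 0.70, "model_b": 0.55, "model_c": 0.60},
    "qa2": {"model_a": 0.65, "model_b": 0.80, "model_c": 0.50},
    "qa3": {"model_a": 0.75, "model_b": 0.60, "model_c": 0.55},
}
print(consensus_rank(qa))  # model_a first: highest mean score
```

Averaging damps the idiosyncratic failures of any single QA method, which is the robustness the abstract reports.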
NASA Technical Reports Server (NTRS)
Fijany, Amir; Collier, James B.; Citak, Ari
1997-01-01
A team comprising the US Army Corps of Engineers (Omaha District and Engineering and Support Center, Huntsville), the Jet Propulsion Laboratory (JPL), Stanford Research Institute (SRI), and Montgomery Watson is currently in the process of planning and conducting the largest-ever survey at the Former Buckley Field (60,000 acres) in Colorado, using SRI airborne, ground-penetrating Synthetic Aperture Radar (SAR). The purpose of this survey is the detection of surface and subsurface Unexploded Ordnance (UXO) and, in a broader sense, site characterization for identification of contaminated as well as clear areas. In preparation for such a large-scale survey, JPL has been developing advanced algorithms and a high-performance testbed for processing of the massive amounts of SAR data expected from this site. Two key requirements of this project are the accuracy (in terms of UXO detection) and speed of SAR data processing. The first key feature of this testbed is a large degree of automation and minimal reliance on human perception in the processing, to achieve an acceptable processing rate of several hundred acres per day. For accurate UXO detection, novel algorithms have been developed and implemented. These algorithms analyze dual-polarized (HH and VV) SAR data. They are based on the correlation of HH and VV SAR data and involve a rather large set of parameters for accurate detection of UXO. For each specific site, this set of parameters can be optimized by using ground truth data (i.e., known surface and subsurface UXOs). In this paper, we discuss these algorithms and their successful application for detection of surface and subsurface anti-tank mines using a data set from Yuma Proving Ground, AZ, acquired by SRI SAR.
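The correlation idea at the heart of these detection algorithms can be sketched as a local Pearson correlation between co-registered HH and VV magnitude images: man-made metallic targets tend to produce locally correlated returns in both polarizations. The window size and pure-Python layout below are illustrative assumptions; the actual JPL algorithms involve a much larger, site-tuned parameter set.

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length value lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0
    return cov / (vx * vy) ** 0.5

def correlation_map(hh, vv, win=3):
    """Slide a win x win window over co-registered HH and VV magnitude
    images (lists of lists) and record the local HH/VV correlation at
    each interior window centre."""
    rows, cols = len(hh), len(hh[0])
    half = win // 2
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            xs = [hh[i][j] for i in range(r - half, r + half + 1)
                           for j in range(c - half, c + half + 1)]
            ys = [vv[i][j] for i in range(r - half, r + half + 1)
                           for j in range(c - half, c + half + 1)]
            out[r][c] = pearson(xs, ys)
    return out
```

High values in the map would then be thresholded (with site-optimized parameters) to nominate candidate UXO locations.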
Policy approaches to renewable energy investment in the Mediterranean region
NASA Astrophysics Data System (ADS)
Patt, A.; Komendantova, N.; Battaglini, A.; Lilliestam, J.; Williges, K.
2009-04-01
Europe's climate policy objective of 20% renewable energy by 2020, and the call by the IPCC to reduce greenhouse gas emissions by 80% by 2050, pose major challenges for the European Union. Several policy options are available to move towards these objectives. In this paper, we will address the most critical policy and governance issues associated with one particular approach to scaling up renewable energy resources: reliance on large-scale energy generation facilities outside the European continent, such as onshore and offshore wind farms and concentrating solar power (CSP) facilities in the Mediterranean region. Several feasibility studies completed over the past three years (German Aerospace Center 2006; German Aerospace Center 2005; Czisch, Elektrotechnik 2005, p. 488; Lorenz, Pinner, Seitz, McKinsey Quarterly 2008, p.10; German Aerospace Center 2005; Knies 2008, The Club of Rome; Khosla, Breaking the Climate Deadlock Briefing Papers, 2008, p.19) have convincingly demonstrated that large-scale wind and CSP projects ought to be very attractive for a number of reasons, including cost, reliability of power supply, and technological maturity. According to these studies it would be technically possible for Europe to rely on large-scale wind and CSP for the majority of its power needs by 2050—indeed enough to completely replace its reliance on fossil fuels for power generation—at a cost competitive with its current, carbon-intensive system. While it has been shown to be technically feasible to develop renewable resources in North Africa to account for a large share of Europe's energy needs, doing so would require sustained double-digit rates of growth in generating and long-distance transmission capacity, and would potentially require a very different high-voltage grid architecture within Europe. Doing so at a large scale could require enormous up-front investments in technical capacity, financial instruments and human resources.
What are the policy instruments best suited to achieving such growth quickly and smoothly? What bottlenecks—in terms of supply chains, human capital, finance, and transmission capacity—need to be anticipated and addressed if the rate of capacity growth is to be sustained over several decades? What model of governance would create a safe investment climate consistent with new EU legislation (i.e. the EU Renewable Energy Directive) as well as expected post-Kyoto targets and mechanisms? The material that we present here is based on a series of workshops held between November 2008 and January 2009, in which a wide range of stakeholders expressed their views about the fundamental needs for policy intervention. The results from these workshops have been supplemented by additional expert interviews and basic financial modeling. One of the interesting results from this research is the need for a multi-pronged approach. First, there is a need for a support scheme, compatible with and in all cases supplementing the EU REN Directive, that would create a stable market for North African electricity in Europe. Second, there is a need for policies that facilitate the formation of public-private partnerships in North Africa as the specific investment vehicle, as a way to manage some of the uncertainties associated with large-scale investments in the region. Third, attention has to be paid to the development of supply chains within the Mediterranean region, as a way of ensuring the compatibility of such investments with sustainable development.
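The "basic financial modeling" mentioned above typically reduces to a levelised-cost-of-electricity (LCOE) comparison between a candidate plant and the incumbent system. A minimal sketch follows; all plant parameters are hypothetical round numbers, not figures from the workshops or the cited studies.

```python
def lcoe(capex, annual_opex, annual_energy_mwh, discount_rate, lifetime_years):
    """Levelised cost of electricity: discounted lifetime cost divided
    by discounted lifetime energy production (per MWh)."""
    discounted_costs = capex + sum(
        annual_opex / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    discounted_energy = sum(
        annual_energy_mwh / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    return discounted_costs / discounted_energy

# Hypothetical CSP plant: 100M EUR capex, 2M EUR/yr O&M,
# 180,000 MWh/yr output, 7% discount rate, 25-year lifetime.
print(round(lcoe(100e6, 2e6, 180_000, 0.07, 25), 1))  # levelised cost in EUR/MWh
```

The sensitivity of this number to the discount rate is exactly why the stakeholders emphasised a stable support scheme: a safer investment climate lowers the cost of capital and hence the LCOE.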
Target Discovery for Precision Medicine Using High-Throughput Genome Engineering.
Guo, Xinyi; Chitale, Poonam; Sanjana, Neville E
2017-01-01
Over the past few years, programmable RNA-guided nucleases such as the CRISPR/Cas9 system have ushered in a new era of precision genome editing in diverse model systems and in human cells. Functional screens using large libraries of RNA guides can interrogate a large hypothesis space to pinpoint particular genes and genetic elements involved in fundamental biological processes and disease-relevant phenotypes. Here, we review recent high-throughput CRISPR screens (e.g. loss-of-function, gain-of-function, and targeting noncoding elements) and highlight their potential for uncovering novel therapeutic targets, such as those involved in cancer resistance to small-molecule drugs and immunotherapies, tumor evolution, infectious disease, inborn genetic disorders, and other therapeutic challenges.
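A common analysis step in the pooled screens reviewed here is scoring each guide by the fold-change of its read counts between conditions and collapsing guides to a gene-level statistic. The sketch below uses a median collapse and hypothetical counts; real pipelines (e.g. MAGeCK-style tools) add normalisation and significance testing.

```python
import math

def guide_log2_fold_change(count_treated, count_control, pseudocount=1):
    """Per-guide log2 fold-change of read counts between conditions,
    with a pseudocount to stabilise low counts."""
    return math.log2((count_treated + pseudocount) / (count_control + pseudocount))

def gene_score(guide_lfcs):
    """Collapse guide-level log2 fold-changes to a gene score via the median,
    which is robust to a single off-target or inefficient guide."""
    s = sorted(guide_lfcs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

# Hypothetical (treated, control) counts for three guides targeting one gene:
# two guides strongly enriched under selection, one behaving like background.
lfcs = [guide_log2_fold_change(t, c) for t, c in [(800, 100), (620, 95), (90, 110)]]
print(round(gene_score(lfcs), 2))  # positive score: gene enriched under selection
```

In a resistance screen, genes with high positive scores would be nominated as candidate resistance mechanisms and hence potential therapeutic targets.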
LARGE SCALE DISASTER ANALYSIS AND MANAGEMENT: SYSTEM LEVEL STUDY ON AN INTEGRATED MODEL
The increasing intensity and scale of human activity across the globe, leading to severe depletion and deterioration of the Earth's natural resources, has meant that sustainability has emerged as a new paradigm of analysis and management. Sustainability, conceptually defined by the...