Sample records for biologists and computer scientists

  1. System biology of gene regulation.

    PubMed

    Baitaluk, Michael

    2009-01-01

    A famous joke that illustrates the traditionally awkward alliance between theory and experiment, and the differences between experimental biologists and theoretical modelers, is the one in which a university sends a biologist, a mathematician, a physicist, and a computer scientist on a hiking trip in an attempt to stimulate interdisciplinary research. During a break, they watch a cow in a nearby field and the leader of the group asks, "I wonder how one could determine the size of that cow?" Since a cow is a biological object, the biologist responded first: "I have seen many cows in this area and know it is a big cow." The mathematician argued, "The true volume is determined by integrating the mathematical function that describes the outer surface of the cow's body." The physicist suggested: "Let's assume the cow is a sphere...." Finally, the computer scientist became nervous and said that he had not brought his computer because there is no Internet connection up there on the hill. In this humorous but telling story, the suggestions proposed by the theorists reflect the view of many experimental biologists that computer scientists and theorists are too far removed from biological reality, and that their theories and approaches are therefore of little immediate usefulness. Conversely, the statement of the biologist mirrors the view of many traditional theoretical and computational scientists that biological experiments are for the most part simply descriptive, lack rigor, and that much of the resulting biological data are of questionable functional relevance. One of the goals of current biology as a multidisciplinary science is to bring people from different scientific areas together on the same "hill" and teach them to speak the same "language."
In fact, when presenting their data, most experimental biologists do provide an interpretation and explanation of their results, and many theorists and computer scientists aim to answer (or at least fully describe) questions of biological relevance. Systems biology can thus be treated as such a socioscientific phenomenon: a new approach to both experiment and theory, defined by the strategy of integrating complex data about the interactions in biological systems from diverse experimental sources, using interdisciplinary tools and personnel.

  2. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.

  3. Decision tree and ensemble learning algorithms with their applications in bioinformatics.

    PubMed

    Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping

    2011-01-01

    Machine learning approaches have wide applications in bioinformatics, and decision trees are among the most successful approaches applied in this field. In this chapter, we briefly review decision trees and related ensemble algorithms and show successful applications of such approaches to solving biological problems. We hope that by learning the algorithms behind decision trees and ensemble classifiers, biologists can grasp the basic ideas of how machine learning algorithms work. Conversely, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get a better idea of which bioinformatics topics to pursue in their future research. We aim to provide a platform to bridge the gap between biologists and computer scientists.
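
    As an illustration of the chapter's two ingredients (decision trees and ensembles), the following toy sketch trains a bagged ensemble of depth-1 trees ("stumps") on invented one-dimensional data; all names and numbers are illustrative, not from the chapter:

```python
import random

def train_stump(points):
    """Fit a depth-1 decision tree (stump) on 1-D points [(x, label), ...]:
    pick the threshold and direction that minimize training error."""
    xs = sorted(x for x, _ in points)
    thresholds = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    if not thresholds:  # degenerate resample: all points identical
        label = points[0][1]
        return lambda x: label
    best = None
    for t in thresholds:
        for sign in (1, -1):
            # sign=1 predicts label 1 for x > t; sign=-1 the opposite
            errors = sum((1 if sign * (x - t) > 0 else 0) != y for x, y in points)
            if best is None or errors < best[0]:
                best = (errors, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) > 0 else 0

def bagged_ensemble(points, n_trees=25, seed=0):
    """Bagging: train each stump on a bootstrap resample of the data,
    then predict by majority vote over all stumps."""
    rng = random.Random(seed)
    stumps = [train_stump([rng.choice(points) for _ in points])
              for _ in range(n_trees)]
    return lambda x: int(sum(s(x) for s in stumps) > n_trees / 2)

# Toy data: expression level above ~5 means "gene active" (label 1)
data = [(1, 0), (2, 0), (3, 0), (4, 0), (6, 1), (7, 1), (8, 1), (9, 1)]
model = bagged_ensemble(data)
```

    The vote of many high-variance stumps, each fit on a slightly different resample, is the basic idea behind the ensemble methods (bagging, random forests) the chapter surveys.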

  4. ModeLang: a new approach for experts-friendly viral infections modeling.

    PubMed

    Wasik, Szymon; Prejzendanc, Tomasz; Blazewicz, Jacek

    2013-01-01

    Computational modeling is an important element of systems biology. One of its important applications is the modeling of complex, dynamic biological systems, including viral infections. This type of modeling usually requires close cooperation between biologists and mathematicians. However, such cooperation often faces communication problems, because biologists do not have sufficient knowledge to understand the mathematical description of the models, and mathematicians do not have sufficient knowledge to define and verify these models. In many areas of systems biology this problem has already been solved; in others, certain problematic aspects remain. The goal of the presented research was to facilitate this cooperation by designing a seminatural formal language for describing viral infection models that is easy for biologists to understand and easy for mathematicians and computer scientists to use. The ModeLang language was designed in cooperation with biologists, and a computer implementation was prepared. Tests proved that it can be successfully used to describe commonly used viral infection models and then to simulate and verify them. As a result, it can make cooperation between biologists and mathematicians modeling viral infections much easier, speeding up computational verification of formulated hypotheses.

  5. ModeLang: A New Approach for Experts-Friendly Viral Infections Modeling

    PubMed Central

    Blazewicz, Jacek

    2013-01-01

    Computational modeling is an important element of systems biology. One of its important applications is the modeling of complex, dynamic biological systems, including viral infections. This type of modeling usually requires close cooperation between biologists and mathematicians. However, such cooperation often faces communication problems, because biologists do not have sufficient knowledge to understand the mathematical description of the models, and mathematicians do not have sufficient knowledge to define and verify these models. In many areas of systems biology this problem has already been solved; in others, certain problematic aspects remain. The goal of the presented research was to facilitate this cooperation by designing a seminatural formal language for describing viral infection models that is easy for biologists to understand and easy for mathematicians and computer scientists to use. The ModeLang language was designed in cooperation with biologists, and a computer implementation was prepared. Tests proved that it can be successfully used to describe commonly used viral infection models and then to simulate and verify them. As a result, it can make cooperation between biologists and mathematicians modeling viral infections much easier, speeding up computational verification of formulated hypotheses. PMID:24454531

  6. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    PubMed

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  7. Using Physical and Computer Simulations of Collective Behaviour as an Introduction to Modelling Concepts for Applied Biologists

    ERIC Educational Resources Information Center

    Rands, Sean A.

    2012-01-01

    Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…

  8. Perspectives on an education in computational biology and medicine.

    PubMed

    Rubinstein, Jill C

    2012-09-01

    The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.

  9. Launching "the evolution of cooperation".

    PubMed

    Axelrod, Robert

    2012-04-21

    This article describes three aspects of the author's early work on the evolution of cooperation. First, it explains how the idea for a computer tournament for the iterated Prisoner's Dilemma was inspired by artificial intelligence research on computer checkers and computer chess. Second, it shows how the vulnerability of simple reciprocity to misunderstanding or misimplementation can be eliminated by adding some degree of generosity or contrition. Third, it recounts the unusual collaboration between the author, a political scientist, and William D. Hamilton, an evolutionary biologist. Copyright © 2011 Elsevier Ltd. All rights reserved.
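
    The "generosity" repair described above can be sketched in a few lines of Python. This is a toy simulation, not Axelrod's tournament code: payoffs are ignored (only mutual cooperation is counted), and the 30% forgiveness rate is an arbitrary illustrative choice.

```python
import random

def match(strat_a, strat_b, rounds=200, noise=0.05, seed=0):
    """Play a noisy iterated Prisoner's Dilemma; return the fraction of
    rounds with mutual cooperation. With probability `noise`, an intended
    cooperation is misimplemented as a defection."""
    rng = random.Random(seed)
    hist_a, hist_b = [], []
    mutual = 0
    for _ in range(rounds):
        a = strat_a(hist_b, rng)
        b = strat_b(hist_a, rng)
        if a == 'C' and rng.random() < noise:
            a = 'D'  # misimplementation: meant to cooperate, defected
        if b == 'C' and rng.random() < noise:
            b = 'D'
        hist_a.append(a)
        hist_b.append(b)
        mutual += (a == 'C' and b == 'C')
    return mutual / rounds

def tit_for_tat(opponent_history, rng):
    # Simple reciprocity: repeat the opponent's last move
    return opponent_history[-1] if opponent_history else 'C'

def generous_tft(opponent_history, rng, generosity=0.3):
    # Like tit-for-tat, but forgives a defection 30% of the time
    if opponent_history and opponent_history[-1] == 'D' and rng.random() >= generosity:
        return 'D'
    return 'C'

def avg(strat_a, strat_b, seeds=range(20)):
    return sum(match(strat_a, strat_b, seed=s) for s in seeds) / 20
```

    Under noise, two plain tit-for-tat players fall into echoing retaliation after a single misimplemented move, while the generous variant breaks the echo and restores mutual cooperation.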

  10. Cognitive styles of Forest Service scientists and managers in the Pacific Northwest.

    Treesearch

    Andrew B. Carey

    1997-01-01

    Preferences of executives, foresters, and biologists of the Pacific Northwest Research Station and executives, District Rangers, foresters, engineers, and biologists of the Pacific Northwest Region, National Forest System (USDA Forest Service), were compared for various thinking styles. Herrmann brain dominance profiles from 230 scientists and managers were drawn from...

  11. Social insects inspire human design

    PubMed Central

    Holbrook, C. Tate; Clark, Rebecca M.; Moore, Dani; Overson, Rick P.; Penick, Clint A.; Smith, Adrian A.

    2010-01-01

    The international conference ‘Social Biomimicry: Insect Societies and Human Design’, hosted by Arizona State University, USA, 18–20 February 2010, explored how the collective behaviour and nest architecture of social insects can inspire innovative and effective solutions to human design challenges. It brought together biologists, designers, engineers, computer scientists, architects and businesspeople, with the dual aims of enriching biology and advancing biomimetic design. PMID:20392721

  12. Computational evolution: taking liberties.

    PubMed

    Correia, Luís

    2010-09-01

    Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.

  13. Selfish altruism, fierce cooperation and the predator.

    PubMed

    Askitas, Nikolaos

    2018-12-01

    This paper suggests a new way to think about a famous question: what explains cooperation in nature and in particular in humans? I argue that, for an evolutionary biologist as well as a quantitative social scientist, the triangle of two 'teammates' in the presence of a predator (passing and shooting in two-on-one situations) is one of the fundamental conceptual building-blocks for understanding these phenomena because in such a situation the fact that life is packaged in many distinct enclosures (and not in one big monolithic blob) can unfold its comparative advantage. I show how, in the presence of a predator, cooperative equilibria emerge among entirely selfish teammates if we infinitesimally bias the lead player in the selfish direction or assign a computational burden on the predator due to the presence of a teammate. I argue that 'predators' are common in the biological jungle but also in everyday human settings. Intuitively, this paper builds on the simple idea - a familiar one to a biologist observing the natural world but perhaps less so to social scientists - that everybody has enemies.

  14. Watering Down Barriers to Using Hydropower through Fisheries Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ham, Ken

    Much of our work on clean energy is targeted at improving performance of hydropower, the largest source of renewable energy in the Pacific Northwest and the nation. PNNL experts in hydropower—from computer scientists to biologists and engineers—are helping to optimize the efficiency and environmental performance of hydroelectric plants. The Columbia River is the nation’s most important hydropower resource, producing 40 percent of the nation’s hydroelectric generation and up to 70 percent of the region’s power. At PNNL, Fisheries Biologist Ken Ham and others are working with stakeholders in the Pacific Northwest, the Army Corps of Engineers and DOE to ensure that this resource continues to provide its many benefits while setting a new standard for environmental sustainability. As aging turbines are replaced in existing hydropower dams, computational modeling and state-of-the-art fisheries research combine to aid the design of a next-generation hydro turbine that meets or exceeds current biological performance standards and produces more power.

  15. Information technology challenges of biodiversity and ecosystems informatics

    USGS Publications Warehouse

    Schnase, J.L.; Cushing, J.; Frame, M.; Frondorf, A.; Landis, E.; Maier, D.; Silberschatz, A.

    2003-01-01

    Computer scientists, biologists, and natural resource managers recently met to examine the prospects for advancing computer science and information technology research by focusing on the complex and often-unique challenges found in the biodiversity and ecosystem domain. The workshop and its final report reveal that the biodiversity and ecosystem sciences are fundamentally information sciences and often address problems having distinctive attributes of scale and socio-technical complexity. The paper provides an overview of the emerging field of biodiversity and ecosystem informatics and demonstrates how the demands of biodiversity and ecosystem research can advance our understanding and use of information technologies.

  16. Naturally selecting solutions: the use of genetic algorithms in bioinformatics.

    PubMed

    Manning, Timmy; Sleator, Roy D; Walsh, Paul

    2013-01-01

    For decades, computer scientists have looked to nature for biologically inspired solutions to computational problems, ranging from robotic control to scheduling optimization. Paradoxically, as we move deeper into the post-genomics era, the reverse is occurring, as biologists and bioinformaticians look to computational techniques to solve a variety of biological problems. One of the most common biologically inspired techniques is the genetic algorithm (GA), which takes the Darwinian concept of natural selection as the driving force behind systems for solving real-world problems, including those in the bioinformatics domain. Herein, we provide an overview of genetic algorithms and survey some of the most recent applications of this approach to bioinformatics-based problems.
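
    The GA loop this survey describes (selection, crossover, mutation under a fitness function) can be sketched with a toy example that evolves a random DNA string toward a target motif. The motif, parameters, and fitness function here are invented for illustration and are not from the survey:

```python
import random

rng = random.Random(42)
TARGET = "ACGTGGTCTTAA"   # hypothetical motif to evolve toward
ALPHABET = "ACGT"

def fitness(seq):
    """Number of positions matching the target motif."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.05):
    """Flip each base to a random one with small probability."""
    return "".join(rng.choice(ALPHABET) if rng.random() < rate else c
                   for c in seq)

def crossover(a, b):
    """Single-point crossover of two parent sequences."""
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=100, generations=1000):
    pop = ["".join(rng.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=fitness)
        if fitness(best) == len(TARGET):
            return best
        # Tournament selection: pick the fittest of 3 random individuals
        def select():
            return max(rng.sample(pop, 3), key=fitness)
        pop = [mutate(crossover(select(), select()))
               for _ in range(pop_size)]
    return max(pop, key=fitness)
```

    Real bioinformatics GAs differ mainly in the representation and the fitness function (e.g. alignment scores or structure energies), not in this basic loop.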

  17. 2016 ISCB Overton Prize awarded to Debora Marks

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.

    2016-01-01

    The International Society for Computational Biology (ISCB) recognizes the achievements of an early- to mid-career scientist with the Overton Prize each year. The Overton Prize was established to honor the untimely loss of Dr. G. Christian Overton, a respected computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators in the early to middle phases of their careers who are selected for their significant contributions to computational biology through research, teaching, and service. 2016 will mark the fifteenth bestowment of the ISCB Overton Prize. ISCB is pleased to confer this award to Debora Marks, Assistant Professor of Systems Biology and director of the Raymond and Beverly Sackler Laboratory for Computational Biology at Harvard Medical School. PMID:27429747

  18. 2016 ISCB Overton Prize awarded to Debora Marks.

    PubMed

    Fogg, Christiana N; Kovats, Diane E

    2016-01-01

    The International Society for Computational Biology (ISCB) recognizes the achievements of an early- to mid-career scientist with the Overton Prize each year. The Overton Prize was established to honor the untimely loss of Dr. G. Christian Overton, a respected computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators in the early to middle phases of their careers who are selected for their significant contributions to computational biology through research, teaching, and service. 2016 will mark the fifteenth bestowment of the ISCB Overton Prize. ISCB is pleased to confer this award to Debora Marks, Assistant Professor of Systems Biology and director of the Raymond and Beverly Sackler Laboratory for Computational Biology at Harvard Medical School.

  19. From Students to Scientists

    ERIC Educational Resources Information Center

    Ho-Shing, Olivia

    2017-01-01

    In his book "Letters to a Young Scientist," renowned biologist Edward O. Wilson recounted his own coming-of-age story as a scientist, and distilled the motivating qualities of science down to curiosity and creativity. Individuals become scientists when they are curious about a phenomenon in the world around them and ask about the real…

  20. Biology, politics, and the emerging science of human nature.

    PubMed

    Fowler, James H; Schreiber, Darren

    2008-11-07

    In the past 50 years, biologists have learned a tremendous amount about human brain function and its genetic basis. At the same time, political scientists have been intensively studying the effect of the social and institutional environment on mass political attitudes and behaviors. However, these separate fields of inquiry are subject to inherent limitations that may only be resolved through collaboration across disciplines. We describe recent advances and argue that biologists and political scientists must work together to advance a new science of human nature.

  1. Automating CapCom: Pragmatic Operations and Technology Research for Human Exploration of Mars

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, NASA and the scientific community used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. More recently, computer scientists and human factors specialists have followed geologists and biologists into the field, learning how science is actually done on expeditions in extreme environments. Research stations have been constructed by the Mars Society in the Arctic and the American southwest, providing facilities for hundreds of researchers to investigate how small crews might live and work on Mars. Combining these interests (science, operations, and technology) in Mars analog field expeditions provides tremendous synergy and authenticity to speculations about Mars missions. By relating historical analyses of Apollo and field science, engineers are creating experimental prototypes that provide significant new capabilities, such as a computer system that automates some of the functions of Apollo's CapCom. Thus, analog studies have created a community of practice, a new collaboration between scientists and engineers, so that technology begins with real human needs and works incrementally towards the challenges of the human exploration of Mars.

  2. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology.

    PubMed

    Bañares, Miguel A; Haase, Andrea; Tran, Lang; Lobaskin, Vladimir; Oberdörster, Günter; Rallo, Robert; Leszczynski, Jerzy; Hoet, Peter; Korenstein, Rafi; Hardy, Barry; Puzyn, Tomasz

    2017-09-01

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain, with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high-quality experimental data and relevant assays, which will be the basis for identifying relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by close collaboration between computational scientists (e.g. database experts and modeling experts for the structure, (eco)toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcomes and new perspectives of this conference are summarized here.

  3. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bañares, Miguel A.; Haase, Andrea; Tran, Lang

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain, with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high-quality experimental data and relevant assays, which will be the basis for identifying relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by close collaboration between computational scientists (e.g. database experts and modeling experts for the structure, (eco)toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcomes and new perspectives of this conference are summarized here.

  4. Data Science Priorities for a University Hospital-Based Institute of Infectious Diseases: A Viewpoint.

    PubMed

    Valleron, Alain-Jacques

    2017-08-15

    Automation of laboratory tests, bioinformatic analysis of biological sequences, and professional data management have been used routinely in a modern university hospital-based infectious diseases institute since at least the 1980s. However, the scientific methods of the 21st century are changing with the increasing power and speed of computers, with the "big data" revolution having already happened in genomics and environmental science and now arriving in medical informatics. Research will be increasingly "data driven," and the powerful machine learning methods whose efficiency is demonstrated in daily life will also revolutionize medical research. A university-based institute of infectious diseases must therefore not only gather excellent computer scientists and statisticians (as in the past, and as in any medical discipline), but also fully integrate biologists and clinicians with computer scientists, statisticians, and mathematical modelers who have a broad culture in machine learning, knowledge representation, and knowledge discovery. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  5. Opportunities in plant synthetic biology.

    PubMed

    Cook, Charis; Martin, Lisa; Bastow, Ruth

    2014-05-01

    Synthetic biology is an emerging field uniting scientists from all disciplines with the aim of designing or re-designing biological processes. Initially, synthetic biology breakthroughs came from microbiology, chemistry, physics, computer science, materials science, mathematics, and engineering disciplines. A transition to multicellular systems is the next logical step for synthetic biologists and plants will provide an ideal platform for this new phase of research. This meeting report highlights some of the exciting plant synthetic biology projects, and tools and resources, presented and discussed at the 2013 GARNet workshop on plant synthetic biology.

  6. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.

  7. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  8. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge, and in the design of computers themselves, as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960, the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized design posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond: the personal computer.

  9. Large-scale gene function analysis with the PANTHER classification system.

    PubMed

    Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D

    2013-08-01

    The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
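The statistical tests mentioned above typically include an over-representation test: given a user's gene list, is a functional category annotated to more of those genes than chance alone would predict? As an illustration only (a generic hypergeometric tail test, not PANTHER's actual implementation, with hypothetical gene counts), the idea can be sketched in Python:

```python
from math import comb

def overrep_pvalue(N, K, n, k):
    """Hypergeometric tail P(X >= k): N genes in the reference set,
    K of them annotated to the category, n genes in the uploaded list,
    k of those carrying the annotation."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical counts: 20,000 reference genes, 150 annotated to a category,
# and a 400-gene list containing 12 annotated genes (about 3 expected by chance).
p = overrep_pvalue(20000, 150, 400, 12)
```

A small tail probability flags the category as over-represented; real tools additionally correct for testing many categories at once (for example, false-discovery-rate control).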

  10. Next generation of network medicine: interdisciplinary signaling approaches.

    PubMed

    Korcsmaros, Tamas; Schneider, Maria Victoria; Superti-Furga, Giulio

    2017-02-20

    In the last decade, network approaches have transformed our understanding of biological systems. Network analyses and visualizations have allowed us to identify essential molecules and modules in biological systems, and improved our understanding of how changes in cellular processes can lead to complex diseases, such as cancer, infectious and neurodegenerative diseases. "Network medicine" involves unbiased large-scale network-based analyses of diverse data describing interactions between genes, diseases, phenotypes, drug targets, drug transport, drug side-effects, disease trajectories and more. In terms of drug discovery, network medicine exploits our understanding of the network connectivity and signaling system dynamics to help identify optimal, often novel, drug targets. Contrary to initial expectations, however, network approaches have not yet delivered a revolution in molecular medicine. In this review, we propose that a key reason for the limited impact, so far, of network medicine is a lack of quantitative multi-disciplinary studies involving scientists from different backgrounds. To support this argument, we present existing approaches from structural biology, 'omics' technologies (e.g., genomics, proteomics, lipidomics) and computational modeling that point towards how multi-disciplinary efforts allow for important new insights. We also highlight some breakthrough studies as examples of the potential of these approaches, and suggest ways to make greater use of the power of interdisciplinarity. This review reflects discussions held at an interdisciplinary signaling workshop which facilitated knowledge exchange from experts from several different fields, including in silico modelers, computational biologists, biochemists, geneticists, molecular and cell biologists as well as cancer biologists and pharmacologists.

  11. How Academic Biologists and Physicists View Science Outreach

    PubMed Central

    Ecklund, Elaine Howard; James, Sarah A.; Lincoln, Anne E.

    2012-01-01

    Scholars and pundits alike argue that U.S. scientists could do more to reach out to the general public. Yet, to date, there have been few systematic studies that examine how scientists understand the barriers that impede such outreach. Through analysis of 97 semi-structured interviews with academic biologists and physicists at top research universities in the United States, we classify the type and target audiences of scientists’ outreach activities. Finally, we explore the narratives academic scientists have about outreach and its reception in the academy, in particular what they perceive as impediments to these activities. We find that scientists’ outreach activities are stratified by gender and that university and disciplinary rewards as well as scientists’ perceptions of their own skills have an impact on science outreach. Research contributions and recommendations for university policy follow. PMID:22590526

  12. Radar aeroecology: exploring the movements of aerial fauna through radio-wave remote sensing

    PubMed Central

    Chilson, Phillip B.; Bridge, Eli; Frick, Winifred F.; Chapman, Jason W.; Kelly, Jeffrey F.

    2012-01-01

    An international and interdisciplinary Radar Aeroecology Workshop was held at the National Weather Center on 5–6 March 2012 on the University of Oklahoma campus in Norman, OK, USA. The workshop brought together biologists, meteorologists, radar engineers and computer scientists from 22 institutions and four countries. A central motivation behind the Radar Aeroecology Workshop was to foster better communication and cross-disciplinary collaboration among a diverse spectrum of researchers, and promote a better understanding of the ecology of animals that move within and use the Earth's lower atmosphere (aerosphere). PMID:22628093

  13. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  14. 2017 ISCB Overton Prize: Christoph Bock

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) each year recognizes the achievements of an early to mid-career scientist with the Overton Prize. This prize honors the untimely death of Dr. G. Christian Overton, an admired computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators who are in the early to middle phases of their careers and are selected because of their significant contributions to computational biology through research, teaching, and service. ISCB is pleased to recognize Dr. Christoph Bock, Principal Investigator at the CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences in Vienna, Austria, as the 2017 winner of the Overton Prize. Bock will be presenting a keynote presentation at the 2017 International Conference on Intelligent Systems for Molecular Biology/European Conference on Computational Biology (ISMB/ECCB) in Prague, Czech Republic being held during July 21-25, 2017. PMID:28713546

  16. Computing through Scientific Abstractions in SysBioPSE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.

    2004-10-13

Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.

  17. Corals as bioindicators of climate change

    USGS Publications Warehouse

    Shinn, Eugene A.

    2008-01-01

    Potential effects of climate change and ocean acidification have energized much discussion among coral scientists, especially biologists. Will corals go extinct, lose their skeletons, or migrate pole-ward to cooler waters? No one knows, but some simple experiments, recent observations, and recent studies may shed some light on these questions. Above all they show the need for collaboration among biologists and geologists.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Wooley; Herbert S. Lin

This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. At the same time, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly to individuals not trained in computer science the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.

  19. Prominent Women Biologists

    ERIC Educational Resources Information Center

    Hoh, Yin Kiong; Boo, Hong Kwen

    2003-01-01

    The perception that scientists are intelligent white men who are socially inept, absent-minded nerds seems to prevail among students at all levels, from elementary school to college. While media may, by chance or choice, promote this image, it is unfortunately a realistic one. This stereotypical image of scientists as white men has, in part,…

  20. Why Machine-Information Metaphors are Bad for Science and Science Education

    NASA Astrophysics Data System (ADS)

    Pigliucci, Massimo; Boudry, Maarten

    2011-05-01

    Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of "blueprints" for the construction of organisms. Likewise, cells are often characterized as "factories" and organisms themselves become analogous to machines. Accordingly, when the human genome project was initially announced, the promise was that we would soon know how a human being is made, just as we know how to make airplanes and buildings. Importantly, modern proponents of Intelligent Design, the latest version of creationism, have exploited biologists' use of the language of information and blueprints to make their spurious case, based on pseudoscientific concepts such as "irreducible complexity" and on flawed analogies between living cells and mechanical factories. However, the living organism = machine analogy was criticized already by David Hume in his Dialogues Concerning Natural Religion. In line with Hume's criticism, over the past several years a more nuanced and accurate understanding of what genes are and how they operate has emerged, ironically in part from the work of computational scientists who take biology, and in particular developmental biology, more seriously than some biologists seem to do. In this article we connect Hume's original criticism of the living organism = machine analogy with the modern ID movement, and illustrate how the use of misleading and outdated metaphors in science can play into the hands of pseudoscientists. Thus, we argue that dropping the blueprint and similar metaphors will improve both the science of biology and its understanding by the general public.

  1. The Mary Ingraham Bunting Institute of Radcliffe College.

    DTIC Science & Technology

    1992-08-31

scientists -- biologists Elena Budrene and Doris Stern, and geologist Constance Soja -- had also been fellows during the 1990-91 fellowship year. Beginning ... "Balanced Cross Sections"; Constance X. Soja, Geology, Harvard University, "Tectonic Controls on Reef Development During the Silurian"; Doris Naimark Stern ... other better-situated scholars like geologists Constance Soja and Barbara Sheffels, and molecular biologists Elena Budrene and Orna Resnekov, the ...

  2. Verification of otolith identity used by fisheries scientists for aging channel catfish

    USGS Publications Warehouse

    Long, James M.; Stewart, David R.

    2010-01-01

Previously published studies of the age estimation of channel catfish Ictalurus punctatus based on otoliths have reported using the sagittae, whereas it is likely they were actually using the lapilli. This confusion may have resulted because in catfishes (ostariophyseans) the lapilli are the largest of the three otoliths, whereas in nonostariophysean fish the sagittae are the largest. Based on (1) scanning electron microscope microphotographs of channel catfish otoliths, (2) X-ray computed tomography scans of a channel catfish head, (3) descriptions of techniques used to remove otoliths from channel catfish reported in the literature, and (4) a sample of channel catfish otoliths received from fisheries biologists from around the country, it is clear that lapilli are most often used for channel catfish aging studies, not sagittae, as has been previously reported. Fisheries scientists who obtain otoliths from channel catfish can use the information in this paper to correctly identify otoliths for aging.

  3. Bio-ontologies: current trends and future directions

    PubMed Central

    Bodenreider, Olivier; Stevens, Robert

    2006-01-01

    In recent years, as a knowledge-based discipline, bioinformatics has been made more computationally amenable. After its beginnings as a technology advocated by computer scientists to overcome problems of heterogeneity, ontology has been taken up by biologists themselves as a means to consistently annotate features from genotype to phenotype. In medical informatics, artifacts called ontologies have been used for a longer period of time to produce controlled lexicons for coding schemes. In this article, we review the current position in ontologies and how they have become institutionalized within biomedicine. As the field has matured, the much older philosophical aspects of ontology have come into play. With this and the institutionalization of ontology has come greater formality. We review this trend and what benefits it might bring to ontologies and their use within biomedicine. PMID:16899495

  4. Electron Microscope Center Opens at Berkeley.

    ERIC Educational Resources Information Center

    Robinson, Arthur L.

    1981-01-01

    A 1.5-MeV High Voltage Electron Microscope has been installed at the Lawrence Berkeley Laboratory which will help materials scientists and biologists study samples in more true-to-life situations. A 1-MeV Atomic Resolution Microscope will be installed at the same location in two years which will allow scientists to distinguish atoms. (DS)

  5. The Impact of a Citizen Science Program on Student Achievement and Motivation: A Social Cognitive Career Perspective

    ERIC Educational Resources Information Center

    Hiller, Suzanne E.

    2012-01-01

    Citizen science programs are joint efforts between hobbyists and professional scientists designed to collect data to support scientific research. Through these programs, biologists study species population trends while citizen scientists improve their content knowledge and science skills. The purpose of the present mixed method quasi-experimental…

  6. E-Book Usage among Chemists, Biochemists and Biologists: Findings of a Survey and Interviews

    ERIC Educational Resources Information Center

    Zhang, Yuening; Beckman, Roger

    2011-01-01

    An e-book usage survey was sent through departmental mailing lists to the graduate students, scientists and faculty members of the Chemistry Department and Biology Department of Indiana University, Bloomington (IUB). Several faculty members, scientists and graduates students from the Chemistry Department and Biology Department were also contacted…

  7. The Scientist in the Casa: The Child as Scientist in the Making

    ERIC Educational Resources Information Center

    Sackett, Ginni

    2016-01-01

    If a parent were to ask what science and technology are offered in a Montessori preschool, Ginni Sackett provides a comprehensive reply. By precisely defining the words science and technology with an expansion of those definitions from renowned biologist E. O. Wilson, alongside the "experiences we offer every day to the children in our…

  8. Big cat phylogenies, consensus trees, and computational thinking.

    PubMed

    Sul, Seung-Jin; Williams, Tiffani L

    2011-07-01

    Phylogenetics seeks to deduce the pattern of relatedness between organisms by using a phylogeny or evolutionary tree. For a given set of organisms or taxa, there may be many evolutionary trees depicting how these organisms evolved from a common ancestor. As a result, consensus trees are a popular approach for summarizing the shared evolutionary relationships in a group of trees. We examine these consensus techniques by studying how the pantherine lineage of cats (clouded leopard, jaguar, leopard, lion, snow leopard, and tiger) evolved, which is hotly debated. While there are many phylogenetic resources that describe consensus trees, there is very little information, written for biologists, regarding the underlying computational techniques for building them. The pantherine cats provide us with a small, relevant example to explore the computational techniques (such as sorting numbers, hashing functions, and traversing trees) for constructing consensus trees. Our hope is that life scientists enjoy peeking under the computational hood of consensus tree construction and share their positive experiences with others in their community.
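To give a flavor of the computational techniques involved, a majority-rule consensus can be sketched by counting how often each clade (group of taxa) appears across the input trees and keeping those present in more than half of them. The sketch below, in Python, uses hypothetical pantherine topologies (the true relationships are, as noted, hotly debated), with each tree reduced to its set of non-trivial clades:

```python
from collections import Counter

def majority_consensus(trees, threshold=0.5):
    """Keep every clade appearing in more than `threshold` of the trees.

    Each tree is represented as a set of clades, and each clade as a
    frozenset of taxon names, so counting shared clades is a hash lookup.
    """
    counts = Counter()
    for clades in trees:
        counts.update(clades)
    return {clade for clade, n in counts.items() if n / len(trees) > threshold}

# Three hypothetical topologies for a subset of the pantherine cats.
t1 = {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})}
t2 = {frozenset({"lion", "leopard"}), frozenset({"leopard", "jaguar"})}
t3 = {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})}

consensus = majority_consensus([t1, t2, t3])
```

Here the lion + leopard clade appears in all three trees and the lion + leopard + jaguar clade in two of three, so both survive; the clade supported by only one tree does not.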

  9. VirtualPlant: A Software Platform to Support Systems Biology Research

    PubMed Central

    Katari, Manpreet S.; Nowicki, Steve D.; Aceituno, Felipe F.; Nero, Damion; Kelfer, Jonathan; Thompson, Lee Parnell; Cabello, Juan M.; Davidson, Rebecca S.; Goldberg, Arthur P.; Shasha, Dennis E.; Coruzzi, Gloria M.; Gutiérrez, Rodrigo A.

    2010-01-01

    Data generation is no longer the limiting factor in advancing biological research. In addition, data integration, analysis, and interpretation have become key bottlenecks and challenges that biologists conducting genomic research face daily. To enable biologists to derive testable hypotheses from the increasing amount of genomic data, we have developed the VirtualPlant software platform. VirtualPlant enables scientists to visualize, integrate, and analyze genomic data from a systems biology perspective. VirtualPlant integrates genome-wide data concerning the known and predicted relationships among genes, proteins, and molecules, as well as genome-scale experimental measurements. VirtualPlant also provides visualization techniques that render multivariate information in visual formats that facilitate the extraction of biological concepts. Importantly, VirtualPlant helps biologists who are not trained in computer science to mine lists of genes, microarray experiments, and gene networks to address questions in plant biology, such as: What are the molecular mechanisms by which internal or external perturbations affect processes controlling growth and development? We illustrate the use of VirtualPlant with three case studies, ranging from querying a gene of interest to the identification of gene networks and regulatory hubs that control seed development. Whereas the VirtualPlant software was developed to mine Arabidopsis (Arabidopsis thaliana) genomic data, its data structures, algorithms, and visualization tools are designed in a species-independent way. VirtualPlant is freely available at www.virtualplant.org. PMID:20007449

  10. Unipro UGENE NGS pipelines and components for variant calling, RNA-seq and ChIP-seq data analyses.

    PubMed

    Golosova, Olga; Henderson, Ross; Vaskin, Yuriy; Gabrielian, Andrei; Grekhov, German; Nagarajan, Vijayaraj; Oler, Andrew J; Quiñones, Mariam; Hurt, Darrell; Fursov, Mikhail; Huyen, Yentram

    2014-01-01

    The advent of Next Generation Sequencing (NGS) technologies has opened new possibilities for researchers. However, the more biology becomes a data-intensive field, the more biologists have to learn how to process and analyze NGS data with complex computational tools. Even with the availability of common pipeline specifications, it is often a time-consuming and cumbersome task for a bench scientist to install and configure the pipeline tools. We believe that a unified, desktop and biologist-friendly front end to NGS data analysis tools will substantially improve productivity in this field. Here we present NGS pipelines "Variant Calling with SAMtools", "Tuxedo Pipeline for RNA-seq Data Analysis" and "Cistrome Pipeline for ChIP-seq Data Analysis" integrated into the Unipro UGENE desktop toolkit. We describe the available UGENE infrastructure that helps researchers run these pipelines on different datasets, store and investigate the results and re-run the pipelines with the same parameters. These pipeline tools are included in the UGENE NGS package. Individual blocks of these pipelines are also available for expert users to create their own advanced workflows.

  11. Changing the face of science: Lessons from the 2017 Science-A-Thon

    NASA Astrophysics Data System (ADS)

    Barnes, R. T.; Licker, R.; Burt, M. A.; Holloway, T.

    2017-12-01

Studies have shown that over two-thirds of Americans cannot name a living scientist. This disconnect is a concern for science and scientists, considering the large role of public funding for science and the importance of science in many policy issues. As a large-scale public outreach initiative and fundraiser, the Earth Science Women's Network (ESWN) launched "Science-A-Thon" on July 13, 2017. This "day of science" invited participants to share 12 photos over 12 hours of their day, spanning both personal routines and professional endeavors. Over 200 scientists participated, with the #DayofScience hashtag trending on Twitter for the day. Earth scientists made up the largest portion of participants, but the event also engaged cancer biologists, computer scientists, and more, including scientists from over 10 countries. Science-A-Thon builds on the success and visibility of other social media campaigns, such as #actuallivingscientist and #DresslikeaWoman. Importantly, these efforts share a common goal: by providing diverse images of scientists, they can shift the public perception of who a scientist is and what science looks like in the real world. This type of public engagement offers a wide range of potential role models for students and individual stories that increase public engagement with science. Social media campaigns such as this shift the public perception of who scientists are, why they do what they do, and what they do each day. The conversations emerging from Science-A-Thon included scientists talking about (1) their science and motivation, (2) the purpose of and need for ESWN, and (3) why they chose to participate in the event; these conversations extended the reach of the social media campaign and fundraiser.

  12. Formalizing an integrative, multidisciplinary cancer therapy discovery workflow

    PubMed Central

    McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy

    2014-01-01

    Although many clinicians and researchers work to understand cancer, there has been limited success to effectively combine forces and collaborate over time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390

  13. From stem cells to human development: a distinctly human perspective on early embryology, cellular differentiation and translational research.

    PubMed

    Craft, April M; Johnson, Matthew

    2017-01-01

    Over 100 scientists with common interests in human development, disease and regeneration gathered in late September 2016 for The Company of Biologists' second 'From Stem Cells to Human Development' meeting held in historic Southbridge. In this Meeting Review, we highlight some of the exciting new findings that were presented, and discuss emerging themes and convergences in human development and disease that arose during these discussions. © 2017. Published by The Company of Biologists Ltd.

  14. An electronic laboratory notebook based on HTML forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marstaller, J.E.; Zorn, M.D.

The electronic notebook records information that has traditionally been kept in handwritten laboratory notebooks. It keeps detailed information about the progress of the research, such as the optimization of primers, the screening of the primers and, finally, the mapping of the probes. The notebook provides two areas of service: data entry, and review of data at all stages. World Wide Web browsers with HTML-based forms provide a fast and easy mechanism for creating forms-based user interfaces. The computer scientist can sit down with the biologist and rapidly make changes in response to the user's comments. Furthermore, the HTML forms work equally well on a number of different hardware platforms; thus the biologists may continue using their Macintosh computers and will find a familiar interface if they have to work on a Unix workstation. The web browser can be run from any machine connected to the Internet; thus the users are free to enter or view information even away from their labs, at home or while on travel. Access can be restricted by password and other means to secure the confidentiality of the data. A bonus that is hard to implement otherwise is the facile connection to outside resources: linking local information to data in public databases is only a hypertext link away, with little or no additional programming effort.
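On the server side, a submitted HTML form arrives as URL-encoded key-value pairs that can be stored directly as one notebook record. A minimal Python sketch of this idea follows; the field names `primer`, `tm`, and `status` are hypothetical, not taken from the actual notebook:

```python
from urllib.parse import parse_qs

def record_entry(form_body, notebook):
    """Turn an application/x-www-form-urlencoded form submission into a
    dictionary of fields and append it to the notebook as one record."""
    fields = {name: values[0] for name, values in parse_qs(form_body).items()}
    notebook.append(fields)
    return fields

notebook = []
entry = record_entry("primer=GDB-132&tm=58.5&status=screened", notebook)
```

Because the form encoding is a web standard, the same record-keeping code serves every browser and hardware platform the users happen to have.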

  15. Primer and interviews: Molecular mechanisms of morphological evolution

    PubMed Central

    Kiefer, Julie C

    2010-01-01

    The beauty of the developing embryo, and the awe that it inspires, lure many scientists into the field of developmental biology. What compels cells to divide, migrate, and morph into a being with a complex body plan? Evolutionary developmental biologists hold similar fascinations, with dynamics that take place on a grander timescale. How do phenotypic traits diverge over evolutionary time? This primer illustrates how a deep understanding of the basic principles that underlie developmental biology have changed how scientists think about the evolution of body form. The primer culminates in a conversation with David Stern, PhD, and Michael Shapiro, PhD, who discuss current topics in morphological evolution, why the field should be of interest to classic developmental biologists, and what lies ahead. Developmental Dynamics 239:3497–3505, 2010. © 2010 Wiley-Liss, Inc. PMID:21069831

  16. Misconceptions of Synthetic Biology: Lessons from an Interdisciplinary Summer School

    NASA Technical Reports Server (NTRS)

    Verseux, Cyprien; Acevedo-Rocha, Carlos G.; Chizzolini, Fabio; Rothschild, Lynn J.

    2016-01-01

    In 2014, an international group of scholars from various fields analysed the "societal dimensions" of synthetic biology in an interdisciplinary summer school. Here, we report and discuss the biologists' observations on the general perception of synthetic biology by non-biologists who took part in this event. Most attendees mainly associated synthetic biology with contributions from the best-known public figures of the field, rarely mentioning other scientists. Media extrapolations of those contributions appeared to have created unrealistic expectations and irrelevant fears that were widely disconnected from the current research in synthetic biology. Another observation was that when debating developments in synthetic biology, semantics strongly mattered: depending on the terms used to present an application of synthetic biology, attendees reacted in radically different ways. For example, using the term "GMOs" (genetically modified organisms) rather than the term "genetic engineering" led to very different reactions. Stimulating debates also arose with participants who held unanticipated points of view, for instance biocentrist ethicists who argued that engineered microbes should not be used for human purposes. Another communication challenge emerged from the connotations and inaccuracies surrounding the word "life", which impaired constructive debates, thus leading to misconceptions about the abilities of scientists to engineer or even create living organisms. Finally, it appeared that synthetic biologists tend to overestimate the knowledge of non-biologists, further affecting communication. The motivation and ability of synthetic biologists to communicate their work outside their research field need to be fostered, notably towards policymakers who need a more accurate and technical understanding of the field to make informed decisions.
Interdisciplinary events gathering scholars working in and around synthetic biology are an effective tool in addressing those issues.

  17. Genomic impact of eukaryotic transposable elements

    PubMed Central

    2012-01-01

    The third international conference on the genomic impact of eukaryotic transposable elements (TEs) was held 24 to 28 February 2012 at the Asilomar Conference Center, Pacific Grove, CA, USA. Sponsored in part by the National Institutes of Health grant 5 P41 LM006252, the goal of the conference was to bring together researchers from around the world who study the impact and mechanisms of TEs using multiple computational and experimental approaches. The meeting drew close to 170 attendees and included invited floor presentations on the biology of TEs and their genomic impact, as well as numerous talks contributed by young scientists. The workshop talks were devoted to computational analysis of TEs with additional time for discussion of unresolved issues. Also, there was ample opportunity for poster presentations and informal evening discussions. The success of the meeting reflects the important role of Repbase in comparative genomic studies, and emphasizes the need for close interactions between experimental and computational biologists in the years to come. PMID:23171443

  18. Genomic impact of eukaryotic transposable elements.

    PubMed

    Arkhipova, Irina R; Batzer, Mark A; Brosius, Juergen; Feschotte, Cédric; Moran, John V; Schmitz, Jürgen; Jurka, Jerzy

    2012-11-21

    The third international conference on the genomic impact of eukaryotic transposable elements (TEs) was held 24 to 28 February 2012 at the Asilomar Conference Center, Pacific Grove, CA, USA. Sponsored in part by the National Institutes of Health grant 5 P41 LM006252, the goal of the conference was to bring together researchers from around the world who study the impact and mechanisms of TEs using multiple computational and experimental approaches. The meeting drew close to 170 attendees and included invited floor presentations on the biology of TEs and their genomic impact, as well as numerous talks contributed by young scientists. The workshop talks were devoted to computational analysis of TEs with additional time for discussion of unresolved issues. Also, there was ample opportunity for poster presentations and informal evening discussions. The success of the meeting reflects the important role of Repbase in comparative genomic studies, and emphasizes the need for close interactions between experimental and computational biologists in the years to come.

  19. Data integration in biological research: an overview.

    PubMed

    Lapatas, Vasileios; Stefanidakis, Michalis; Jimenez, Rafael C; Via, Allegra; Schneider, Maria Victoria

    2015-12-01

    Data sharing, integration and annotation are essential to ensure the reproducibility of the analysis and interpretation of the experimental findings. Often these activities are perceived as a role that bioinformaticians and computer scientists have to take with little or no input from the experimental biologist. On the contrary, biological researchers, being the producers and often the end users of such data, have a big role in enabling biological data integration. The quality and usefulness of data integration depend on the existence and adoption of standards, shared formats, and mechanisms that are suitable for biological researchers to submit and annotate the data, so it can be easily searchable, conveniently linked and consequently used for further biological analysis and discovery. Here, we provide background on what data integration is from a computational science point of view, how it has been applied to biological research, which key aspects contributed to its success, and what the future directions are.

  20. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the volume of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high-performance computing (HPC) platforms are needed, as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
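    The case study in this record benchmarks platforms on classical sequence alignment. As a point of reference only (this is not the paper's benchmark code, and the scoring parameters are illustrative), a serial Smith-Waterman local-alignment score can be computed with a simple dynamic-programming recurrence:

    ```python
    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        """Best local alignment score between sequences a and b,
        using two rolling rows of the (len(a)+1) x (len(b)+1) DP matrix."""
        prev = [0] * (len(b) + 1)
        best = 0
        for i in range(1, len(a) + 1):
            cur = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                # Local alignment: scores are clamped at zero.
                cur[j] = max(0, prev[j - 1] + s, prev[j] + gap, cur[j - 1] + gap)
                best = max(best, cur[j])
            prev = cur
        return best

    print(smith_waterman_score("GATTACA", "GATTACA"))  # 7 matches x 2 = 14
    ```

    Production aligners vectorize or GPU-parallelize exactly this recurrence, which is why the choice of computing platform dominates throughput; the sketch only fixes the scoring semantics.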

  1. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high-performance and parallel computing. We are currently designing a user-friendly interface that will make this tool equally accessible to biologists and computer scientists.
Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  2. Learning to speak for nature: A case study of the development of scientists and their representational practices

    NASA Astrophysics Data System (ADS)

    Torralba, Jose Antonio (Tony)

    This is an ethnographic study of how learning and development take place among a group of professional biologists as a result of their engagement with the material, representational, and interactional resources for constructing scientific claims. The study examines these processes through the everyday activities of a research project conducted by this group of biologists to understand different aspects of subterranean termites. As part of its ethnographic approach, this study offers extended descriptions of how biologists designed and organized resources across distinct settings (e.g., field sites, laboratories, and meeting rooms) as they attempt to produce scientific claims. As part of its developmental approach, the study examines how individuals and their representational practices changed as a result of engagement with the claim-making process. By examining the everyday practices of biologists inside of a laboratory, this study attempts to highlight elements of disciplinary context and practice that play an important role in how individuals learn and develop disciplinary competence. The study offers a developmental model of how individuals and their representational practices change in virtue of each other as these individuals engage in the claim-making process. The study attends to the various ways scientists actually know, learn, and become competent in a discipline like entomology (the study of insects) with the intent of finding out what should be considered in designing learning environments within the science for those beginning to engage with the subject matter.

  3. ISMB/ECCB 2009 Stockholm

    PubMed Central

    Sagot, Marie-France; McKay, B.J. Morrison; Myers, Gene

    2009-01-01

    The International Society for Computational Biology (ISCB; http://www.iscb.org) presents the Seventeenth Annual International Conference on Intelligent Systems for Molecular Biology (ISMB), organized jointly with the Eighth Annual European Conference on Computational Biology (ECCB; http://bioinf.mpi-inf.mpg.de/conferences/eccb/eccb.htm), in Stockholm, Sweden, 27 June to 2 July 2009. The organizers are putting the finishing touches on the year's premier computational biology conference, with an expected attendance of 1400 computer scientists, mathematicians, statisticians, biologists and scientists from other disciplines related to and reliant on this multi-disciplinary science. ISMB/ECCB 2009 (http://www.iscb.org/ismbeccb2009/) follows the framework introduced at the ISMB/ECCB 2007 (http://www.iscb.org/ismbeccb2007/) in Vienna, and further refined at the ISMB 2008 (http://www.iscb.org/ismb2008/) in Toronto; a framework developed to specifically encourage increased participation from often under-represented disciplines at conferences on computational biology. During the main ISMB conference dates of 29 June to 2 July, keynote talks from highly regarded scientists, including ISCB Award winners, are the featured presentations that bring all attendees together twice a day. The remainder of each day offers a carefully balanced selection of parallel sessions to choose from: proceedings papers, special sessions on emerging topics, highlights of the past year's published research, special interest group meetings, technology demonstrations, workshops and several unique sessions of value to the broad audience of students, faculty and industry researchers. The display of several hundred posters for the duration of the conference has become a standard of the ISMB and ECCB conference series, and an extensive commercial exhibition showcases the latest bioinformatics publications, software, hardware and services available on the market today.
The main conference is preceded by 2 days of Special Interest Group (SIG) and Satellite meetings running in parallel to the fifth Student Council Symposium on 27 June, and in parallel to Tutorials on 28 June. All scientific sessions take place at the Stockholmsmässan/Stockholm International Fairs conference and exposition facility. Contact: bj@iscb.org PMID:19447790

  4. Finding your inner modeler: An NSF-sponsored workshop to introduce cell biologists to modeling/computational approaches.

    PubMed

    Stone, David E; Haswell, Elizabeth S; Sztul, Elizabeth

    2017-01-01

    In classical Cell Biology, fundamental cellular processes are revealed empirically, one experiment at a time. While this approach has been enormously fruitful, our understanding of cells is far from complete. In fact, the more we know, the more keenly we perceive our ignorance of the profoundly complex and dynamic molecular systems that underlie cell structure and function. Thus, it has become apparent to many cell biologists that experimentation alone is unlikely to yield major new paradigms, and that empiricism must be combined with theory and computational approaches to yield major new discoveries. To facilitate those discoveries, three workshops will convene annually for one day in three successive summers (2017-2019) to promote the use of computational modeling by cell biologists currently unconvinced of its utility or unsure how to apply it. The first of these workshops was held at the University of Illinois, Chicago in July 2017. Organized to facilitate interactions between traditional cell biologists and computational modelers, it provided a unique educational opportunity: a primer on how cell biologists with little or no relevant experience can incorporate computational modeling into their research. Here, we report on the workshop and describe how it addressed key issues that cell biologists face when considering modeling, including: (1) Is my project appropriate for modeling? (2) What kind of data do I need to model my process? (3) How do I find a modeler to help me in integrating modeling approaches into my work? And, perhaps most importantly, (4) why should I bother?

  5. Computational biology and bioinformatics in Nigeria.

    PubMed

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. In this developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups composed of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  6. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. In this developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups composed of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  7. Statements by Scientists in the California Textbook Dispute

    ERIC Educational Resources Information Center

    American Biology Teacher, 1972

    1972-01-01

    Contains statements by five biologists and science educators to the California State Department of Education committee considering the adoption standards for science textbooks with regard to the clause requiring inclusion of creationist viewpoints of species origins. (AL)

  8. Meaningful crosstalk between biologists and physical scientists is essential for modern biology to progress.

    PubMed

    Dev, Sukhendu B

    2009-01-01

    The advances in biological sciences have been phenomenal since the structure of DNA was decoded, especially if one considers the input from physical sciences, not only in terms of analytical tools, but also in understanding and solving some of the key problems in biology. In this article, I trace briefly the history of this transition, from physical sciences to biology, and argue that progress in modern biology can be accelerated if there is far more meaningful crosstalk between the biologists and the physical scientists, simply because biology has become far more complex and interdisciplinary, and the need for such crosstalk cannot be overemphasized. Without a concerted effort in this area, progress will be hindered, and the two camps will continue to work on their own, using their own specialized language, thus making communication highly ineffective. I support my argument by giving a vast array of examples and by quoting leading authorities.

  9. Results of heavy ion radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castro, J.R.

    1994-04-01

    The potential of heavy ion therapy for clinical use in cancer therapy stems from the biological parameters of heavy charged particles, and their precise dose localization. Biologically, carbon, neon and other heavy ion beams (up to about silicon) are clinically useful in overcoming the radioresistance of hypoxic tumors, thus increasing biological effectiveness relative to low-LET x-ray or electron beams. Cells irradiated by heavy ions show less variation in cell-cycle related radiosensitivity and decreased repair of radiation injury. The physical parameters of these heavy charged particles allow precise delivery of high radiation doses to tumors while minimizing irradiation of normal tissues. Clinical use requires close interaction between radiation oncologists, medical physicists, accelerator physicists, engineers, computer scientists and radiation biologists.

  10. Mammalian Toxicology Testing: Problem Definition Study, Personnel Plan.

    DTIC Science & Technology

    1981-03-01

    [Garbled OCR excerpt of a personnel-plan table listing positions (e.g., Biochemist, Biologist, Bookkeeper, Cage Washer, Clinical Chemist, Compound Preparation Technician, Computer Coder, Computer Programmer, Electron Microscope Operator, Lab Technician) together with staffing counts and salary ranges; the table structure is not recoverable.]

  11. Medical biochemistry in Macedonia: a profession for physicians and natural scientists.

    PubMed

    Traikovska, S; Dzhekova-Stojkova, S

    2001-06-01

    Medical biochemistry or clinical chemistry in its roots is an interdisciplinary science between natural sciences and medicine. The largest part of medical biochemistry is natural science (chemistry, biochemistry, biology, physics, mathematics), which is very well integrated in deduction of medical problems. Medical biochemistry throughout the world, including Macedonia, should be a professional field open to both physicians and natural scientists, according to its historical development, theoretical characteristics and applied practice. Physicians and natural scientists follow the same route in clinical chemistry during the postgraduate training of specialization in medical biochemistry/clinical chemistry. However, in Macedonia the specialization in medical biochemistry/clinical chemistry is today regulated by law only for physicians and pharmacists. The study of clinical chemistry in Europe has shown its interdisciplinary character. In most European countries different professions, such as physicians, chemists/biochemists, pharmacists, biologists and others could specialize in clinical chemistry. The question for the next generation of specialists in Macedonia is whether to accept the present conditions or to attempt to change the law to include chemists/biochemists and biologists as well. The latter used to be a practice in Macedonia 20 years ago, and still is in many European countries. Such change in law would also result in changes in the postgraduate educational program in medical biochemistry in Macedonia. The new postgraduate program has to follow the European Syllabus, recommended by EC4. To obtain sufficient knowledge in clinical chemistry, the duration of vocational training (undergraduate and postgraduate) for all trainees (physicians, pharmacists, chemists/biochemists and biologists) should be 8 years.

  12. A novel paradigm for cell and molecule interaction ontology: from the CMM model to IMGT-ONTOLOGY

    PubMed Central

    2010-01-01

    Background Biology is moving fast toward the virtuous circle of other disciplines: from data to quantitative modeling and back to data. Models are usually developed by mathematicians, physicists, and computer scientists to translate qualitative or semi-quantitative biological knowledge into a quantitative approach. To eliminate semantic confusion between biology and other disciplines, it is necessary to have a list of the most important and frequently used concepts coherently defined. Results We propose a novel paradigm for generating new concepts for an ontology, starting from a model rather than from a database. We apply that approach to generate concepts for cell and molecule interaction starting from an agent-based model. This effort provides a solid infrastructure that is useful to overcome the semantic ambiguities that arise between biologists and mathematicians, physicists, and computer scientists, when they interact in a multidisciplinary field. Conclusions This effort represents the first attempt at linking molecule ontology with cell ontology, in IMGT-ONTOLOGY, the well-established ontology in immunogenetics and immunoinformatics, and a paradigm for life science biology. With the increasing use of models in biology and medicine, the need to link different levels, from molecules to cells to tissues and organs, is increasingly important. PMID:20167082

  13. The Hematopoietic Expression Viewer: expanding mobile apps as a scientific tool.

    PubMed

    James, Regis A; Rao, Mitchell M; Chen, Edward S; Goodell, Margaret A; Shaw, Chad A

    2012-07-15

    Many important datasets in current biological science comprise hundreds, thousands, or more individual results. These massive data require computational tools to navigate results and effectively interact with the content. Mobile device apps are an increasingly important tool in the everyday lives of scientists and non-scientists alike. These software tools present individuals with compact and efficient means to interact with complex data at meetings or other locations remote from their main computing environment. We believe that apps will be important tools for biologists, geneticists and physicians to review content while participating in biomedical research or practicing medicine. We have developed a prototype app for displaying gene expression data using the iOS platform. To present the software engineering requirements, we review the model-view-controller schema for Apple's iOS. We apply this schema to a simple app for querying locally developed microarray gene expression data. The challenge of this application is to balance between storing content locally within the app versus obtaining it dynamically via a network connection. The Hematopoietic Expression Viewer is available at http://www.shawlab.org/he_viewer. The source code for this project and any future information on how to obtain the app can be accessed at http://www.shawlab.org/he_viewer.

  14. A Two-Ocean Bouillabaisse: Science, Politics, and the Central American Sea-Level Canal Controversy.

    PubMed

    Keiner, Christine

    2017-11-01

    As the Panama Canal approached its fiftieth anniversary in the mid-1960s, U.S. officials concerned about the costs of modernization welcomed the technology of peaceful nuclear excavation to create a new waterway at sea level. Biologists seeking a share of the funds slated for radiological-safety studies called attention to another potential effect which they deemed of far greater ecological and evolutionary magnitude - marine species exchange, an obscure environmental issue that required the expertise of underresourced life scientists. An enterprising endeavor to support Smithsonian naturalists, especially marine biologists at the Smithsonian Tropical Research Institute in Panama, wound up sparking heated debates - between biologists and engineers about the oceans' biological integrity and among scientists about whether the megaproject represented a research opportunity or environmental threat. A National Academy of Sciences panel chaired by Ernst Mayr failed to attract congressional funding for its 10-year baseline research program, but did create a stir in the scientific and mainstream press about the ecological threats that the sea-level canal might unleash upon the Atlantic and Pacific. This paper examines how the proposed megaproject sparked a scientific and political conversation about the risks of mixing the oceans at a time when many members of the scientific and engineering communities still viewed the seas as impervious to human-facilitated change.

  15. The Washington Biologists' Field Club : Its members and its history (1900-2006)

    USGS Publications Warehouse

    Perry, M.C.

    2007-01-01

    This book is based on the interesting one-hundred-plus-year history of the Club and its members. Plummers Island and the historic cabin on the Island have served as a common meeting area where the Club members have conducted research and held many social activities for over a century. The history has been written and revised over the years by members, and the biographical sketches also have been collected and written by the members. The Club was formed in 1900 and incorporated as a society in 1901 for scientists in the Washington, D.C., area. In recent years the Club has sponsored research by many non-member local scientists with grants totaling over $305,000. The cumulative total of 267 members represents all branches of natural science, with a strong emphasis on biology as the Club name indicates. In addition to the biologists there have been famous naturalists (e.g., John Burroughs), high-level administrators (e.g., Ira Gabrielson), and well-known artists (e.g., Roger Tory Peterson). Most members have been biological scientists, working for agencies in the Washington, D.C., area, who have published many articles and books dealing with biology and related subjects. The book is published mainly for the benefit of the living Club members and for relatives of the deceased members. The members hope that the book will find its way into libraries across the country and that in the future, persons interested in some of the pioneer scientists, in the various professional areas of science, can obtain biographical information from a well-documented source. Most of the 542 illustrations of the members, cabin, and the Island have not been published previously. It is hoped that the biographical sketches, pictures, and other information presented in this book can generate new information for future publications and for the website of the Washington Biologists' Field Club, which is updated frequently.

  16. Meet EPA Scientist Maureen R. Gwinn, M.S. Ph.D. DABT

    EPA Pesticide Factsheets

    EPA biologist Dr. Maureen Gwinn works on human health hazard assessments for the Agency's IRIS program. Dr. Gwinn currently serves as the Associate Program Director for Community Health in EPA's Sustainable and Healthy Communities national research program

  17. International Conference on Intelligent Systems for Molecular Biology (ISMB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Debra; Hibbs, Matthew; Kall, Lukas

    The Intelligent Systems for Molecular Biology (ISMB) conference has provided a general forum for disseminating the latest developments in bioinformatics on an annual basis for the past 13 years. ISMB is a multidisciplinary conference that brings together scientists from computer science, molecular biology, mathematics and statistics. The goal of the ISMB meeting is to bring together biologists and computational scientists in a focus on actual biological problems, i.e., not simply theoretical calculations. The combined focus on "intelligent systems" and actual biological data makes ISMB a unique and highly important meeting, and 13 years of experience in holding the conference has resulted in a consistently well organized, well attended, and highly respected annual conference. The ISMB 2005 meeting was held June 25-29, 2005 at the Renaissance Center in Detroit, Michigan. The meeting attracted over 1,730 attendees. The science presented was exceptional, and in the course of the five-day meeting, 56 scientific papers, 710 posters, 47 oral abstracts, 76 software demonstrations, and 14 tutorials were presented. The attendees represented a broad spectrum of backgrounds, with 7% from commercial companies, over 28% qualifying for student registration, and 41 countries represented at the conference, emphasizing its important international aspect. The ISMB conference is especially important because the cultures of computer science and biology are so disparate. ISMB, as a full-scale technical conference with refereed proceedings that have been indexed by both MEDLINE and Current Contents since 1996, bridges this cultural gap.

  18. Hidden in the Middle: Culture, Value and Reward in Bioinformatics.

    PubMed

    Lewis, Jamie; Bartlett, Andrew; Atkinson, Paul

    2016-01-01

    Bioinformatics - the so-called shotgun marriage between biology and computer science - is an interdiscipline. Despite interdisciplinarity being seen as a virtue, for having the capacity to solve complex problems and foster innovation, it has the potential to place projects and people in anomalous categories. For example, valorised 'outputs' in academia are often defined and rewarded by discipline. Bioinformatics, as an interdisciplinary bricolage, incorporates experts from various disciplinary cultures with their own distinct ways of working. Perceived problems of interdisciplinarity include difficulties of making explicit knowledge that is practical, theoretical, or cognitive. But successful interdisciplinary research also depends on an understanding of disciplinary cultures and value systems, often only tacitly understood by members of the communities in question. In bioinformatics, the 'parent' disciplines have different value systems; for example, what is considered worthwhile research by computer scientists can be thought of as trivial by biologists, and vice versa. This paper concentrates on the problems of reward and recognition described by scientists working in academic bioinformatics in the United Kingdom. We highlight problems that are a consequence of its cross-cultural make-up, recognising that the mismatches in knowledge in this borderland take place not just at the practical, theoretical, or epistemological level, but also at the cultural level. The trend in big, interdisciplinary science is towards multiple authors on a single paper; in bioinformatics this has created hybrid or fractional scientists who find themselves positioned not just in-between established disciplines but also in-between in authorship, as middle authors or, worse still, left off papers altogether.

  19. POLICY ADVOCACY IN SCIENCE: PREVALENCE, PERSPECTIVES, AND IMPLICATIONS FOR CONSERVATION BIOLOGISTS

    EPA Science Inventory

    Much debate and discussion has focused on the relationship between science and advocacy, and the role of scientists in influencing public policy. Some argue that advocacy is widespread within the scientific literature; however, data to evaluate that contention are lacking. We examine...

  20. Meet EPA Biologist Thomas Knudsen, Ph.D.

    EPA Pesticide Factsheets

    Dr. Tom Knudsen is a developmental systems biologist at EPA's Center for Computational Toxicology. His research focuses on the potential for chemicals to disrupt prenatal development—one of the most important life stages.

  1. The Croonian lectures of 1917: a McGill pathologist confronts the biologists of England.

    PubMed

    Buttolph, Mike

    2010-11-01

    John George Adami (1862-1926) qualified in medicine at Manchester and in 1892 was appointed professor of pathology at McGill University. At the invitation of the Royal College of Physicians (in London) he delivered the Croonian Lectures in 1917. He chose the title 'Adaptation and disease; the contribution of medical research to the study of evolution'. Adami believed that medical work had brought to light important facts about heredity that had not been communicated adequately to biological scientists. He used the lectures to describe this work, placing particular emphasis on his contention that acquired characters are inherited. At this time the medical audience at Adami's lectures would have been generally sympathetic to the idea that acquired characters can be inherited, though many leading British biologists were not sympathetic. Adami hoped that a concise review of the medical findings would persuade the biologists to his point of view or at least would be the starting point for a serious discussion of his evidence. However, the biologists were not persuaded and, although there were acrimonious personal exchanges, there was no scientific debate.

  2. Turning information into knowledge for rangeland management

    USDA-ARS?s Scientific Manuscript database

    The kind of knowledge system that will be capable of meeting the needs of rangeland managers will evolve as scientists, technology specialists, managers, and biologists find ways to integrate the ever expanding array of information systems and tools to meet their needs. The tools and techniques high...

  3. Staff Scientist | Center for Cancer Research

    Cancer.gov

    The scientist will be tasked with independent research projects that support and/or further the scope of our laboratory goals as determined by the Principal Investigator. The scientist will be responsible for overseeing daily operations and coordination of projects in close conjunction with all laboratory personnel. The scientist will participate in teaching laboratory methods to first-time post-docs, research fellows, and students. The scientist will work closely with a full-time research biologist, both on collaborative research projects and on lab-critical administrative tasks such as IRB approval, animal protocols, and budgeting. Our laboratory has two post-doctoral researchers at any given time. This is a great opportunity for candidates who are interested in cancer biology and want to grow their research career by working in our program with outstanding support from other established laboratories and core facilities at the National Cancer Institute.

  4. Reflections on Doing Geography: Learning Observations from the Fourth Grade

    ERIC Educational Resources Information Center

    Delahunty, Tina

    2010-01-01

    The Nature Conservancy's (TNC) Orchard Bog site in Shady Valley, Tennessee, is a unique Appalachian mountain bog that provides many opportunities for student exploration. A biogeographer, a field technician, two biologists, and a historian combined their expertise to teach 100 fourth graders how historians and scientists learn about past…

  5. Energy and Sociology.

    ERIC Educational Resources Information Center

    Cottrell, Fred

    The realization that all scientific phenomena are manifestations of energy, rather than separate subjects of inquiry for chemists, physicists, or biologists, has encouraged scientists to explore gaps between the traditional fields of scientific inquiry. In light of this fact, it would seem that the flow of energy should be a major area of concern…

  6. Summer Diabetes Programs a Healthy Hit.

    ERIC Educational Resources Information Center

    Tribal College Journal, 2003

    2003-01-01

    Discusses whether it is possible for the AIHEC and tribal colleges to reverse poor eating habits and decrease diabetes rates for children through education. Students are gathered in a camp setting and they learn from scientists, nurses, nutritionists, biologists, and cultural teachers about how they can develop healthy habits. (MZ)

  7. Open-Door Policy

    ERIC Educational Resources Information Center

    Forde, Dana

    2010-01-01

    According to a 2006 National Science Foundation study, African-Americans, Hispanics and American Indians make up only 2.65 percent, 3.53 percent, and 0.59 percent, respectively, of life sciences academics at four-year institutions. The lack of biologists and other scientists from these ethnic groups is a threat to America's public health and…

  8. Beyond Borders: Zoo as Training Location for Wildlife Biologists

    ERIC Educational Resources Information Center

    Melber, Leah M.; Bergren, Rachel; Santymire, Rachel

    2011-01-01

    The role of institutions such as zoos in global conservation efforts is critical. In addition to serving as informal learning centers for the general public, these institutions are well-positioned to provide training and professional development for the next generation of conservation scientists. And while many organizations traditionally have…

  9. Automated Segmentation and Classification of Coral using Fluid Lensing from Unmanned Airborne Platforms

    NASA Technical Reports Server (NTRS)

    Instrella, Ron; Chirayath, Ved

    2016-01-01

    In recent years, there has been a growing interest among biologists in monitoring the short- and long-term health of the world's coral reefs. The environmental impact of climate change poses a growing threat to these biologically diverse and fragile ecosystems, prompting scientists to use remote sensing platforms and computer vision algorithms to analyze shallow marine systems. In this study, we present a novel method for performing coral segmentation and classification from aerial data collected from small unmanned aerial vehicles (sUAV). Our method uses Fluid Lensing algorithms to remove and exploit strong optical distortions created along the air-fluid boundary to produce cm-scale resolution imagery of the ocean floor at depths up to 5 meters. A 3D model of the reef is reconstructed using structure from motion (SFM) algorithms, and the associated depth information is combined with multidimensional maximum a posteriori (MAP) estimation to separate organic from inorganic material and classify coral morphologies in the Fluid-Lensed transects. In this study, MAP estimation is performed using a set of manually classified 100 x 100 pixel training images to determine the most probable coral classification within an interrogated region of interest. Aerial footage of a coral reef was captured off the coast of American Samoa and used to test our proposed method. 90 x 20 meter transects of the Samoan coastline underwent automated classification and were manually segmented by a marine biologist for comparison, yielding success rates as high as 85%. This method has broad applications for coastal remote sensing and will provide marine biologists access to large swaths of high-resolution, segmented coral imagery.
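    The MAP classification step described above can be sketched in miniature. The sketch below is illustrative only, not the authors' implementation: each manually classified training patch is assumed to be reduced to a small feature vector (e.g. mean colour channels plus SfM-derived depth), one diagonal Gaussian is fitted per class, and a region of interest is assigned to the class maximizing log prior plus log likelihood. All function names and feature choices here are our own assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_class_models(patches_by_class):
        # Fit a diagonal-covariance Gaussian per class, plus an empirical prior.
        models = {}
        total = sum(len(p) for p in patches_by_class.values())
        for label, patches in patches_by_class.items():
            X = np.stack(patches)                  # (n_patches, n_features)
            models[label] = {
                "mean": X.mean(axis=0),
                "var": X.var(axis=0) + 1e-6,       # jitter keeps variances positive
                "log_prior": np.log(len(patches) / total),
            }
        return models

    def map_classify(feature, models):
        # MAP estimate: argmax over classes of log prior + Gaussian log likelihood.
        def log_posterior(m):
            return m["log_prior"] - 0.5 * np.sum(
                np.log(2 * np.pi * m["var"]) + (feature - m["mean"]) ** 2 / m["var"]
            )
        return max(models, key=lambda lbl: log_posterior(models[lbl]))

    # Synthetic stand-in for the manually classified training patches:
    # two classes whose mean feature values are well separated.
    train = {
        "coral":     [rng.normal(0.8, 0.05, 4) for _ in range(50)],
        "inorganic": [rng.normal(0.2, 0.05, 4) for _ in range(50)],
    }
    models = fit_class_models(train)
    print(map_classify(np.full(4, 0.75), models))   # prints "coral"
    ```

    The same argmax-of-posterior structure extends directly to more classes (one per coral morphology) and richer feature vectors; only the per-class Gaussians change.
    
    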

  10. Automated Segmentation and Classification of Coral using Fluid Lensing from Unmanned Airborne Platforms

    NASA Astrophysics Data System (ADS)

    Instrella, R.; Chirayath, V.

    2015-12-01

    In recent years, there has been a growing interest among biologists in monitoring the short- and long-term health of the world's coral reefs. The environmental impact of climate change poses a growing threat to these biologically diverse and fragile ecosystems, prompting scientists to use remote sensing platforms and computer vision algorithms to analyze shallow marine systems. In this study, we present a novel method for performing coral segmentation and classification from aerial data collected from small unmanned aerial vehicles (sUAV). Our method uses Fluid Lensing algorithms to remove and exploit strong optical distortions created along the air-fluid boundary to produce cm-scale resolution imagery of the ocean floor at depths up to 5 meters. A 3D model of the reef is reconstructed using structure from motion (SFM) algorithms, and the associated depth information is combined with multidimensional maximum a posteriori (MAP) estimation to separate organic from inorganic material and classify coral morphologies in the Fluid-Lensed transects. In this study, MAP estimation is performed using a set of manually classified 100 x 100 pixel training images to determine the most probable coral classification within an interrogated region of interest. Aerial footage of a coral reef was captured off the coast of American Samoa and used to test our proposed method. 90 x 20 meter transects of the Samoan coastline underwent automated classification and were manually segmented by a marine biologist for comparison, yielding success rates as high as 85%. This method has broad applications for coastal remote sensing and will provide marine biologists access to large swaths of high-resolution, segmented coral imagery.

  11. The Reference Genome Sequence of Saccharomyces cerevisiae: Then and Now

    PubMed Central

    Engel, Stacia R.; Dietrich, Fred S.; Fisk, Dianna G.; Binkley, Gail; Balakrishnan, Rama; Costanzo, Maria C.; Dwight, Selina S.; Hitz, Benjamin C.; Karra, Kalpana; Nash, Robert S.; Weng, Shuai; Wong, Edith D.; Lloyd, Paul; Skrzypek, Marek S.; Miyasato, Stuart R.; Simison, Matt; Cherry, J. Michael

    2014-01-01

    The genome of the budding yeast Saccharomyces cerevisiae was the first eukaryotic genome to be completely sequenced. It was released in 1996 as the work of a worldwide effort of hundreds of researchers. In the time since, the yeast genome has been intensively studied by geneticists, molecular biologists, and computational scientists all over the world. Maintenance and annotation of the genome sequence have long been provided by the Saccharomyces Genome Database, one of the original model organism databases. To deepen our understanding of the eukaryotic genome, the S. cerevisiae strain S288C reference genome sequence recently received its first major update since 1996. The new version, called “S288C 2010,” was determined from a single yeast colony using modern sequencing technologies and serves as the anchor for further innovations in yeast genomic science. PMID:24374639

  12. Genetic Constructor: An Online DNA Design Platform.

    PubMed

    Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Scheitz, Cornelia Johanna Franziska; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli

    2017-12-15

    Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.

  13. ISMB Conference Funding to Support Attendance of Early Researchers and Students

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaasterland, Terry

    ISMB Conference Funding for Students and Young Scientists. Historical description: The Intelligent Systems for Molecular Biology (ISMB) conference has provided a general forum for disseminating the latest developments in bioinformatics on an annual basis for the past 22 years. ISMB is a multidisciplinary conference that brings together scientists from computer science, molecular biology, mathematics and statistics. The goal of the ISMB meeting is to bring together biologists and computational scientists in a focus on actual biological problems, i.e., not simply theoretical calculations. The combined focus on “intelligent systems” and actual biological data makes ISMB a unique and highly important meeting. Twenty-one years of experience in holding the conference has resulted in a consistently well-organized, well-attended, and highly respected annual conference. "Intelligent systems" include any software that goes beyond straightforward, closed-form algorithms or standard database technologies, encompassing software that views data in a symbolic fashion, learns from examples, consolidates multiple levels of abstraction, or synthesizes results to be cognitively tractable to a human, including the development and application of advanced computational methods for biological problems. Relevant computational techniques include, but are not limited to: machine learning, pattern recognition, knowledge representation, databases, combinatorics, stochastic modeling, string and graph algorithms, linguistic methods, robotics, constraint satisfaction, and parallel computation. Biological areas of interest include molecular structure, genomics, molecular sequence analysis, evolution and phylogenetics, molecular interactions, metabolic pathways, regulatory networks, developmental control, and molecular biology generally.
Emphasis is placed on the validation of methods using real data sets, on practical applications in the biological sciences, and on the development of novel computational techniques. The ISMB conferences are distinguished from many other conferences in computational biology or artificial intelligence by an insistence that the researchers work with real molecular biology data, not theoretical or toy examples; and from many other biological conferences by providing a forum for technical advances as they occur, which otherwise might be shunned until a firm experimental result is published. The resulting intellectual richness and cross-disciplinary diversity provide an important opportunity for both students and senior researchers. ISMB has become the premier conference series in this field with refereed, published proceedings, establishing an infrastructure to promote the growing body of research.

  14. Mind Your Manners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiley, H. S.

    2010-01-01

    There has been a lot of talk in the media about the loss of courtesy in modern society. By many criteria, it seems that people in general have become less polite. Reading some of the online comments after several recent articles in The Scientist would seem to indicate that biologists have also lost their manners. This lack of basic manners alarms me not only because of the obvious danger to our sense of community, but also because this type of behavior could damage society’s positive perception of scientists. Every time scientists (or anonymous bloggers posing as scientists) rant about the idiocy of someone who disagrees with them, our collective credibility erodes. Yes, there are a number of issues that we as scientists should be passionate about, but our objectives are best served by retaining an objective demeanor and respecting those with whom we happen to disagree.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded the dreams of its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies that will help translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  16. Emerging Scholars: The Class of 2008

    ERIC Educational Resources Information Center

    Forde, Dana; Lum, Lydia; Nealy, Michelle J.; Pluviose, David; Roach, Ronald; Rogers, Ibram; Rolo, Mark Anthony; Seymour, Add, Jr.; Valdata, Patricia; Watson, Jamal

    2008-01-01

    This year's crop of "Emerging Scholars"--The Class of 2008--includes a math biologist who was only the second woman to receive the Alfred P. Sloan Fellowship in math; a geneticist who recently became one of 20 winners of the National Science Foundation's Presidential Early Career Awards for Scientists and Engineers; and an extensively published…

  17. Modeling Geyser Eruptions in the Classroom

    ERIC Educational Resources Information Center

    Mattox, Stephen; Webster, Christine

    2005-01-01

    Watching Old Faithful transform from a smoldering mound to an explosive 50-meter high geyser is enough to generate awe in any observer. Behind this stunning, visual geologic display is a triad of heat, water, and plumbing that rarely unify on our planet. But geologists are not the only scientists drawn to geysers. Biologists have recently…

  18. Making Sense of Scientific Biographies: Scientific Achievement, Nature of Science, and Storylines in College Students' Essays

    ERIC Educational Resources Information Center

    Hwang, Seyoung

    2015-01-01

    In this article, the educative value of scientific biographies will be explored, especially for non-science major college students. During the "Scientist's life and thought" course, 66 college students read nine scientific biographies including five biologists, covering the canonical scientific achievements in Western scientific history.…

  19. 77 FR 48167 - Agency Information Collection Activities: Proposed Information Collection; Evaluating the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... message delivery media to backcountry visitors. USGS social scientists and a NPS bear management biologist will use their combined expertise to conduct a social survey of backcountry visitors to YNP to help... 1995 (PRA), and as a part of our continuing efforts to reduce paperwork and respondent burden, we...

  20. Teaching the Stories of Scientists and Their Discoveries

    ERIC Educational Resources Information Center

    McKinney, Donald; Michalovic, Mark

    2004-01-01

    For many science students and teachers, the history of science brings to mind musty portraits of long-ago chemists, physicists, and biologists, birth and death dates, and some brief mention of specific contributions. Frequently lost amid teaching pressures are the lessons that may be found in the history of science. These stories not only teach…

  1. Emerging strengths in Asia Pacific bioinformatics.

    PubMed

    Ranganathan, Shoba; Hsu, Wen-Lian; Yang, Ueng-Cheng; Tan, Tin Wee

    2008-12-12

    The 2008 annual conference of the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998, was organized as the 7th International Conference on Bioinformatics (InCoB), jointly with the Bioinformatics and Systems Biology in Taiwan (BIT 2008) Conference, Oct. 20-23, 2008 at Taipei, Taiwan. Besides bringing together scientists from the field of bioinformatics in this region, InCoB is actively involving researchers from the area of systems biology, to facilitate greater synergy between these two groups. Marking the 10th Anniversary of APBioNet, this InCoB 2008 meeting followed on from a series of successful annual events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea), New Delhi (India) and Hong Kong. Additionally, tutorials and the Workshop on Education in Bioinformatics and Computational Biology (WEBCB) immediately prior to the 20th Federation of Asian and Oceanian Biochemists and Molecular Biologists (FAOBMB) Taipei Conference provided ample opportunity for inducting mainstream biochemists and molecular biologists from the region into a greater level of awareness of the importance of bioinformatics in their craft. In this editorial, we provide a brief overview of the peer-reviewed manuscripts accepted for publication herein, grouped into thematic areas. As the regional research expertise in bioinformatics matures, the papers fall into thematic areas, illustrating the specific contributions made by APBioNet to global bioinformatics efforts.

  2. Emerging strengths in Asia Pacific bioinformatics

    PubMed Central

    Ranganathan, Shoba; Hsu, Wen-Lian; Yang, Ueng-Cheng; Tan, Tin Wee

    2008-01-01

    The 2008 annual conference of the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998, was organized as the 7th International Conference on Bioinformatics (InCoB), jointly with the Bioinformatics and Systems Biology in Taiwan (BIT 2008) Conference, Oct. 20–23, 2008 at Taipei, Taiwan. Besides bringing together scientists from the field of bioinformatics in this region, InCoB is actively involving researchers from the area of systems biology, to facilitate greater synergy between these two groups. Marking the 10th Anniversary of APBioNet, this InCoB 2008 meeting followed on from a series of successful annual events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea), New Delhi (India) and Hong Kong. Additionally, tutorials and the Workshop on Education in Bioinformatics and Computational Biology (WEBCB) immediately prior to the 20th Federation of Asian and Oceanian Biochemists and Molecular Biologists (FAOBMB) Taipei Conference provided ample opportunity for inducting mainstream biochemists and molecular biologists from the region into a greater level of awareness of the importance of bioinformatics in their craft. In this editorial, we provide a brief overview of the peer-reviewed manuscripts accepted for publication herein, grouped into thematic areas. As the regional research expertise in bioinformatics matures, the papers fall into thematic areas, illustrating the specific contributions made by APBioNet to global bioinformatics efforts. PMID:19091008

  3. Student Researcher Experiences (SRE): From the Field to Life as a Steward

    NASA Astrophysics Data System (ADS)

    Brown, J.; Jenkins, T.; Chase, Z.

    2017-12-01

    Florida is a beautiful place to live; water, woods and wildlife are abundant. Many people want to live in or visit our area to enjoy our natural resources. However, more people and technology lead to more changes in our resources, and conservation of our natural resources becomes even more important. Youth with conservation interests can benefit greatly from partnerships with scientists and organizations involved in conservation. Partnering with Florida Fish and Wildlife Conservation Commission biologists helps youth learn how to construct scientific research projects that are current and meaningful and that will supply data to secure the health of our natural resources. Partnerships with scientists give youth opportunities to become critical thinkers, citizen scientists, stewards, and a voice for nature in their community.

  4. Vander Lugt correlation of DNA sequence data

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Hawk, James F.; Martin, James C.

    1990-12-01

    DNA, the molecule containing the genetic code of an organism, is a linear chain of subunits. It is the sequence of subunits, of which there are four kinds, that constitutes the unique blueprint of an individual. This sequence is the focus of a large number of analyses performed by an army of geneticists, biologists, and computer scientists. Most of these analyses entail searches for specific subsequences within the larger set of sequence data. Thus, most analyses are essentially pattern recognition or correlation tasks. Yet there are special features of such analysis that influence the strategy and methods of an optical pattern recognition approach. While the serial processing employed in digital electronic computers remains the main engine of sequence analyses, there is no fundamental reason that more efficient parallel methods cannot be used. We describe an approach using optical pattern recognition (OPR) techniques based on matched spatial filtering, which allows parallel comparison of large blocks of sequence data. In this study we have simulated a Vander Lugt architecture implementing our approach. Searches for specific target sequence strings within a block of DNA sequence from the ColE1 plasmid are performed.
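    In digital form, the matched-filter correlation that this record describes optically reduces to sliding a one-hot-encoded target over the data and looking for correlation peaks. The sketch below only illustrates that principle; the encoding and function names are ours, not part of the original Vander Lugt system, which performs the same comparison in parallel in the optical domain.

    ```python
    import numpy as np

    BASES = "ACGT"

    def one_hot(seq):
        # Encode each base as a unit vector so that the inner product of two
        # aligned positions is 1 on a match and 0 otherwise.
        arr = np.zeros((len(seq), 4))
        for i, b in enumerate(seq):
            arr[i, BASES.index(b)] = 1.0
        return arr

    def correlate_search(data, target):
        d, t = one_hot(data), one_hot(target)
        n, m = len(data), len(target)
        # Slide the target over the data; the correlation score at offset i
        # is the number of matching bases. A peak equal to len(target) is a hit.
        scores = np.array([np.sum(d[i:i + m] * t) for i in range(n - m + 1)])
        return np.flatnonzero(scores == m)

    hits = correlate_search("GATTACAGATTACA", "TTACA")
    print(hits.tolist())   # prints [2, 9]
    ```

    Relaxing the peak threshold below `len(target)` turns the same correlation into an approximate-match search, which is one reason correlation-based formulations suit optical implementations well.
    
    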

  5. Taming the BEAST - A Community Teaching Material Resource for BEAST 2.

    PubMed

    Barido-Sottani, Joëlle; Bošková, Veronika; Plessis, Louis Du; Kühnert, Denise; Magnus, Carsten; Mitov, Venelin; Müller, Nicola F; Pečerska, Julija; Rasmussen, David A; Zhang, Chi; Drummond, Alexei J; Heath, Tracy A; Pybus, Oliver G; Vaughan, Timothy G; Stadler, Tanja

    2018-01-01

    Phylogenetics and phylodynamics are central topics in modern evolutionary biology. Phylogenetic methods reconstruct the evolutionary relationships among organisms, whereas phylodynamic approaches reveal the underlying diversification processes that lead to the observed relationships. These two fields have many practical applications in disciplines as diverse as epidemiology, developmental biology, palaeontology, ecology, and linguistics. The combination of increasingly large genetic data sets and increases in computing power is facilitating the development of more sophisticated phylogenetic and phylodynamic methods. Big data sets allow us to answer complex questions. However, since the required analyses are highly specific to the particular data set and question, a black-box method is no longer sufficient. Instead, biologists are required to be actively involved with modeling decisions during data analysis. The modular design of the Bayesian phylogenetic software package BEAST 2 enables, and in fact enforces, this involvement. At the same time, the modular design enables computational biology groups to develop new methods at a rapid rate. A thorough understanding of the models and algorithms used by inference software is a critical prerequisite for successful hypothesis formulation and assessment. In particular, there is a need for more readily available resources aimed at helping interested scientists equip themselves with the skills to confidently use cutting-edge phylogenetic analysis software. These resources will also benefit researchers who do not have access to similar courses or training at their home institutions. Here, we introduce the "Taming the BEAST" (https://taming-the-beast.github.io/) resource, which was developed as part of a workshop series bearing the same name, to facilitate the usage of the Bayesian phylogenetic software package BEAST 2.

  6. ''After the Genome 5 Conference'' to be held October 6-10, 1999 in Jackson Hole, Wyoming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Brent

    OAK B139 The postgenomic era is arriving faster than anyone had imagined--sometime during 2000 we'll have a large fraction of the human genome sequence. Heretofore, our understanding of function has come from non-industrial experiments whose conclusions were largely framed in human language. The advent of large amounts of sequence data, and of ''functional genomic'' data types such as mRNA expression data, have changed this picture. These data share the feature that individual observations and measurements are typically relatively low value adding. Such data is now being generated so rapidly that the amount of information contained in it will surpass themore » amount of biological information collected by traditional means. It is tantalizing to envision using genomic information to create a quantitative biology with a very strong data component. Unfortunately, we are very early in our understanding of how to ''compute on'' genomic information so as to extract biological knowledge from i t. In fact, some current efforts to come to grips with genomic information often resemble a computer savvy library science, where the most important issues concern categories, classification schemes, and information retrieval. When exploring new libraries, a measure of cataloging and inventory is surely inevitable. However, at some point we will need to move from library science to scholarship.We would like to achieve a quantitative and predictive understanding of biological function. We realize that making the bridge from knowledge of systems to the sets of abstractions that constitute computable entities is not easy. The After the Genome meetings were started in 1995 to help the biological community think about and prepare for the changes in biological research in the face of the oncoming flow of genomic information. 
The term ''After the Genome'' refers to a future in which complete inventories of the gene products of entire organisms become available. Since then, many more biologists have become cognizant of the issues raised by this future, and, in response, the organizers intend to distinguish this meeting from other ''postgenomic'' meetings by bringing together intellectuals from subject fields far outside of conventional biology, with the expectation that this will help focus thinking beyond the immediate future. To this end, After the Genome 5 will bring together industrial and university researchers, including: (1) physicists, chemists, and engineers who are devising and using new data-gathering techniques, such as microarrays, protein mass spectrometry, and single-molecule measurements; (2) computer scientists from fields as diverse as geology and wargames, who have experience moving from broad knowledge of systems to analysis that results in models and simulations; (3) neurobiologists and computer scientists who combine physiological experimentation and computer modeling to understand single cells and small networks of cells; (4) biologists who are trying to model genetic networks; (5) all-around visionary thinkers; and (6) policy makers, to suggest how to convey any good ideas to organizations that can commit resources to them.

  8. Database systems for knowledge-based discovery.

    PubMed

    Jagarlapudi, Sarma A R P; Kishan, K V Radha

    2009-01-01

Several database systems have been developed to provide valuable information, in a structured format, to users ranging from the bench chemist and biologist to the medical practitioner and pharmaceutical scientist. The advent of information technology and computational power has enhanced the ability to access large volumes of data in the form of a database, where one can do compilation, searching, archiving, analysis, and finally knowledge derivation. Although data are of variable types, the tools used for database creation, searching, and retrieval are similar. GVK BIO has been developing databases from publicly available scientific literature in specific areas like medicinal chemistry, clinical research, and mechanism-based toxicity, so that the structured databases containing vast data can be used in several areas of research. These databases were classified as reference-centric or compound-centric, depending on the way the database systems were designed. Integration of these databases with knowledge-derivation tools would enhance their value toward better drug design and discovery.

  9. An introduction to scripting in Ruby for biologists.

    PubMed

    Aerts, Jan; Law, Andy

    2009-07-16

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it.
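The paper's pitch, short and readable code for everyday data handling, can be illustrated with a minimal, self-contained sketch; the chromosome/gene records below are invented for illustration:

```ruby
# Tally how many annotated genes fall on each chromosome from
# tab-separated records: the kind of small data-management task the
# paper targets. The records are made up for this sketch.
records = <<~TSV.lines(chomp: true)
  chr1\tBRCA2
  chr1\tTP53
  chr2\tMYC
TSV

counts = Hash.new(0)                  # default count of 0 for unseen keys
records.each do |line|
  chromosome, _gene = line.split("\t")
  counts[chromosome] += 1
end

counts.sort.each { |chrom, n| puts "#{chrom}: #{n} gene(s)" }
```

A default-valued `Hash` and block iteration keep the code close to the problem; no classes or boilerplate are needed for a task of this size, which is the "shallow learning curve" the authors describe.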

  10. The threat of nuclear war: Some responses

    PubMed Central

    Marcattilio, A. J. M.; Nevin, John A.

    1986-01-01

The possibility of nuclear holocaust threatens the very existence of the world community. Biologists, earth scientists, educators, lawyers, philosophers, physicists, physicians, and social scientists have addressed the problem from their special perspectives, and have had substantial impact on the public. Behavior analysts, however, have not as a whole contributed a great deal to the goal of preventing nuclear catastrophe. We argue that the threat of nuclear war is primarily a behavioral problem, and present an analysis of that problem. In addition, we address the difficulty of implementing behavioral interventions that would contribute to the survival of the world. PMID:22478648

  11. The Science of Human Nature and the Human Nature of Science

    ERIC Educational Resources Information Center

    Menhand, Lois

    2005-01-01

In 1889, German biologist August Weismann showed that mice whose tails are cut off do not produce short-tailed offspring. It was a step forward for science, but a step backward for civilization. Weismann's discovery was good for science because, contrary to what many scientists had believed, acquired characteristics are not, of course,…

  12. Systematic Biology Training and Personnel. Higher Education Surveys Report, Survey Number 10.

    ERIC Educational Resources Information Center

    Celebuski, Carin A.; Farris, Elizabeth

    The Task Force on Global Biodiversity of the National Science Board is charged with developing a course of action for the National Science Foundation to follow to promote responsible management of global biological diversity. Effective management of the problem is hampered by a shortage of systematic biologists--scientists who identify, document,…

  13. A Lab with a View: American Postdocs Abroad

    ERIC Educational Resources Information Center

    Gladfelter, Amy

    2002-01-01

    As recently as the early 1970s, a postdoctoral research experience overseas was a valued part of training for a U.S. biologist aspiring to an academic position. Not only did the U.S. scientists benefit educationally from participating in different laboratory and cultural systems, but labs outside the United States were enriched by the ideas,…

  14. The influence of climate variability and change on the science and practice of restoration ecology

    Treesearch

    Donald A. Falk; Connie Millar

    2016-01-01

    Variation in Earth’s climate system has always been a primary driver of ecosystem processes and biological evolution. In recent decades, however, the prospect of anthropogenically driven change to the climate system has become an increasingly dominant concern for scientists and conservation biologists. Understanding how ecosystems may...

  15. 76 FR 28793 - Office of Biotechnology Activities, Office of Science Policy, Office of the Director; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ...: ``Strategies to Educate Non-Traditional Audiences about Dual Use Research in the Life Sciences: Amateur Biologists and Scientists in Non-Life Science Disciplines;'' (3) update on activities of NSABB Working Groups... Activities, Office of Science Policy, Office of the Director; Notice of Meeting Pursuant to section 10(a) of...

  16. Too Few Choices

    ERIC Educational Resources Information Center

    Murray, Meg

    2007-01-01

    In this article, the author, who is a scientist, a wife and a mother of two preschool children talks about how these two roles exerted a disproportionate impact on her career choices. She is also an X-Gal, one of a group of nine female biologists who have banded together to offer one another advice and support as they seek careers in academic…

  17. In Galápagos … and Uncomfortable with Evolution

    ERIC Educational Resources Information Center

    Cotner, Sehoya; Graczyk, Hannah; Rodríguez Garcia, José Luis; Moore, Randy

    2016-01-01

    In June 2013, the third World Evolution Summit convened on San Cristóbal, hosting scientists from around the world (Paz-y-Miño-C and Espinosa 2013)--neither the first nor likely the last gathering of biologists on these remote islands. Clearly, both locals and an international audience perceive Galápagos as figuring prominently in discourse about…

  18. Historical and cultural fires, tribal management and research issue in Northern California: Trails, fires and tribulations

    Treesearch

    Frank K. Lake

    2013-01-01

    Indigenous people’s detailed traditional knowledge about fire, although superficially referenced in various writings, has not for the most part been analyzed in detail or simulated by resource managers, wildlife biologists, and ecologists. . . . Instead, scientists have developed the principles and theories of fire ecology, fire behavior and effects models, and...

  19. NS and NNS Scientists' Amendments of Dutch Scientific English and Their Impact on Hedging

    ERIC Educational Resources Information Center

    Burrough-Boenisch, Joy

    2005-01-01

    When 45 biologists from eight countries were asked to critically read and amend the English in Discussion sections of three Dutch-authored draft research papers, many of their alterations impacted on the hedging. This article discusses these alterations. In particular, it focuses on the hotspots in the texts, i.e., the points on which several…

  20. Bridging gaps in discovery and development: chemical and biological sciences for affordable health, wellness and sustainability.

    PubMed

    Chauhan, Prem Man Singh

    2011-05-01

    To commemorate 2011 as the International Year of Chemistry, the Indian Society of Chemists and Biologists organized its 15th International Conference on 'Bridging Gaps in Discovery and Development: Chemical and Biological Sciences for Affordable Health, Wellness and Sustainability' at Hotel Grand Bhagwati, in association with Saurashtra University, Rajkot, India. Anamik Shah, President of the Indian Society of Chemists and Biologists, was organizing secretary of the conference. Nicole Moreau, President of the International Union of Pure and Applied Chemistry and Secretary General of the Comité National de la Chimie, National Centre for Scientific Research France, was chief guest of the function. The four-day scientific program included 52 plenary lectures, 24 invited lectures by eminent scientists in the field and 12 oral presentations. A total of 317 posters were presented by young scientists and PhD students in three different poster sessions. Approximately 750 delegates from India, the USA, UK, France, Switzerland, Germany, Austria, Belgium, Sweden, Japan and other countries attended the conference. The majority of the speakers gave presentations related to their current projects and areas of interest and many of the talks covered synthesis, structure-activity relationships, current trends in medicinal chemistry and drug research.

  1. A life scientist, an engineer and a social scientist walk into a lab: challenges of dual-use engagement and education in synthetic biology.

    PubMed

    Edwards, Brett; Kelle, Alexander

    2012-01-01

The discussion of dual-use education is often predicated on a discrete population of practicing life scientists exhibiting certain deficiencies in awareness or expertise. This has led to the claim that there is a greater requirement for awareness raising and education amongst this population. However, there has yet to be an inquiry into the impact of the 'convergent' nature of emerging techno-sciences upon the prospects of dual-use education. The field of synthetic biology, although often portrayed as homogeneous, is in fact composed of various sub-fields and communities. Its practitioners have diverse academic backgrounds. The research institutions that have fostered its development in the UK often have their own sets of norms and practices in engagement with ethical, legal and social issues associated with scientific knowledge and technologies. The area is also complicated by the emergence of synthetic biologists outside traditional research environments, the so-called 'do-it-yourself' or 'garage biologists'. This paper untangles some of the complexities in the current state of synthetic biology and addresses the prospects for dual-use education for practitioners. It provides a short overview of the field and discusses identified dual-use issues. There follows a discussion of UK networks in synthetic biology, including their engagement with ethical, legal, social and dual-use issues and limited educational efforts in relation to these. It concludes by outlining options for developing a more systematic dual-use education strategy for synthetic biology.

  2. The National Cancer Institute's Physical Sciences - Oncology Network

    NASA Astrophysics Data System (ADS)

    Espey, Michael Graham

    In 2009, the NCI launched the Physical Sciences - Oncology Centers (PS-OC) initiative with 12 Centers (U54) funded through 2014. The current phase of the Program includes U54 funded Centers with the added feature of soliciting new Physical Science - Oncology Projects (PS-OP) U01 grant applications through 2017; see NCI PAR-15-021. The PS-OPs, individually and along with other PS-OPs and the Physical Sciences-Oncology Centers (PS-OCs), comprise the Physical Sciences-Oncology Network (PS-ON). The foundation of the Physical Sciences-Oncology initiative is a high-risk, high-reward program that promotes a `physical sciences perspective' of cancer and fosters the convergence of physical science and cancer research by forming transdisciplinary teams of physical scientists (e.g., physicists, mathematicians, chemists, engineers, computer scientists) and cancer researchers (e.g., cancer biologists, oncologists, pathologists) who work closely together to advance our understanding of cancer. The collaborative PS-ON structure catalyzes transformative science through increased exchange of people, ideas, and approaches. PS-ON resources are leveraged to fund Trans-Network pilot projects to enable synergy and cross-testing of experimental and/or theoretical concepts. This session will include a brief PS-ON overview followed by a strategic discussion with the APS community to exchange perspectives on the progression of trans-disciplinary physical sciences in cancer research.

  3. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data.

  4. PATRIC: the Comprehensive Bacterial Bioinformatics Resource with a Focus on Human Pathogenic Species ▿ ‡ #

    PubMed Central

    Gillespie, Joseph J.; Wattam, Alice R.; Cammer, Stephen A.; Gabbard, Joseph L.; Shukla, Maulik P.; Dalay, Oral; Driscoll, Timothy; Hix, Deborah; Mane, Shrinivasrao P.; Mao, Chunhong; Nordberg, Eric K.; Scott, Mark; Schulman, Julie R.; Snyder, Eric E.; Sullivan, Daniel E.; Wang, Chunxia; Warren, Andrew; Williams, Kelly P.; Xue, Tian; Seung Yoo, Hyun; Zhang, Chengdong; Zhang, Yan; Will, Rebecca; Kenyon, Ronald W.; Sobral, Bruno W.

    2011-01-01

    Funded by the National Institute of Allergy and Infectious Diseases, the Pathosystems Resource Integration Center (PATRIC) is a genomics-centric relational database and bioinformatics resource designed to assist scientists in infectious-disease research. Specifically, PATRIC provides scientists with (i) a comprehensive bacterial genomics database, (ii) a plethora of associated data relevant to genomic analysis, and (iii) an extensive suite of computational tools and platforms for bioinformatics analysis. While the primary aim of PATRIC is to advance the knowledge underlying the biology of human pathogens, all publicly available genome-scale data for bacteria are compiled and continually updated, thereby enabling comparative analyses to reveal the basis for differences between infectious free-living and commensal species. Herein we summarize the major features available at PATRIC, dividing the resources into two major categories: (i) organisms, genomes, and comparative genomics and (ii) recurrent integration of community-derived associated data. Additionally, we present two experimental designs typical of bacterial genomics research and report on the execution of both projects using only PATRIC data and tools. These applications encompass a broad range of the data and analysis tools available, illustrating practical uses of PATRIC for the biologist. Finally, a summary of PATRIC's outreach activities, collaborative endeavors, and future research directions is provided. PMID:21896772

  5. An introduction to scripting in Ruby for biologists

    PubMed Central

    Aerts, Jan; Law, Andy

    2009-01-01

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it. PMID:19607723

  6. A guide to the visual analysis and communication of biomolecular structural data.

    PubMed

    Johnson, Graham T; Hertig, Samuel

    2014-10-01

    Biologists regularly face an increasingly difficult task - to effectively communicate bigger and more complex structural data using an ever-expanding suite of visualization tools. Whether presenting results to peers or educating an outreach audience, a scientist can achieve maximal impact with minimal production time by systematically identifying an audience's needs, planning solutions from a variety of visual communication techniques and then applying the most appropriate software tools. A guide to available resources that range from software tools to professional illustrators can help researchers to generate better figures and presentations tailored to any audience's needs, and enable artistically inclined scientists to create captivating outreach imagery.

  7. Research on climate impacts to forests began early at Fort Valley Experimental Forest

    Treesearch

    Brian W. Geils; Susan D. Olberding

    2012-01-01

    Even before Arizona was a state, government scientists walked and rode across its broad, open landscapes from nearly sea level to over 12,000 feet of elevation, observing its diverse vegetation and climate. In 1889, biologist C. Hart Merriam traversed northern Arizona and found six of the seven world life zones he would later describe by latitude and elevation. The...

  8. Structural biologists capture detailed image of gene regulator’s fleeting form | Center for Cancer Research

    Cancer.gov

Using an ultrafast, high-intensity radiation source called an X-ray free-electron laser (XFEL), scientists have captured an atomic-level picture of an RNA structure called a riboswitch as it reorganizes itself to regulate protein production. The structure they visualized has never before been seen, and likely exists for only milliseconds after the riboswitch first encounters its activating molecule.

  9. HpBase: A genome database of a sea urchin, Hemicentrotus pulcherrimus.

    PubMed

    Kinjo, Sonoko; Kiyomoto, Masato; Yamamoto, Takashi; Ikeo, Kazuho; Yaguchi, Shunsuke

    2018-04-01

To understand the mystery of life, it is important to accumulate genomic information for various organisms, because the whole genome encodes the instructions for all the genes. Since the genome of Strongylocentrotus purpuratus was sequenced in 2006 as the first sequenced genome in echinoderms, the genomic resources of other North American sea urchins have gradually been accumulated, but no sea urchin genomes are available from other areas, where many scientists have used local species and reported important results. In this manuscript, we report a draft genome of the sea urchin Hemicentrotus pulcherrimus, because this species has a long history as a target of developmental and cell biology in East Asia. The genome of H. pulcherrimus was assembled into 16,251 scaffold sequences with an N50 length of 143 kbp, and approximately 25,000 genes were identified in the genome. The size of the genome and the sequencing coverage were estimated to be approximately 800 Mbp and 100×, respectively. To provide these data and the associated annotation, we constructed a database, HpBase (http://cell-innovation.nig.ac.jp/Hpul/). In HpBase, gene searches, genome browsing, and BLAST searches are available. In addition, HpBase includes the "recipes" for experiments from each lab using H. pulcherrimus. These recipes will continue to be updated according to the circumstances of individual scientists and can be powerful tools for experimental biologists and for the community. HpBase is a suitable dataset for evolutionary, developmental, and cell biologists to compare H. pulcherrimus genomic information with that of other species and to isolate gene information. © 2018 Japanese Society of Developmental Biologists.

  10. Integrating Omics Technologies to Study Pulmonary Physiology and Pathology at the Systems Level

    PubMed Central

    Pathak, Ravi Ramesh; Davé, Vrushank

    2014-01-01

Assimilation and integration of "omics" technologies, including genomics, epigenomics, proteomics, and metabolomics, have readily altered the landscape of medical research in the last decade. The vast and complex nature of omics data can only be interpreted by linking molecular information at the organismic level, forming the foundation of systems biology. Research in pulmonary biology/medicine has necessitated integration of omics, network, systems, and computational biology data to differentially diagnose, interpret, and prognosticate pulmonary diseases, facilitating improvement in therapy and treatment modalities. This review describes how to leverage this emerging technology to understand pulmonary diseases at the systems level, in what we call a "systomic" approach. Considering the operational wholeness of cellular and organ systems, the diseased genome, proteome, and metabolome need to be conceptualized at the systems level to understand disease pathogenesis and progression. Currently available omics technologies and resources require a certain degree of training and proficiency, in addition to dedicated hardware and applications, making them relatively less user-friendly for pulmonary biologists and clinicians. Herein, we discuss the various strategies, computational tools, and approaches required to study pulmonary diseases at the systems level for biomedical scientists and clinical researchers. PMID:24802001

  11. Research, conservation, and collaboration: The role of visiting scientists in developing countries

    USGS Publications Warehouse

    Foster, Mercedes S.

    1993-01-01

    As awareness of environmental problems and the need to protect our natural resources or use them wisely has grown, scientists have become increasingly interested in conservation. Some individuals are involved in conservation-related activities through research or teaching, but most of us participate only as citizens concerned about the world in which we live. Often, we decline to take an active role in conservation issues because we think that "it will take too much time away from our science," or that it is "too much trouble." Both perspectives, I think, are inaccurate. Sometimes investigators fail to participate because they are ignorant of the ways in which scientists (or scientific organizations) interface with conservation - in other words, of how one goes about getting personally involved. Whatever the reason, this lack of involvement is unfortunate, because scientists, and especially "whole organism" biologists (including ornithologists), can make unique contributions to conservation programs, as scientists, without a significant increase in effort or any change in the quality of their work. At the same time, they reap both professional and personal rewards.

  12. Science through the Internet: Researching, Evaluating and Citing Websites.

    PubMed

    Lee, L EJ; Misser, E

    1999-01-08

This article attempts to convey the joys and frustrations of skimming the Internet trying to find relevant information concerning an academic's work as a scientist, a student or an instructor. A brief overview of the Internet and the "do's and don'ts" for the neophyte as well as for the more seasoned "navigator" are given. Some guidelines on "what works and what does not" and "what is out there" are provided for the scientist, with specific emphasis for biologists, as well as for all others having an interest in science but with little interest in spending countless hours "surfing the net". An extensive but not exhaustive list of related websites is provided.

  13. Science through the Internet: Researching, Evaluating and Citing Websites

    PubMed Central

    Misser, E

    1998-01-01

This article attempts to convey the joys and frustrations of skimming the Internet trying to find relevant information concerning an academic's work as a scientist, a student or an instructor. A brief overview of the Internet and the "do's and don'ts" for the neophyte as well as for the more seasoned "navigator" are given. Some guidelines on "what works and what does not" and "what is out there" are provided for the scientist, with specific emphasis for biologists, as well as for all others having an interest in science but with little interest in spending countless hours "surfing the net". An extensive but not exhaustive list of related websites is provided. PMID:12734595

  14. Promoting the confluence of tropical cyclone research.

    PubMed

    Marler, Thomas E

    2015-01-01

    Contributions of biologists to tropical cyclone research may improve by integrating concepts from other disciplines. Employing accumulated cyclone energy into protocols may foster greater integration of ecology and meteorology research. Considering experienced ecosystems as antifragile instead of just resilient may improve cross-referencing among ecological and social scientists. Quantifying ecosystem capital as distinct from ecosystem services may improve integration of tropical cyclone ecology research into the expansive global climate change research community.

  16. Intelligent Interfaces for Mining Large-Scale RNAi-HCS Image Databases

    PubMed Central

    Lin, Chen; Mak, Wayne; Hong, Pengyu; Sepp, Katharine; Perrimon, Norbert

    2010-01-01

Recently, high-content screening (HCS) has been combined with RNA interference (RNAi) to become an essential image-based high-throughput method for studying genes and biological networks through RNAi-induced cellular phenotype analyses. However, a genome-wide RNAi-HCS screen typically generates tens of thousands of images, most of which remain uncategorized due to the inadequacies of existing HCS image analysis tools. Until now, it has still required highly trained scientists to browse a prohibitively large RNAi-HCS image database and produce only a handful of qualitative results regarding cellular morphological phenotypes. For this reason, we have developed intelligent interfaces to facilitate the application of HCS technology in biomedical research. Our new interfaces empower biologists with the computational power not only to explore large-scale RNAi-HCS image databases effectively and efficiently, but also to apply their knowledge and experience to interactive mining of cellular phenotypes using Content-Based Image Retrieval (CBIR) with Relevance Feedback (RF) techniques. PMID:21278820

  17. The Biological Connection Markup Language: a SBGN-compliant format for visualization, filtering and analysis of biological pathways.

    PubMed

    Beltrame, Luca; Calura, Enrica; Popovici, Razvan R; Rizzetto, Lisa; Guedez, Damariz Rivero; Donato, Michele; Romualdi, Chiara; Draghici, Sorin; Cavalieri, Duccio

    2011-08-01

    Many models and analyses of signaling pathways have been proposed. However, none of them takes into account that a biological pathway is not a fixed system; instead, it depends on the organism, tissue and cell type, as well as on physiological, pathological and experimental conditions. The Biological Connection Markup Language (BCML) is a format to describe, annotate and visualize pathways. BCML can store multiple types of information, permitting a selective view of the pathway as it exists and/or behaves in specific organisms, tissues and cells. Furthermore, BCML can be automatically converted into data formats suitable for analysis and into a fully SBGN-compliant graphical representation, making it an important tool that can be used by both computational biologists and 'wet lab' scientists. The XML schema and the BCML software suite are freely available under the LGPL for download at http://bcml.dc-atlas.net. They are implemented in Java and supported on MS Windows, Linux and OS X.
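A selective, per-organism view of a pathway of the kind BCML enables might look as follows; the element and attribute names are invented for illustration and are not the real BCML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, BCML-inspired markup: tag and attribute names are
# illustrative only, not taken from the actual BCML XML schema.
doc = """
<pathway name="toy-signaling">
  <node id="RTK"  organism="human mouse"/>
  <node id="RAS"  organism="human mouse yeast"/>
  <node id="ERK5" organism="mouse"/>
  <edge from="RTK" to="RAS"/>
  <edge from="RAS" to="ERK5"/>
</pathway>
"""

def selective_view(xml_text, organism):
    """Keep only nodes annotated for the given organism, and edges whose
    endpoints both survive: a minimal 'pathway as it exists in organism X'."""
    root = ET.fromstring(xml_text)
    kept = {n.get("id") for n in root.iter("node")
            if organism in n.get("organism", "").split()}
    edges = [(e.get("from"), e.get("to")) for e in root.iter("edge")
             if e.get("from") in kept and e.get("to") in kept]
    return kept, edges

nodes, edges = selective_view(doc, "human")
assert nodes == {"RTK", "RAS"}
assert edges == [("RTK", "RAS")]
```

The same document, filtered for "mouse", would retain all three nodes and both edges, which is the sense in which one file can describe the pathway under many conditions.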

  18. Cloud Technology May Widen Genomic Bottleneck - TCGA

    Cancer.gov

    Computational biologist Dr. Ilya Shmulevich suggests that renting cloud computing power might widen the bottleneck for analyzing genomic data. Learn more about his experience with the Cloud in this TCGA in Action Case Study.

  19. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

    Communication is a fundamental function of life, and it exists in almost all living things: from single-celled bacteria to human beings. Communication, competition and cooperation are three fundamental processes in nature. Computer scientists are familiar with the study of competition, or the 'struggle for life', through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, may be on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The Handicap principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical fact that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. 
(iii) To pose several open questions, the answers to which may offer fresh insights into trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is evolutionary game theory modeling and its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (dishonesty).
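The core of the Handicap principle, that a signal stays honest only when faking it is too costly for the wrong type, reduces to a pair of inequalities in the simplest case. This is a deliberate toy simplification of the Sir Philip Sidney game, with purely illustrative parameter values:

```python
def honest_signaling_is_stable(benefit, cost_needy, cost_healthy):
    """Handicap-style honesty check: signaling pays only for the type that
    truly needs the resource. The needy sender gains (benefit > its cost),
    while a healthy sender would lose by faking (benefit < its cost).
    A toy inequality check, not the full Sir Philip Sidney game."""
    needy_signals = benefit > cost_needy
    healthy_fakes = benefit > cost_healthy
    return needy_signals and not healthy_fakes

# Costs are illustrative: the same signal is cheaper for the needy type.
assert honest_signaling_is_stable(benefit=5.0, cost_needy=1.0, cost_healthy=8.0)
# If the signal is cheap for everyone, faking pays and honesty breaks down.
assert not honest_signaling_is_stable(benefit=5.0, cost_needy=1.0, cost_healthy=2.0)
```

The analogy to trust in computer security is that a credential or proof-of-work is trustworthy only when producing it dishonestly costs more than the deception would gain.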

  20. Zuivere en toegepaste wetenschap in de tropen : biologisch onderzoek aan particuliere proefstations in Nederlands-Indië 1870-1940

    NASA Astrophysics Data System (ADS)

    van der Schoor, W. J.

    2012-04-01

    Most experiment stations originated from the cooperation between entrepreneurs and the government. From the 1890s onwards, the government, together with the well organised colonial entrepreneurs, established research departments for several plantation crops at the Botanical Gardens at Buitenzorg (now Bogor), that eventually became independent experiment stations in the first decades of the twentieth century. By the 1920s, the ‘proefstationswezen’ (experiment station system) numbered some fifteen private experiment stations or sub-stations. After the war, the private experiment stations together with the government experiment stations at Buitenzorg were to provide the backbone of Indonesian agricultural science. Dutch biologists, in particular, made a striking plea for pursuing the natural sciences in the tropical colonies. First, they pointed out the scientific importance of the tropics. Secondly, they stressed the role of the natural sciences, in particular biology, as a natural ally of colonial agriculture. Pure science was seen as a leading force for technical and social progress. The third motive was the cultural value of science for the Netherlands and its colonies. The cultivation of science in the colonies gave international prestige and strengthened self-confidence in the imperial struggle around 1900. Science had a civilising effect; scientific research, however, was to remain in the hands of western, colonial scientists. From the 1880s and 1890s onward, the experiment stations in the Indies were characterised by their strategic aims and scientific orientation. Up to 1910, the ‘academic’ views of biologists like Treub and Went concerning science and practice were predominant, and research was considered to be the central aim. From 1910 onwards, advice became more central and special extension services were established at the experiment stations. Due to diverging views of science, tasks and aims became a battlefield for discussions in the next decades. 
In the background of these debates were the rise of Wageningen Agricultural College, the rise and institutionalisation of the applied agricultural sciences, and the increasing competition between Wageningen-trained and university-trained scientists. Genetics and breeding in particular were at the core of the research programmes. The practical aim of the breeding work, however, did not leave much opportunity for more fundamental investigations. The impetus for pure research came from individual researchers. In tobacco and sugar cane breeding, new scientific theories provided inspiration, but to a large extent the practical breeding work built on nineteenth-century breeding techniques. In many respects, plant breeding and university genetics became separate disciplines. Roughly one in six Dutch biologists worked for at least some time in the colony. The colonial experiment stations instilled a practical and pragmatic attitude into quite a number of Dutch biologists. In addition, the experiment station system provided Dutch biologists with an extensive network and international contacts with fellow scientists, entrepreneurs and captains of industry. The scientific nationalism of Treub and Went, the bloom of the experiment stations and the ambitions of the Indies colonial elite did not result in the establishment of an independent, ‘Indische’ scientific community. Essentially, the Dutch East Indies remained an exploitation province of Dutch science.

  1. Calculating life? Duelling discourses in interdisciplinary systems biology.

    PubMed

    Calvert, Jane; Fujimura, Joan H

    2011-06-01

    A high profile context in which physics and biology meet today is in the new field of systems biology. Systems biology is a fascinating subject for sociological investigation because the demands of interdisciplinary collaboration have brought epistemological issues and debates front and centre in discussions amongst systems biologists in conference settings, in publications, and in laboratory coffee rooms. One could argue that systems biologists are conducting their own philosophy of science. This paper explores the epistemic aspirations of the field by drawing on interviews with scientists working in systems biology, attendance at systems biology conferences and workshops, and visits to systems biology laboratories. It examines the discourses of systems biologists, looking at how they position their work in relation to previous types of biological inquiry, particularly molecular biology. For example, they raise the issue of reductionism to distinguish systems biology from molecular biology. This comparison with molecular biology leads to discussions about the goals and aspirations of systems biology, including epistemic commitments to quantification, rigor and predictability. Some systems biologists aspire to make biology more similar to physics and engineering by making living systems calculable, modelable and ultimately predictable, a research programme that is perhaps taken to its most extreme form in systems biology's sister discipline: synthetic biology. Other systems biologists, however, do not think that the standards of the physical sciences are the standards by which we should measure the achievements of systems biology, and doubt whether such standards will ever be applicable to 'dirty, unruly living systems'. This paper explores these epistemic tensions and reflects on their sociological dimensions and their consequences for future work in the life sciences. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Science in Places of Grandeur: Communication and Engagement in National Parks.

    PubMed

    Watkins, Tim; Miller-Rushing, Abraham J; Nelson, Sarah J

    2018-05-14

    The United States has set aside over 400 national parks and other protected areas to be managed by the National Park Service (NPS). Collectively, these sites attract over 300 million visits per year, which makes the NPS one of the largest informal education institutions in the country. Because the NPS supports and facilitates scientific studies in parks, the national park system provides abundant opportunity for biologists and other scientists to engage global audiences in learning, exploring, and even conducting science. Those opportunities are best pursued through collaborations among scientists and the professional communication staff (interpreters, educators, media specialists, etc.) of parks and their partner organizations. This article describes unique opportunities and rationale for such collaborations, presents several examples that highlight the range of activities and lessons drawn from them, and invites scientists to conduct studies in parks and bring their science into the public eye.

  3. Speeding Up Ecological and Evolutionary Computations in R; Essentials of High Performance Computing for Biologists

    PubMed Central

    Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke

    2015-01-01

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
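The profile-then-optimize workflow this review advocates (its aprof package targets R) can be sketched with Python's standard-library cProfile and timeit modules; the toy mean functions below are illustrative stand-ins for a real analysis, not code from the paper:

```python
import cProfile
import io
import pstats
import timeit

def mean_loop(xs):
    """Naive running mean via an explicit interpreted loop."""
    total = 0.0
    for x in xs:
        total += x
    return total / len(xs)

def mean_builtin(xs):
    """Same computation delegated to the optimized built-in sum()."""
    return sum(xs) / len(xs)

xs = list(range(1, 100001))
assert mean_loop(xs) == mean_builtin(xs)  # identical answers

# Step 1: profile first, so effort goes to the actual bottleneck.
pr = cProfile.Profile()
pr.enable()
mean_loop(xs)
pr.disable()
report = io.StringIO()
pstats.Stats(pr, stream=report).sort_stats("cumulative").print_stats(3)

# Step 2: measure whether the rewrite actually helps before adopting it.
t_loop = timeit.timeit(lambda: mean_loop(xs), number=20)
t_fast = timeit.timeit(lambda: mean_builtin(xs), number=20)
assert t_fast < t_loop  # the delegated version is consistently faster
```

The discipline matters more than the language: profile, change one thing, and re-measure, rather than guessing where the time goes.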

  4. Demystifying computer science for molecular ecologists.

    PubMed

    Belcaid, Mahdi; Toonen, Robert J

    2015-06-01

    In this age of data-driven science and high-throughput biology, computational thinking is becoming an increasingly important skill for tackling both new and long-standing biological questions. However, despite its obvious importance and conspicuous integration into many areas of biology, computer science is still viewed as an obscure field that has, thus far, permeated into only a few of the biology curricula across the nation. A national survey has shown that lack of computational literacy in environmental sciences is the norm rather than the exception [Valle & Berdanier (2012) Bulletin of the Ecological Society of America, 93, 373-389]. In this article, we seek to introduce a few important concepts in computer science, with the aim of providing a context-specific introduction for research biologists. Our goal was to help biologists understand some of the most important mainstream computational concepts to better appreciate bioinformatics methods and trade-offs that are not obvious to the uninitiated. © 2015 John Wiley & Sons Ltd.

  5. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background: In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results: We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion: By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  6. NDEx - the Network Data Exchange, A Network Commons for Biologists | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.

  7. Promoting the confluence of tropical cyclone research

    PubMed Central

    Marler, Thomas E

    2015-01-01

    Contributions of biologists to tropical cyclone research may improve by integrating concepts from other disciplines. Employing accumulated cyclone energy into protocols may foster greater integration of ecology and meteorology research. Considering experienced ecosystems as antifragile instead of just resilient may improve cross-referencing among ecological and social scientists. Quantifying ecosystem capital as distinct from ecosystem services may improve integration of tropical cyclone ecology research into the expansive global climate change research community. PMID:26480001

  8. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may refer to "typology". Other frequently encountered terms are pattern...the determinantal equation |B − λW| = 0. The solutions λᵢ are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non...Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor
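The determinantal equation |B − λW| = 0 from the snippet above is the generalized eigenvalue problem of discriminant analysis. A minimal sketch for the 2×2 case, expanding the determinant by hand into a quadratic in λ; the between-group (B) and within-group (W) scatter matrices are illustrative toys, not data from the report:

```python
import math

def gen_eigenvalues_2x2(B, W):
    """Solve the determinantal equation |B - lambda*W| = 0 for 2x2 matrices.

    Expanding the determinant gives a quadratic in lambda:
      det(W)*l^2 - (b11*w22 + b22*w11 - b12*w21 - b21*w12)*l + det(B) = 0
    whose roots are the eigenvalues of W^-1 B.
    """
    (b11, b12), (b21, b22) = B
    (w11, w12), (w21, w22) = W
    a = w11 * w22 - w12 * w21          # det(W)
    b = -(b11 * w22 + b22 * w11 - b12 * w21 - b21 * w12)
    c = b11 * b22 - b12 * b21          # det(B)
    r = math.sqrt(b * b - 4 * a * c)   # real for symmetric B, pos.-def. W
    return sorted([(-b - r) / (2 * a), (-b + r) / (2 * a)])

def det_shift(B, W, lam):
    """det(B - lam*W), used to verify a candidate eigenvalue is a root."""
    (b11, b12), (b21, b22) = B
    (w11, w12), (w21, w22) = W
    return (b11 - lam * w11) * (b22 - lam * w22) - (b12 - lam * w12) * (b21 - lam * w21)

# Toy scatter matrices for a two-group, two-variable problem.
B = [[4.0, 1.0], [1.0, 3.0]]
W = [[2.0, 0.0], [0.0, 1.0]]
eigs = gen_eigenvalues_2x2(B, W)
for lam in eigs:
    assert abs(det_shift(B, W, lam)) < 1e-9  # each root satisfies |B - lW| = 0
```

In practice one would hand larger matrices to a generalized-eigenvalue routine; the 2×2 expansion is only meant to make the determinantal equation concrete.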

  9. From bench to bar: careers in patent law for molecular biologists

    PubMed Central

    Machin, Nathan A.

    2013-01-01

    Leaving science to pursue a career in patent law requires a considerable investment of time and energy, and possibly money, with no guarantee of finding a job or of returning to science should the decision prove infelicitous. Yet the large number of former scientists now practicing patent law shows that it can be done. I provide suggestions for investigating the potential opportunities, costs, risks, and rewards of this career path. PMID:23813843

  10. The mentoring of male and female scientists during their doctoral studies

    NASA Astrophysics Data System (ADS)

    Filippelli, Laura Ann

    The mentoring relationships of male and female scientists during their doctoral studies were examined. Male and female biologists, chemists, engineers and physicists were compared regarding the importance of doctoral students receiving career-enhancing and psychosocial mentoring from their doctoral chairperson and student colleagues. Scientists' satisfaction with their chairperson and colleagues as providers of these mentoring functions was also investigated. In addition, scientists identified individuals other than their chairperson and colleagues who were positive influencers on their professional development as scientists and those who hindered their development. A reliable instrument, "The Survey of Accomplished Scientists' Doctoral Experiences," was developed to assess career-enhancing and psychosocial mentoring of doctoral chairpersons and student colleagues, based on the review of literature, interviews with scientists and two pilot studies. Surveys were mailed to a total of 400 men and women scientists with earned doctorates, of which 209 were completed and returned. The findings reveal that female scientists considered it more important than men did for the doctoral chairperson to furnish career-enhancing mentoring, while both groups agreed on the importance of psychosocial mentoring. In addition, female scientists were not as satisfied as men with their chairperson providing most of the career-enhancing and psychosocial mentoring functions. Compared with men, female scientists also considered student colleagues more important in providing career-enhancing and psychosocial mentoring. However, male and female scientists were equally satisfied with their colleagues as providers of these mentoring functions. 
Lastly, the majority of male scientists indicated that professors served as positive influencers, while women revealed that spouses and friends positively influenced their professional development as scientists. Several recommended changes in science departments are provided.

  11. Facing the Challenges of Accessing, Managing, and Integrating Large Observational Datasets in Ecology: Enabling and Enriching the Use of NEON's Observational Data

    NASA Astrophysics Data System (ADS)

    Thibault, K. M.

    2013-12-01

    As the construction of NEON and its transition to operations progresses, more and more data will become available to the scientific community, both from NEON directly and from the concomitant growth of existing data repositories. Many of these datasets include ecological observations of a diversity of taxa in both aquatic and terrestrial environments. Although observational data have been collected and used throughout the history of organismal biology, the field has not yet fully developed a culture of data management, documentation, standardization, sharing and discoverability to facilitate the integration and synthesis of datasets. Moreover, the tools required to accomplish these goals, namely database design, implementation, and management, and automation and parallelization of analytical tasks through computational techniques, have not historically been included in biology curricula, at either the undergraduate or graduate levels. To ensure the success of data-generating projects like NEON in advancing organismal ecology and to increase transparency and reproducibility of scientific analyses, an acceleration of the cultural shift to open science practices, the development and adoption of data standards, such as the DarwinCore standard for taxonomic data, and increased training in computational approaches for biologists need to be realized. Here I highlight several initiatives that are intended to increase access to and discoverability of publicly available datasets and equip biologists and other scientists with the skills that are needed to manage, integrate, and analyze data from multiple large-scale projects. The EcoData Retriever (ecodataretriever.org) is a tool that downloads publicly available datasets, re-formats the data into an efficient relational database structure, and then automatically imports the data tables onto a user's local drive into the database tool of the user's choice. 
The automation of these tasks results in nearly instantaneous execution of tasks that previously required hours to days of each data user's time, with decreased error rates and increased usability of the data. The Ecological Data wiki (ecologicaldata.org) provides a forum for users of ecological datasets to share relevant metadata and tips and tricks for using the data, in order to flatten learning curves, as well as minimize redundancy of efforts among users of the same datasets. Finally, Software Carpentry (software-carpentry.org) has developed curricula for scientific computing and provides both online training and low cost, short courses that can be tailored to the specific needs of the students. Demand for these courses has been increasing exponentially in recent years, and they represent a significant educational resource for biologists. I will conclude by linking these initiatives to the challenges facing ecologists related to the effective and efficient exploitation of NEON's diverse data streams.

  12. Implicit Theories of Creativity in Computer Science in the United States and China

    ERIC Educational Resources Information Center

    Tang, Chaoying; Baer, John; Kaufman, James C.

    2015-01-01

    To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…

  13. Wet Lab Accelerator: A Web-Based Application Democratizing Laboratory Automation for Synthetic Biology.

    PubMed

    Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S

    2017-01-20

    Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology experiments via robotic control without the need for any programming knowledge. A drag-and-drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into the programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into the lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.

  14. SCHIP: Statistics for Chromosome Interphase Positioning Based on Interchange Data

    NASA Technical Reports Server (NTRS)

    Vives, Sergi; Loucas, Bradford; Vazquez, Mariel; Brenner, David J.; Sachs, Rainer K.; Hlatky, Lynn; Cornforth, Michael; Arsuaga, Javier

    2005-01-01

    The position of chromosomes in the interphase nucleus is believed to be associated with a number of biological processes. Here, we present a web-based application that helps analyze the relative position of chromosomes during interphase in human cells, based on observed radiogenic chromosome aberrations. The inputs of the program are a table of yields of pairwise chromosome interchanges and a proposed chromosome geometric cluster. Each can either be uploaded or selected from provided datasets. The main outputs are P-values for the proposed chromosome clusters. SCHIP is designed to be used by a number of scientific communities interested in nuclear architecture, including cancer and cell biologists, radiation biologists and mathematical/computational biologists.
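The kind of test SCHIP reports, a P-value for a proposed chromosome cluster given a table of pairwise interchange yields, can be approximated with a simple Monte Carlo permutation test; the statistic and the toy yield table below are illustrative, not SCHIP's actual method:

```python
import itertools
import random

def intra_cluster_yield(yields, cluster):
    """Sum of pairwise interchange yields among chromosomes in the cluster."""
    return sum(yields[frozenset(p)] for p in itertools.combinations(cluster, 2))

def permutation_p_value(yields, chroms, cluster, n_perm=2000, seed=0):
    """Monte Carlo p-value: how often does a random relabeling of chromosomes
    give an intra-cluster yield at least as large as the observed one?
    Illustrative statistic only; SCHIP's published test may differ."""
    rng = random.Random(seed)
    observed = intra_cluster_yield(yields, cluster)
    hits = 0
    for _ in range(n_perm):
        perm = dict(zip(chroms, rng.sample(chroms, len(chroms))))
        relabeled = [perm[c] for c in cluster]
        if intra_cluster_yield(yields, relabeled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction avoids p = 0

# Toy yields: chromosomes 1 and 2 exchange far more often with each other,
# as expected if they sit close together in the interphase nucleus.
chroms = ["1", "2", "3", "4"]
yields = {frozenset(p): 1 for p in itertools.combinations(chroms, 2)}
yields[frozenset(("1", "2"))] = 20
p = permutation_p_value(yields, chroms, cluster=["1", "2"])
assert 0.0 < p < 0.5  # the proposed cluster is enriched for interchanges
```

A small p-value here says the proposed cluster exchanges among itself more than random chromosome labels would, which is the logic behind inferring proximity from aberration data.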

  15. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  16. Improving interactions between animal rights groups and conservation biologists.

    PubMed

    Perry, Dan; Perry, Gad

    2008-02-01

    Invasive species are often considered to be a major threat to biodiversity, often leading conservation biologists to recommend their complete eradication. Animal rights groups typically categorically oppose killing animals, and their opposition has brought eradication attempts of gray squirrels in northern Italy (Europe) and mute swans in Vermont to a halt. As a result, native red squirrels may disappear from Europe, and the swans are expected to cause ecosystem-wide impacts. In contrast, cooperation between managers and animal rights groups has resulted in a successful control program for feral pigs in Fort Worth, Texas (U.S.A.). The philosophical differences between animal rights and conservation biologists' views make cooperation seem unlikely, yet documented cases of cooperation have been beneficial for both groups. We recommend that managers dealing with invasive species should consult with social scientists and ethicists to gain a better understanding of the implications of some of their policy decisions. In addition, we recommend that animal rights groups do more to support alternatives to lethal control, which are often excluded by economic limitations. Prevention of arrival of invasive species via application of the precautionary principle may be an especially productive avenue for such collaboration because it fits the goals and values of both groups.

  17. Intelligently deciphering unintelligible designs: algorithmic algebraic model checking in systems biology

    PubMed Central

    Mishra, Bud

    2009-01-01

    Systems biology, as a subject, has captured the imagination of both biologists and systems scientists alike. But what is it? This review provides one researcher's somewhat idiosyncratic view of the subject, but also aims to persuade young scientists to examine the possible evolution of this subject in a rich historical context. In particular, one may wish to read this review to envision a subject built out of a consilience of many interesting concepts from systems sciences, logic and model theory, and algebra, culminating in novel tools, techniques and theories that can reveal deep principles in biology—seen beyond mere observations. A particular focus in this review is on approaches embedded in an embryonic program, dubbed ‘algorithmic algebraic model checking’, and its powers and limitations. PMID:19364723

  18. Trends in tissue repair and regeneration.

    PubMed

    Galliot, Brigitte; Crescenzi, Marco; Jacinto, Antonio; Tajbakhsh, Shahragim

    2017-02-01

    The 6th EMBO conference on the Molecular and Cellular Basis of Regeneration and Tissue Repair took place in Paestum (Italy) on the 17th-21st September, 2016. The 160 scientists who attended discussed the importance of cellular and tissue plasticity, biophysical aspects of regeneration, the diverse roles of injury-induced immune responses, strategies to reactivate regeneration in mammals, links between regeneration and ageing, and the impact of non-mammalian models on regenerative medicine. © 2017. Published by The Company of Biologists Ltd.

  19. Space Radiation and Cataracts (LBNL Summer Lecture Series)

    ScienceCinema

    Blakely, Eleanor [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Life Sciences Division

    2018-01-23

    Summer Lecture Series 2009: Eleanor Blakely, a radiation biologist in the Life Sciences Division at Lawrence Berkeley National Laboratory, has been a scientist at Berkeley Lab since 1975. She is studying the effects of radiation on cataracts, which concern not only cancer patients but also astronauts. As astronauts spend increasingly longer periods in space, the effects of cosmic radiation exposure will become an increasingly important health issue, yet there is little human data on these effects. Blakely reviews this emerging field and the contributions made at Berkeley Lab.

  20. Will Russian Scientists Go Rogue? A Survey on the Threat and the Impact of Western Assistance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ball, D Y; Gerber, T P

    2004-12-27

    The collapse of the Soviet Union sparked fears throughout the world that rogue nations and terrorist organizations would gain access to weapons of mass destruction (WMD). One specific concern has been 'WMD brain drain.' Russians with knowledge about nuclear, chemical, and biological weapons could now depart to any country of their choice, including rogue nations seeking to produce WMD. Meanwhile, Russian science fell into a protracted crisis, with plummeting salaries, little funding for research, and few new recruits to science. These developments increased both the incentives and the opportunities for scientists to sell their knowledge to governments and terrorist organizations with hostile intentions toward the United States. Recognizing the threat of WMD brain drain from Russia, the United States, and other governments implemented a host of programs designed to reduce the risk. Despite, or perhaps partly because of, massive assistance from the West to prevent scientists with WMD knowledge from emigrating, the threat of Russian WMD brain drain has recently faded from view. Yet we have seen no evidence that these programs are effective and little systematic assessment of the current threat of WMD migration. Our data from an unprecedented survey of 602 Russian physicists, biologists, and chemists suggest that the threat of WMD brain drain from Russia should still be at the forefront of our attention. Roughly 20 percent of Russian physicists, biologists, and chemists say they would consider working in rogue nations such as North Korea, Iran, Syria, or Iraq (still considered a rogue state at the time of the survey). At the same time, the data reveal that U.S. and Western nonproliferation assistance programs work. They significantly reduce the likelihood that Russian scientists would consider working in these countries. Moreover, Russian grants do not reduce scientists' propensity to 'go rogue'. These survey findings have clear policy implications: the U.S. 
and its allies must continue to adequately fund nonproliferation assistance programs rather than hastily declare victory. The U.S. should remain engaged with former Soviet WMD scientists until they are willing and able to find support for their research from competitive, civilian-oriented, privately funded projects. Otherwise, we run a great risk that WMD expertise will migrate from the former Soviet Union to countries or organizations that harbor hostile intentions toward the U.S. Assistance programs work to reduce the threat of WMD brain drain, but their task is not complete. Now is not the time to pull back.

  1. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
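
    The "fill in pre-defined model routines" idiom the abstract describes can be sketched in miniature. Biocellion itself is a parallel C++ framework; the class names, growth rule, and serial loop below are invented for illustration only, to show how the framework owns the simulation loop while the modeler supplies only per-agent rules.

```python
# Hypothetical sketch of the routine-filling pattern (not Biocellion's API).

class ModelRoutines:
    """Hooks the modeler fills in; the framework calls them."""
    def init_agents(self):
        raise NotImplementedError
    def update_agent(self, agent, agents):
        raise NotImplementedError

class Framework:
    """The framework owns the (here, serial) simulation loop.
    A real parallel framework would distribute this loop."""
    def __init__(self, model):
        self.model = model
        self.agents = model.init_agents()
    def step(self):
        self.agents = [self.model.update_agent(a, self.agents)
                       for a in self.agents]

class GrowthModel(ModelRoutines):
    """Toy model: each cell grows 10% per step."""
    def init_agents(self):
        return [1.0, 2.0]          # initial cell sizes
    def update_agent(self, agent, agents):
        return agent * 1.1

sim = Framework(GrowthModel())
sim.step()
print(sim.agents)
```

    The point of the pattern is that `GrowthModel` contains no scheduling or communication code at all; swapping in a different `ModelRoutines` subclass changes the biology without touching the engine.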

  2. Conscience dilemma: to become a bioengineer or to survive as a biologist.

    PubMed

    Selimoglu, Sureyya Mert

    2014-01-01

    Bioengineering is the consideration of biological problems from a modern engineering, and therefore money-oriented, perspective. Today, grant-giving bodies always favor bioengineering projects over pure biology projects (like those in ecology, entomology, etc.). Therefore, today's biologist is forced onto the horns of a dilemma. They have to either submit a very powerful and valid justification for their project proposal, or change the project to one with the potential for a money-based outcome. On the other hand, because it deals with the living components of nature, conducting research in pure biology is a kind of worship. For this reason, from a believer scientist's view, a deviation (in terms of research) from biology to bioengineering can be considered like committing a sin. Unfortunately, today's wild capitalism has been creating new sinners day by day, and this system will continue for the foreseeable future unless grant-giving bodies comprehend the real importance of pure biology.

  3. Conceptual biology, hypothesis discovery, and text mining: Swanson's legacy.

    PubMed

    Bekhuis, Tanja

    2006-04-03

    Innovative biomedical librarians and information specialists who want to expand their roles as expert searchers need to know about profound changes in biology and parallel trends in text mining. In recent years, conceptual biology has emerged as a complement to empirical biology. This is partly in response to the availability of massive digital resources, such as the network of databases for molecular biologists at the National Center for Biotechnology Information. Developments in text mining and hypothesis discovery systems based on the early work of Swanson, a mathematician and information scientist, are coincident with the emergence of conceptual biology. Very little has been written to introduce biomedical digital librarians to these new trends. In this paper, background for data and text mining, as well as for knowledge discovery in databases (KDD) and in text (KDT), is presented, followed by a brief review of Swanson's ideas and a discussion of recent approaches to hypothesis discovery and testing. 'Testing' in the context of text mining involves partially automated methods for finding evidence in the literature to support hypothetical relationships. Concluding remarks follow regarding (a) the limits of current strategies for evaluation of hypothesis discovery systems and (b) the role of literature-based discovery in concert with empirical research. A report of an informatics-driven literature review for biomarkers of systemic lupus erythematosus is also mentioned. Swanson's vision of the hidden value in the literature of science and, by extension, in biomedical digital databases, is still remarkably generative for information scientists, biologists, and physicians.
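
    Swanson's literature-based discovery follows an "ABC" pattern: if concept A co-occurs with B in one literature, and B co-occurs with C in another, but A and C never co-occur directly, then A-C is a candidate hidden hypothesis bridged by B. A toy sketch using the terms of his original fish-oil/Raynaud's discovery (the four "papers" below are fabricated term sets, not a real corpus):

```python
# Minimal sketch of Swanson's ABC model over a toy corpus of term sets.

papers = [
    {"fish oil", "blood viscosity"},          # A-B literature
    {"blood viscosity", "Raynaud's"},         # B-C literature
    {"fish oil", "platelet aggregation"},
    {"platelet aggregation", "Raynaud's"},
]

def candidate_links(papers, a, c):
    """Return bridging B-terms linking a to c, but only when a and c
    never co-occur directly (otherwise the link is already known)."""
    if any(a in p and c in p for p in papers):
        return set()
    b_from_a = set().union(*(p for p in papers if a in p)) - {a}
    b_from_c = set().union(*(p for p in papers if c in p)) - {c}
    return b_from_a & b_from_c

print(sorted(candidate_links(papers, "fish oil", "Raynaud's")))
# ['blood viscosity', 'platelet aggregation']
```

    Real systems rank thousands of such candidate B-terms statistically; the "testing" discussed in the paper is then the partially automated search for literature evidence supporting the proposed A-C link.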

  4. Graduate Training at the Interface of Computational and Experimental Biology: An Outcome Report from a Partnership of Volunteers between a University and a National Laboratory

    PubMed Central

    von Arnim, Albrecht G.; Missra, Anamika

    2017-01-01

    Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of Genome Science and Technology in partnership with the nearby Oak Ridge National Laboratory. Here, we report outcome data that attest to the program’s effectiveness in graduating computationally enabled biologists for diverse careers. Among 77 PhD graduates since 2003, the majority came with traditional degrees in the biological sciences, yet two-thirds moved into computational or hybrid (computational–experimental) positions. We describe the curriculum of the program and how it has changed. We also summarize how the program seeks to establish cohesion between computational and experimental biologists. This type of program can respond flexibly and dynamically to unmet training needs. In conclusion, this study from a flagship, state-supported university may serve as a reference point for creating a stable, degree-granting, interdepartmental graduate program in computational biology and allied areas. PMID:29167223

  5. Recruitment of Foreigners in the Market for Computer Scientists in the United States

    PubMed Central

    Bound, John; Braga, Breno; Golden, Joseph M.

    2016-01-01

    We present and calibrate a dynamic model that characterizes the labor market for computer scientists. In our model, firms can recruit computer scientists from recently graduated college students, from STEM workers working in other occupations, or from a pool of foreign talent. Counterfactual simulations suggest that wages for computer scientists would have been 2.8–3.8% higher, and the number of Americans employed as computer scientists would have been 7.0–13.6% higher, in 2004 if firms could not hire more foreigners than they could in 1994. In contrast, total CS employment would have been 3.8–9.0% lower, and consequently output smaller. PMID:27170827

  6. The conception of life in synthetic biology.

    PubMed

    Deplazes-Zemp, Anna

    2012-12-01

    The phrase 'synthetic biology' is used to describe a set of different scientific and technological disciplines, which share the objective to design and produce new life forms. This essay addresses the following questions: What conception of life stands behind this ambitious objective? In what relation does this conception of life stand to that of traditional biology and biotechnology? And, could such a conception of life raise ethical concerns? Three different observations that provide useful indications for the conception of life in synthetic biology will be discussed in detail: 1. Synthetic biologists focus on different features of living organisms in order to design new life forms, 2. Synthetic biologists want to contribute to the understanding of life, and 3. Synthetic biologists want to modify life through a rational design, which implies the notions of utilising, minimising/optimising, varying and overcoming life. These observations indicate a tight connection between science and technology, a focus on selected aspects of life, a production-oriented approach to life, and a design-oriented understanding of life. It will be argued that through this conception of life synthetic biologists present life in a different light. This conception of life will be illustrated by the metaphor of a toolbox. According to the notion of life as a toolbox, the different features of living organisms are perceived as various rationally designed instruments that can be used for the production of the living organism itself or secondary products made by the organism. According to certain ethical positions this conception of life might raise ethical concerns related to the status of the organism, the motives of the scientists and the role of technology in our society.

  7. Spatially continuous interpolation of water stage and water depths using the Everglades depth estimation network (EDEN)

    USGS Publications Warehouse

    Pearlstine, Leonard; Higer, Aaron; Palaseanu, Monica; Fujisaki, Ikuko; Mazzotti, Frank

    2007-01-01

    The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level monitoring, ground-elevation modeling, and water-surface modeling that provides scientists and managers with current (2000-present), online water-stage and water-depth information for the entire freshwater portion of the Greater Everglades. Continuous daily spatial interpolations of the EDEN network stage data are presented on a 400-square-meter grid spacing. EDEN offers a consistent and documented dataset that can be used by scientists and managers to (1) guide large-scale field operations, (2) integrate hydrologic and ecological responses, and (3) support biological and ecological assessments that measure ecosystem responses to the implementation of the Comprehensive Everglades Restoration Plan (CERP). The target users are biologists and ecologists examining trophic-level responses to hydrodynamic changes in the Everglades.
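
    The core idea of turning scattered gauge readings into a continuous gridded surface can be illustrated with simple inverse-distance weighting. EDEN's production surfaces use their own documented method, and the gauge coordinates and stages below are invented, so treat this purely as a sketch of the concept:

```python
# Illustrative IDW interpolation of water stage from gauges to a grid cell.
# Gauge positions (meters) and stage readings (meters) are hypothetical.

gauges = [((0.0, 0.0), 2.5), ((400.0, 0.0), 2.1), ((0.0, 400.0), 2.9)]

def idw_stage(x, y, gauges, power=2.0):
    """Inverse-distance-weighted stage estimate at point (x, y)."""
    num = den = 0.0
    for (gx, gy), stage in gauges:
        d2 = (x - gx) ** 2 + (y - gy) ** 2
        if d2 == 0.0:
            return stage                    # exactly at a gauge
        w = 1.0 / d2 ** (power / 2.0)
        num += w * stage
        den += w
    return num / den

# Water depth = interpolated stage minus ground elevation for the cell.
stage = idw_stage(200.0, 200.0, gauges)
depth = stage - 2.0                         # hypothetical ground elevation
print(round(stage, 3), round(depth, 3))
```

    Repeating this estimate for every 400 m cell, each day, yields the kind of continuous daily stage and depth surfaces the abstract describes.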

  8. A woman like you: Women scientists and engineers at Brookhaven National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benkovitz, Carmen; Bernholc, Nicole; Cohen, Anita

    1991-01-01

    This publication by Women in Science and Engineering introduces career possibilities in science and engineering. It describes what work and home life are like for women who have already entered these fields. Women at Brookhaven National Laboratory work in a variety of challenging research roles -- from biologist and environmental scientist to safety engineer, from patent lawyer to technician. Brookhaven National Laboratory is a multi-program laboratory which carries out basic and applied research in the physical, biomedical and environmental sciences and in selected energy technologies. The Laboratory is managed by Associated Universities, Inc., under contract with the US Department of Energy. Brookhaven and the other national laboratories, because of their enormous research resources, can play a critical role in the education and training of the workforce.

  9. A woman like you: Women scientists and engineers at Brookhaven National Laboratory. Careers in action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-31

    This publication by Women in Science and Engineering introduces career possibilities in science and engineering. It describes what work and home life are like for women who have already entered these fields. Women at Brookhaven National Laboratory work in a variety of challenging research roles -- from biologist and environmental scientist to safety engineer, from patent lawyer to technician. Brookhaven National Laboratory is a multi-program laboratory which carries out basic and applied research in the physical, biomedical and environmental sciences and in selected energy technologies. The Laboratory is managed by Associated Universities, Inc., under contract with the US Department of Energy. Brookhaven and the other national laboratories, because of their enormous research resources, can play a critical role in the education and training of the workforce.

  10. Proceedings of the 2011 Elwha River Science Symposium

    USGS Publications Warehouse

    Barbero, Kiley; Morrow, Tara; Shaffer, Anne; Duda, Jeffrey J.; Jenkins, Kurt J.; Blackie, Barbara; Lear, Cathy

    2011-01-01

    Many of the scientists working on the Elwha project have regularly met, since around 2004, for annual meetings. Loosely organized under the auspices of the Elwha Research and Elwha Nearshore consortia, the annual meetings have been informative for many reasons, including the sharing of study plans, field schedules, and preliminary results. It has been a great way for groups of physical scientists and groups of biologists to learn about the questions of interest to each group and to explore areas of overlap. In some cases, these meetings have spawned new collaborations, synergies, and research directions. In planning for the 2011 Elwha River Science Symposium, we sought to retain this esprit de corps, but realized that the start of dam removal heralded an important new phase of the project and called for an event that celebrated this special occasion.

  11. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories, integrated under the BioSPICE Dashboard, together with a methodology for continued software integration. At the center of these contributions is the BioSPICE Dashboard itself, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  12. Preface.

    PubMed

    Ditlevsen, Susanne; Lansky, Petr

    2016-06-01

    This Special Issue of Mathematical Biosciences and Engineering contains 11 selected papers presented at the Neural Coding 2014 workshop. The workshop was held in the royal city of Versailles in France, October 6-10, 2014. This was the 11th of a series of international workshops on this subject, the first held in Prague (1995), then Versailles (1997), Osaka (1999), Plymouth (2001), Aulla (2003), Marburg (2005), Montevideo (2007), Tainan (2009), Limassol (2010), and again in Prague (2012). Also selected papers from Prague were published as a special issue of Mathematical Biosciences and Engineering and in this way a tradition was started. Similarly to the previous workshops, this was a single track multidisciplinary event bringing together experimental and computational neuroscientists. The Neural Coding Workshops are traditionally biennial symposia. They are relatively small in size, interdisciplinary with major emphasis on the search for common principles in neural coding. The workshop was conceived to bring together scientists from different disciplines for an in-depth discussion of mathematical model-building and computational strategies. Further information on the meeting can be found at the NC2014 website at https://colloque6.inra.fr/neural_coding_2014. The meeting was supported by French National Institute for Agricultural Research, the world's leading institution in this field. Understanding how the brain processes information is one of the most challenging subjects in neuroscience. The papers presented in this special issue show a small corner of the huge diversity of this field, and illustrate how scientists with different backgrounds approach this vast subject. The diversity of disciplines engaged in these investigations is remarkable: biologists, mathematicians, physicists, psychologists, computer scientists, and statisticians, all have original tools and ideas by which to try to elucidate the underlying mechanisms. In this issue, emphasis is put on mathematical modeling of single neurons. A variety of problems in computational neuroscience accompanied with a rich diversity of mathematical tools and approaches are presented. We hope it will inspire and challenge the readers in their own research. We would like to thank the authors for their valuable contributions and the referees for their priceless effort of reviewing the manuscripts. Finally, we would like to thank Yang Kuang for supporting us and making this publication possible.

  13. BioImageXD: an open, general-purpose and high-throughput image-processing platform.

    PubMed

    Kankaanpää, Pasi; Paavolainen, Lassi; Tiitta, Silja; Karjalainen, Mikko; Päivärinne, Joacim; Nieminen, Jonna; Marjomäki, Varpu; Heino, Jyrki; White, Daniel J

    2012-06-28

    BioImageXD puts open-source computer science tools for three-dimensional visualization and analysis into the hands of all researchers, through a user-friendly graphical interface tuned to the needs of biologists. BioImageXD has no restrictive licenses or undisclosed algorithms and enables publication of precise, reproducible and modifiable workflows. It allows simple construction of processing pipelines and should enable biologists to perform challenging analyses of complex processes. We demonstrate its performance in a study of integrin clustering in response to selected inhibitors.

  14. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    PubMed

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

    Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (proteins, genes, RNAs...) is scattered within several data sources like SGD, Yeastract, CYGD-MIPS, BioGrid, PhosphoGrid, etc. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML- and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the query transformation needed and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specific to integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has a biologist-friendly interface that is easy to use. The system is available at http://www.khaos.uma.es/yeastmed/.
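
    The mediation idea can be sketched with in-memory stand-ins: one query fans out to several sources, and the mediator merges the non-empty answers into a single record. The source names come from the abstract, but their contents and this API are invented placeholders (YeastMed itself mediates live XML data services via SB-KOM):

```python
# Toy sketch of mediator-based integration over stand-in yeast sources.
# Source names are real databases; the records here are fabricated.

SOURCES = {
    "SGD":     {"ACT1": {"description": "actin"}},
    "BioGRID": {"ACT1": {"interactions": ["MYO2", "COF1"]}},
}

def mediated_query(gene):
    """Fan one gene query out to all sources; merge non-empty results."""
    merged = {}
    for source, data in SOURCES.items():
        record = data.get(gene)
        if record:
            merged[source] = record
    return merged

print(mediated_query("ACT1"))
```

    The value of the mediator is exactly what the abstract claims: the biologist issues one query against one interface, and never has to visit, query, and reconcile each heterogeneous source by hand.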

  15. Partly cloudy with a chance of migration: Weather, radars, and aeroecology

    USGS Publications Warehouse

    Chilson, Phillip B.; Frick, Winifred F.; Kelly, Jeffrey F.; Howard, Kenneth W.; Larkin, Ronald P.; Diehl, Robert H.; Westbrook, John K.; Kelly, T. Adam; Kunz, Thomas H.

    2012-01-01

    Aeroecology is an emerging scientific discipline that integrates atmospheric science, Earth science, geography, ecology, computer science, computational biology, and engineering to further the understanding of biological patterns and processes. The unifying concept underlying this new transdisciplinary field of study is a focus on the planetary boundary layer and lower free atmosphere (i.e., the aerosphere), and the diversity of airborne organisms that inhabit and depend on the aerosphere for their existence. Here, we focus on the role of radars and radar networks in aeroecological studies. Radar systems scanning the atmosphere are primarily used to monitor weather conditions and track the location and movements of aircraft. However, radar echoes regularly contain signals from other sources, such as airborne birds, bats, and arthropods. We briefly discuss how radar observations can be and have been used to study a variety of airborne organisms and examine some of the many potential benefits likely to arise from radar aeroecology for meteorological and biological research over a wide range of spatial and temporal scales. Radar systems are becoming increasingly sophisticated with the advent of innovative signal processing and dual-polarimetric capabilities. These capabilities should be better harnessed to promote both meteorological and aeroecological research and to explore the interface between these two broad disciplines. We strongly encourage close collaboration among meteorologists, radar scientists, biologists, and others toward developing radar products that will contribute to a better understanding of airborne fauna.

  16. [Which are the most influential journals, books and scientists in Latin American biology?].

    PubMed

    Monge-Nájera, Julián; Benavides-Varela, Catalina; Morera, Bernal

    2004-03-01

    A survey was distributed by e-mail to 553 biologists who study the Neotropics, in order to identify the journals, books and researchers with the greatest influence over Latin American biology. The biologists' database of the Revista de Biología Tropical was used to obtain their addresses. One third of them answered. The Revista de Biología Tropical is considered the most influential journal in the region. The majority of other influential journals are published in developed countries. The thematic distribution of answers, as well as independent assessments found in the literature, indicates that these and other survey results are not biased by the use of the journal's database. By subject, marine and ecological journals are the most influential. In contrast with American science, there are no researchers or books that clearly dominate the field. These results hint at the subjectivity of many awards and qualifications, and possibly reflect a lack of tradition of local scientists appearing in the mass media, the limited capacity for worldwide diffusion of local research, and the low priority of science in Iberoamerican culture. Latin American journals should improve, especially through efficient communication with authors, stringent rejection of inferior manuscripts, and widespread and timely distribution. The marked dominance by male researchers may reflect the lower number of women in the field, and social inequality. Despite the absence of "superstars", there was a correlation: most scientists in the "list of outstanding researchers" were from large countries. The publication of the most influential journal in one of the smallest countries of the region might reflect the relatively long period of existence of the Revista (half a century), the lack of other alternatives in the region and the journal's inclusion in international indices. 
Recommendations for Latin American science include a selection of the best journals to receive financial support and the establishment, with help from the mass media, of a group of selected researchers as role models for the new generations.

  17. VisRseq: R-based visual framework for analysis of sequencing data

    PubMed Central

    2015-01-01

    Background: Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results: We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto-generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. Conclusions: To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights. PMID:26328469

  18. VisRseq: R-based visual framework for analysis of sequencing data.

    PubMed

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto-generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.
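
    The "apps chained together into workflows" idea amounts to function composition over datasets. The app names, threshold, and gene counts below are invented examples (VisRseq's actual apps are graphical wrappers around R packages), but the chaining pattern is the same:

```python
# Sketch of chaining analysis "apps" into a workflow; data are toy values.
import math

def filter_low_counts(counts, min_count=10):
    """Drop genes whose read count is below the threshold."""
    return {g: c for g, c in counts.items() if c >= min_count}

def log_transform(counts):
    """log2(count + 1) transform, a common normalization step."""
    return {g: math.log2(c + 1) for g, c in counts.items()}

def chain(data, *apps):
    """Feed the output of each app into the next, in order."""
    for app in apps:
        data = app(data)
    return data

counts = {"geneA": 100, "geneB": 3, "geneC": 50}
result = chain(counts, filter_low_counts, log_transform)
print(result)
```

    Because each app consumes and produces the same kind of object, users can reorder or extend the pipeline without rewriting any step, which is what makes graphically chained workflows practical for non-programmers.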

  19. [Around biological evolution. Reflections of a physicist].

    PubMed

    Sanchez-Palencia, Evariste

    2016-01-01

    This text is the written version of a talk given at the Société de Biologie on February 17, 2016. It contains the reflections of a non-biologist scientist on general problems of biological evolution: the kind of causality involved and the ideas emerging from it, in particular the constructive and structuring character of phenomena such as predation, and the role of stability and attractors. This leads to a broader reflection on dialectics, the general framework of evolving processes, which goes beyond formal logic and instantaneous description. © Société de Biologie, 2016.

  20. Regeneration, morphogenesis and self-organization.

    PubMed

    Goldman, Daniel

    2014-07-01

    The RIKEN Center for Developmental Biology in Kobe, Japan, hosted a meeting entitled 'Regeneration of Organs: Programming and Self-Organization' in March, 2014. Scientists from across the globe met to discuss current research on regeneration, organ morphogenesis and self-organization - and the links between these fields. A diverse range of experimental models and organ systems was presented, and the speakers aptly illustrated the unique power of each. This Meeting Review describes the major advances reported and themes emerging from this exciting meeting. © 2014. Published by The Company of Biologists Ltd.

  1. Integrated interdisciplinary training in the radiological sciences.

    PubMed

    Brenner, D J; Vazquez, M; Buonanno, M; Amundson, S A; Bigelow, A W; Garty, G; Harken, A D; Hei, T K; Marino, S A; Ponnaiya, B; Randers-Pehrson, G; Xu, Y

    2014-02-01

    The radiation sciences are increasingly interdisciplinary, both from the research and the clinical perspectives. Beyond clinical and research issues, there are very real issues of communication between scientists from different disciplines. It follows that there is an increasing need for interdisciplinary training courses in the radiological sciences. Training courses are common in biomedical academic and clinical environments, but are typically targeted to scientists in specific technical fields. In the era of multidisciplinary biomedical science, there is a need for highly integrated multidisciplinary training courses that are designed for, and are useful to, scientists who come from a mix of very different academic fields and backgrounds. We briefly describe our experiences running such an integrated training course for researchers in the field of biomedical radiation microbeams, and draw some conclusions about how such interdisciplinary training courses can best function. These conclusions should be applicable to many other areas of the radiological sciences. In summary, we found that it is highly beneficial to keep the scientists from the different disciplines together. In practice, this means not segregating the training course into sections specifically for biologists and sections specifically for physicists and engineers, but rather keeping the students together to attend the same lectures and hands-on studies throughout the course. This structure added value to the learning experience not only in terms of the cross-fertilization of information and ideas between scientists from the different disciplines, but also in terms of reinforcing some basic concepts for scientists in their own discipline.

  2. T198. A SCHIZOPHRENIA-LIKE BIRTH SEASONALITY AMONG MATHEMATICIANS AND AN OPPOSITE SEASONALITY AMONG BIOLOGISTS: MORE EVIDENCE IMPLICATING BIMODAL RHYTHMS OF GENERAL BIRTHS

    PubMed Central

    Marzullo, Giovanni

    2018-01-01

    Background: Based on early-20th century births, a pre-electric illumination time of comparatively normal human exposure to sunlight, studies of schizophrenia (SCZ) found a birth seasonality with two opposite effects: a SCZ-liability peak among subjects born around late February and an equally significant SCZ-resistance peak among those born six months later, around late August. We previously investigated this rhythm in connection with a sunlight-dependent bimodal rhythm of general births that, prior to the full advent of electric lighting (but not later), occurred ubiquitously in non-equatorial parts of the world. We found that the SCZ-liability peak coincided with a first, Feb-Mar peak of general-population births (the GP1) while the SCZ-resistance peak coincided with a second, Aug-Sep peak of those births (the GP2). Moreover, in a study of hand and visual-field preferences among professional baseball players, we found the SCZ-liability, GP1-coincident seasonality among players with preferences denoting cerebral asymmetry “deficits” (CADs) and the SCZ-resistance, GP2-coincident seasonality among those with preferences denoting cerebral asymmetry “excesses.” Also, in a study suggested by associations of CADs with artistic abilities, we found the SCZ-liability, GP1-coincident seasonality among groups representing visual, performing and literary art “creators” (VPL-Artists) and the SCZ-resistance, GP2-coincident seasonality among groups representing art critics, historians, curators and other art “observers” (Para-Artists). Together, these findings suggested, as one possibility (but see later), that the SCZ-liability, CAD effects and artistic abilities could all three represent traits genetically or otherwise selected into the GP1 excess population of newborns and out of the GP2 population. The present study of “scientists” was initially aimed at the purported arts/science antithesis.
    Methods: Birth seasonalities were examined among early-20th-century-born American scientists and among yet earlier European biologists and mathematicians.
    Results: A group representing 1,925 American scientists showed the SCZ-resistance, GP2-coincident seasonality. However, this effect proved to be mostly due to biologists, because biochemists, chemists and physicists showed gradually less seasonality while mathematicians suggested an altogether artist-like, GP1-coincident seasonality. This intimation of a biologist-mathematician antithesis was pursued with an investigation of most major figures in the history of the two sciences from the 15th to the early 20th century. The two groups, numbering 576 mathematicians and 787 biologists, shared the same mean decade of birth, the 1780s, and essentially the same geographic origin in Western Europe. The mathematicians showed a very significant SCZ-liability-like, GP1-coincident seasonality while the biologists showed an even more significant SCZ-resistance-like, GP2-coincident seasonality. The latter effect was particularly strong among naturalists, anatomists and other groups representing biological “observationalism” as opposed to “experimentalism.”
    Discussion: The findings are discussed in light of a) new evidence that the annual photoperiod is indeed alone responsible for both peaks of general births, with the GP1 and the GP2 being caused by maternal periconceptional exposure to, respectively, the summer-solstice sunlight maximum and the winter-solstice minimum, and b) an approach/withdrawal theory of lateralization of basic emotions where the left cerebral cortex would handle external stimuli eliciting complacent emotions towards external realities while the right cortex would handle internal stimuli eliciting disdain for those realities.

  3. Biocellion: accelerating computer simulation of multicellular biological system models.

    PubMed

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser-resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address this computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
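    The discrete agent-based approach described above, and the cell-sorting case study in particular, can be illustrated with a deliberately tiny single-threaded sketch. This is not Biocellion's actual API; the grid representation, cell-type labels and adhesion costs below are all illustrative assumptions, showing only the core idea that cells of like type sort together when swaps that lower a boundary-adhesion energy are accepted.

```python
import random

# Illustrative adhesion costs: unlike neighbors cost more than like neighbors,
# so lowering total energy drives same-type cells to cluster (cell sorting).
ADHESION = {("A", "A"): 1, ("B", "B"): 1, ("A", "B"): 3, ("B", "A"): 3}

def energy(grid):
    """Total boundary energy: adhesion cost summed over right/down neighbor pairs."""
    n = len(grid)
    e = 0
    for i in range(n):
        for j in range(n):
            if j + 1 < n:
                e += ADHESION[(grid[i][j], grid[i][j + 1])]
            if i + 1 < n:
                e += ADHESION[(grid[i][j], grid[i + 1][j])]
    return e

def sort_cells(grid, steps=20000, seed=0):
    """Greedily swap random adjacent cells, keeping only swaps that
    do not raise the boundary energy."""
    rng = random.Random(seed)
    n = len(grid)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        di, dj = rng.choice([(0, 1), (1, 0)])
        k, l = (i + di) % n, (j + dj) % n
        before = energy(grid)
        grid[i][j], grid[k][l] = grid[k][l], grid[i][j]
        if energy(grid) > before:  # reject swaps that raise energy
            grid[i][j], grid[k][l] = grid[k][l], grid[i][j]
    return grid

if __name__ == "__main__":
    rng = random.Random(42)
    grid = [[rng.choice("AB") for _ in range(8)] for _ in range(8)]
    e0 = energy(grid)
    sort_cells(grid)
    print(e0, "->", energy(grid))  # energy never increases
```

    Recomputing the full energy at every step is what makes the naive approach expensive; frameworks like Biocellion exist precisely because realistic models need far larger grids, richer per-cell state and parallel execution.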

  4. Challenges and responsibilities for public sector scientists.

    PubMed

    Van Montagu, Marc

    2010-11-30

    Current agriculture faces the challenge of doubling food production to meet the food needs of a population expected to reach 9 billion by mid-century whilst maintaining soil and water quality and conserving biodiversity. These challenges are more overwhelming for the rural poor, who are the custodians of environmental resources and at the same time particularly vulnerable to environmental degradation. Solutions have to come from concerted actions by different segments of society in which public sector science plays a fundamental role. Public sector scientists are at the root of all the present generation of GM crop traits under cultivation and more will come with the new knowledge that is being generated by systems biology. To speed up innovation, molecular biologists must interact with scientists from the different fields as well as with stakeholders outside the academic world in order to create an environment capable of capturing value from public sector knowledge. I highlight here the measures that have to be taken urgently to guarantee that science and technology can tackle the problems of subsistence farmers. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. An IBM PC/AT-Based Image Acquisition and Processing System for Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  6. The ImageJ ecosystem: an open platform for biomedical image analysis

    PubMed Central

    Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large–but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  7. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.

  8. Considering aspects of the 3Rs principles within experimental animal biology.

    PubMed

    Sneddon, Lynne U; Halsey, Lewis G; Bury, Nic R

    2017-09-01

    The 3Rs - Replacement, Reduction and Refinement - are embedded into the legislation and guidelines governing the ethics of animal use in experiments. Here, we consider the advantages of adopting key aspects of the 3Rs into experimental biology, represented mainly by the fields of animal behaviour, neurobiology, physiology, toxicology and biomechanics. Replacing protected animals with less sentient forms or species, cells, tissues or computer modelling approaches has been broadly successful. However, many studies investigate specific models that exhibit a particular adaptation, or a species that is a target for conservation, such that their replacement is inappropriate. Regardless of the species used, refining procedures to ensure the health and well-being of animals prior to and during experiments is crucial for the integrity of the results and legitimacy of the science. Although the concepts of health and welfare are developed for model organisms, relatively little is known regarding non-traditional species that may be more ecologically relevant. Studies should reduce the number of experimental animals by employing the minimum suitable sample size. This is often calculated using power analyses, which are associated with making statistical inferences based on the P-value, yet P-values often leave scientists on shaky ground. We endorse focusing on effect sizes accompanied by confidence intervals as a more appropriate means of interpreting data; in turn, sample size could be calculated based on effect size precision. Ultimately, the appropriate employment of the 3Rs principles in experimental biology empowers scientists in justifying their research, and results in higher-quality science. © 2017. Published by The Company of Biologists Ltd.
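    The statistical recommendation above (report effect sizes with confidence intervals, and choose sample size for interval precision rather than for power) can be sketched with standard textbook formulas. This is a minimal sketch, not the authors' code: it assumes a two-group comparison, Cohen's d with a pooled standard deviation, and a normal approximation for the CI of d; the function names are our own.

```python
import math
from statistics import mean, stdev

def cohens_d(a, b):
    """Standardized mean difference between two samples, using the pooled SD."""
    na, nb = len(a), len(b)
    sp = math.sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                   / (na + nb - 2))
    return (mean(a) - mean(b)) / sp

def d_confidence_interval(d, na, nb, z=1.96):
    """Approximate 95% CI for Cohen's d (normal approximation to its SE)."""
    se = math.sqrt((na + nb) / (na * nb) + d ** 2 / (2 * (na + nb)))
    return d - z * se, d + z * se

def n_for_precision(d, half_width, z=1.96):
    """Smallest per-group n (equal groups) so the CI half-width for an
    anticipated effect size d is at most `half_width` -- sample size
    chosen for precision, not for power."""
    n = 2
    while z * math.sqrt(2 / n + d ** 2 / (4 * n)) > half_width:
        n += 1
    return n
```

    A usage example: for an anticipated d of 0.5 and a desired CI half-width of 0.25, `n_for_precision(0.5, 0.25)` returns a per-group n in the low hundreds, which makes concrete how precision targets trade off against the Reduction principle.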

  9. 2017 ISCB Accomplishment by a Senior Scientist Award: Pavel Pevzner

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) recognizes an established scientist each year with the Accomplishment by a Senior Scientist Award for significant contributions he or she has made to the field. This award honors scientists who have contributed to the advancement of computational biology and bioinformatics through their research, service, and education work. Pavel Pevzner, PhD, Ronald R. Taylor Professor of Computer Science and Director of the NIH Center for Computational Mass Spectrometry at the University of California, San Diego, has been selected as the winner of the 2017 Accomplishment by a Senior Scientist Award. The ISCB awards committee, chaired by Dr. Bonnie Berger of the Massachusetts Institute of Technology, selected Pevzner as the 2017 winner. Pevzner will receive his award and deliver a keynote address at the 2017 Intelligent Systems for Molecular Biology-European Conference on Computational Biology joint meeting (ISMB/ECCB 2017), held in Prague, Czech Republic from July 21-July 25, 2017. ISMB/ECCB is a biennial joint meeting that brings together leading scientists in computational biology and bioinformatics from around the globe. PMID:28713548

  10. Biotargeted nanomedicines for cancer: six tenets before you begin

    PubMed Central

    Goldberg, Michael S.; Hook, Sara S.; Wang, Andrew Z.; Bulte, Jeff W. M.; Patri, Anil K.; Uckun, Fatih M.; Cryns, Vincent L.; Hanes, Justin; Akin, Demir; Hall, Jennifer B.; Gharkholo, Nastaran; Mumper, Russell J.

    2013-01-01

    Biotargeted nanomedicines have captured the attention of academic and industrial scientists who have been motivated by the theoretical possibilities of the ‘magic bullet’ that was first conceptualized by Paul Ehrlich at the beginning of the 20th century. The Biotargeting Working Group, consisting of more than 50 pharmaceutical scientists, engineers, biologists and clinicians, has been formed as part of the National Cancer Institute’s Alliance for Nanotechnology in Cancer to harness collective wisdom in order to tackle conceptual and practical challenges in developing biotargeted nanomedicines for cancer. In modern science and medicine, it is impossible for any individual to be an expert in every aspect of biology, chemistry, materials science, pharmaceutics, toxicology, chemical engineering, imaging, physiology, oncology and regulatory affairs. Drawing on the expertise of leaders from each of these disciplines, this commentary highlights six tenets of biotargeted cancer nanomedicines in order to enable the translation of basic science into clinical practice. PMID:23394158

  11. Rice-Map: a new-generation rice genome browser.

    PubMed

    Wang, Jun; Kong, Lei; Zhao, Shuqi; Zhang, He; Tang, Liang; Li, Zhe; Gu, Xiaocheng; Luo, Jingchu; Gao, Ge

    2011-03-30

    The concurrent release of rice genome sequences for two subspecies (Oryza sativa L. ssp. japonica and Oryza sativa L. ssp. indica) facilitates rice studies at the whole genome level. Since the advent of high-throughput analysis, huge amounts of functional genomics data have been delivered rapidly, making an integrated online genome browser indispensable for scientists to visualize and analyze these data. Based on next-generation web technologies and high-throughput experimental data, we have developed Rice-Map, a novel genome browser for researchers to navigate, analyze and annotate rice genome interactively. More than one hundred annotation tracks (81 for japonica and 82 for indica) have been compiled and loaded into Rice-Map. These pre-computed annotations cover gene models, transcript evidences, expression profiling, epigenetic modifications, inter-species and intra-species homologies, genetic markers and other genomic features. In addition to these pre-computed tracks, registered users can interactively add comments and research notes to Rice-Map as User-Defined Annotation entries. By smoothly scrolling, dragging and zooming, users can browse various genomic features simultaneously at multiple scales. On-the-fly analysis for selected entries could be performed through dedicated bioinformatic analysis platforms such as WebLab and Galaxy. Furthermore, a BioMart-powered data warehouse "Rice Mart" is offered for advanced users to fetch bulk datasets based on complex criteria. Rice-Map delivers abundant up-to-date japonica and indica annotations, providing a valuable resource for both computational and bench biologists. Rice-Map is publicly accessible at http://www.ricemap.org/, with all data available for free downloading.

  12. Familiarity vs. Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on analyzing the relationships between familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists, in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  13. PhyloDet: a scalable visualization tool for mapping multiple traits to large evolutionary trees

    PubMed Central

    Lee, Bongshin; Nachmanson, Lev; Robertson, George; Carlson, Jonathan M.; Heckerman, David

    2009-01-01

    Summary: Evolutionary biologists are often interested in finding correlations among biological traits across a number of species, as such correlations may lead to testable hypotheses about the underlying function. Because some species are more closely related than others, computing and visualizing these correlations must be done in the context of the evolutionary tree that relates species. In this note, we introduce PhyloDet (short for PhyloDetective), an evolutionary tree visualization tool that enables biologists to visualize multiple traits mapped to the tree. Availability: http://research.microsoft.com/cue/phylodet/ Contact: bongshin@microsoft.com. PMID:19633096

  14. The Biological Connection Markup Language: a SBGN-compliant format for visualization, filtering and analysis of biological pathways

    PubMed Central

    Rizzetto, Lisa; Guedez, Damariz Rivero; Donato, Michele; Romualdi, Chiara; Draghici, Sorin; Cavalieri, Duccio

    2011-01-01

    Motivation: Many models and analyses of signaling pathways have been proposed. However, none of them takes into account that a biological pathway is not a fixed system; instead, it depends on the organism, tissue and cell type as well as on physiological, pathological and experimental conditions. Results: The Biological Connection Markup Language (BCML) is a format to describe, annotate and visualize pathways. BCML is able to store multiple types of information, permitting a selective view of the pathway as it exists and/or behaves in specific organisms, tissues and cells. Furthermore, BCML can be automatically converted into data formats suitable for analysis and into a fully SBGN-compliant graphical representation, making it an important tool that can be used by both computational biologists and ‘wet lab’ scientists. Availability and implementation: The XML schema and the BCML software suite are freely available under the LGPL for download at http://bcml.dc-atlas.net. They are implemented in Java and supported on MS Windows, Linux and OS X. Contact: duccio.cavalieri@unifi.it; sorin@wayne.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21653523

  15. The Human Genome Project: big science transforms biology and medicine.

    PubMed

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  16. Defrosting the digital library: bibliographic tools for the next generation web.

    PubMed

    Hull, Duncan; Pettifer, Steve R; Kell, Douglas B

    2008-10-01

    Many scientists now manage the bulk of their bibliographic information electronically, thereby organizing their publications and citation material from digital libraries. However, a library has been described as "thought in cold storage," and unfortunately many digital libraries can be cold, impersonal, isolated, and inaccessible places. In this Review, we discuss the current chilly state of digital libraries for the computational biologist, including PubMed, IEEE Xplore, the ACM digital library, ISI Web of Knowledge, Scopus, Citeseer, arXiv, DBLP, and Google Scholar. We illustrate the current process of using these libraries with a typical workflow, and highlight problems with managing data and metadata using URIs. We then examine a range of new applications such as Zotero, Mendeley, Mekentosj Papers, MyNCBI, CiteULike, Connotea, and HubMed that exploit the Web to make these digital libraries more personal, sociable, integrated, and accessible places. We conclude with how these applications may begin to help achieve a digital defrost, and discuss some of the issues that will help or hinder this in terms of making libraries on the Web warmer places in the future, becoming resources that are considerably more useful to both humans and machines.

  17. Defrosting the Digital Library: Bibliographic Tools for the Next Generation Web

    PubMed Central

    Hull, Duncan; Pettifer, Steve R.; Kell, Douglas B.

    2008-01-01

    Many scientists now manage the bulk of their bibliographic information electronically, thereby organizing their publications and citation material from digital libraries. However, a library has been described as “thought in cold storage,” and unfortunately many digital libraries can be cold, impersonal, isolated, and inaccessible places. In this Review, we discuss the current chilly state of digital libraries for the computational biologist, including PubMed, IEEE Xplore, the ACM digital library, ISI Web of Knowledge, Scopus, Citeseer, arXiv, DBLP, and Google Scholar. We illustrate the current process of using these libraries with a typical workflow, and highlight problems with managing data and metadata using URIs. We then examine a range of new applications such as Zotero, Mendeley, Mekentosj Papers, MyNCBI, CiteULike, Connotea, and HubMed that exploit the Web to make these digital libraries more personal, sociable, integrated, and accessible places. We conclude with how these applications may begin to help achieve a digital defrost, and discuss some of the issues that will help or hinder this in terms of making libraries on the Web warmer places in the future, becoming resources that are considerably more useful to both humans and machines. PMID:18974831

  18. The Human Genome Project: big science transforms biology and medicine

    PubMed Central

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project. PMID:24040834

  19. Mathematics and evolutionary biology make bioinformatics education comprehensible.

    PubMed

    Jungck, John R; Weisstein, Anton E

    2013-09-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes - the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software - the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a 'two-culture' problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, with curricula too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses.
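    The tree-enumeration mathematics mentioned above has a famous closed form: the number of distinct unrooted binary tree topologies on n labeled taxa is (2n-5)!! (the double factorial), and the rooted count is (2n-3)!!. A minimal sketch of both counts (the function names are our own):

```python
def unrooted_topologies(n_taxa):
    """(2n-5)!!: distinct unrooted binary tree topologies on n >= 3 labeled taxa.
    Each new taxon can attach to any existing edge, multiplying the count."""
    count = 1
    for k in range(3, n_taxa + 1):
        count *= 2 * k - 5
    return count

def rooted_topologies(n_taxa):
    """(2n-3)!!: rooted binary topologies. An unrooted tree on n taxa has
    2n-3 edges, and rooting on any of them gives a distinct rooted tree,
    so rooted counts run one 'taxon' ahead of unrooted ones."""
    count = 1
    for k in range(2, n_taxa + 1):
        count *= 2 * k - 3
    return count
```

    For example, 4 taxa give only 3 unrooted topologies, but 10 taxa already give 2,027,025 and 20 taxa roughly 2.2 x 10^20, which is why exhaustive search over tree space is infeasible and phylogenetics relies on heuristics - the kind of algorithmic limitation the abstract argues students should understand.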

  20. Mathematics and evolutionary biology make bioinformatics education comprehensible

    PubMed Central

    Weisstein, Anton E.

    2013-01-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes—the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software—the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a ‘two-culture’ problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, with curricula too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses. PMID:23821621

  1. Computer vision in cell biology.

    PubMed

    Danuser, Gaudenz

    2011-11-23

    Computer vision refers to the theory and implementation of artificial systems that extract information from images to understand their content. Although computers are widely used by cell biologists for visualization and measurement, interpretation of image content, i.e., the selection of events worth observing and the definition of what they mean in terms of cellular mechanisms, is mostly left to human intuition. This Essay attempts to outline roles computer vision may play and should play in image-based studies of cellular life. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. The essential roles of chemistry in high-throughput screening triage

    PubMed Central

    Dahlin, Jayme L; Walters, Michael A

    2015-01-01

    It is increasingly clear that academic high-throughput screening (HTS) and virtual HTS triage suffers from a lack of scientists trained in the art and science of early drug discovery chemistry. Many recent publications report the discovery of compounds by screening that are most likely artifacts or promiscuous bioactive compounds, and these results are not placed into the context of previous studies. For HTS to be most successful, it is our contention that there must exist an early partnership between biologists and medicinal chemists. Their combined skill sets are necessary to design robust assays and efficient workflows that will weed out assay artifacts, false positives, promiscuous bioactive compounds and intractable screening hits, efforts that ultimately give projects a better chance at identifying truly useful chemical matter. Expertise in medicinal chemistry, cheminformatics and purification sciences (analytical chemistry) can enhance the post-HTS triage process by quickly removing these problematic chemotypes from consideration, while simultaneously prioritizing the more promising chemical matter for follow-up testing. It is only when biologists and chemists collaborate effectively that HTS can manifest its full promise. PMID:25163000

  3. Measuring, interpreting, and responding to changes in coral reefs: A challenge for biologists, geologists, and managers

    USGS Publications Warehouse

    Rogers, Caroline S.; Miller, Jeff; Hubbard, Dennis K.; Rogers, Caroline S.; Lipps, Jere H.; Stanley, George D.

    2016-01-01

    What, exactly, is a coral reef? And how have the world’s reefs changed in the last several decades? What are the stressors undermining reef structure and function? Given the predicted effects of climate change, do reefs have a future? Is it possible to “manage” coral reefs for resilience? What can coral reef scientists contribute to improve protection and management of coral reefs? What insights can biologists and geologists provide regarding the persistence of coral reefs on a human timescale? What is reef change to a biologist… to a geologist? Clearly, there are many challenging questions. In this chapter, we present some of our thoughts on monitoring and management of coral reefs in US national parks in the Caribbean and western Atlantic based on our experience as members of monitoring teams. We reflect on the need to characterize and evaluate reefs, on how to conduct high-quality monitoring programs, and on what we can learn from biological and geological experiments and investigations. We explore the possibility that specific steps can be taken to “manage” coral reefs for greater resilience.

  4. Worthy heir or treacherous patricide? Konrad Lorenz and Jakob v. Uexküll.

    PubMed

    Mildenberger, Florian

    2005-01-01

    The biologist Jakob v. Uexküll is often seen as the preceptor of modern behavioral theory, who lastingly influenced Konrad Lorenz in particular. Nevertheless, Uexküll has been highly inadequately received by the school Lorenz founded. This neglect of Uexküll's works resulted because Lorenz and Uexküll came into contact at a time when the biological sciences were sundered by a deep ideological division. On the one side stood the Darwin-rejecting Neo-Vitalists (for example Uexküll), on the other side were the Neo-Darwinists (for example Lorenz). After Vitalism was overcome as a consequence of the Evolutionary Synthesis, Darwinists who had taken an intermittent interest in Vitalists and their theories could now only distance themselves completely from earlier ideas. This went not only for biologists and behavioral researchers, but also for medical scientists. The emancipation from the starting points of their own science was so complete that, even decades later, when the earlier debates about Mechanism and Vitalism were long since historically outdated, behavioral research never investigated its own history.

  5. Telling science’s stories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiley, H. S.

    Every biologist has been frustrated by an inability to find a specific piece of information in the literature. You are planning an experiment and you want to know whether factor X modifies the cellular response to factor Y. How do you find this information? Reference books and review articles are little help because most are supremely superficial, and any specific information they might contain is hopelessly out of date (not to mention the problem with constantly changing biological nomenclature). Online searching is only useful if the data you are looking for happens to be in the title or abstract. Unless what you’re looking for is the main subject of the paper, perusing the literature is almost hopeless. So what’s the best way to find biological information? The universal struggle that biologists undergo to find information in published papers indicates that the literature is not the actual repository of most biological knowledge. Most useful information, it seems, is not actually written down, but is passed orally between investigators. In other words, the best way to find biological information is to talk to other scientists.

  6. Biomimetics of photonic nanostructures.

    PubMed

    Parker, Andrew R; Townley, Helen E

    2007-06-01

    Biomimetics is the extraction of good design from nature. One approach to optical biomimetics focuses on the use of conventional engineering methods to make direct analogues of the reflectors and anti-reflectors found in nature. However, recent collaborations between biologists, physicists, engineers, chemists and materials scientists have ventured beyond experiments that merely mimic what happens in nature, leading to a thriving new area of research involving biomimetics through cell culture. In this new approach, the nanoengineering efficiency of living cells is harnessed and natural organisms such as diatoms and viruses are used to make nanostructures that could have commercial applications.

  7. Parts plus pipes: synthetic biology approaches to metabolic engineering

    PubMed Central

    Boyle, Patrick M.; Silver, Pamela A.

    2011-01-01

    Synthetic biologists combine modular biological “parts” to create higher-order devices. Metabolic engineers construct biological “pipes” by optimizing the microbial conversion of basic substrates to desired compounds. Many scientists work at the intersection of these two philosophies, employing synthetic devices to enhance metabolic engineering efforts. These integrated approaches promise to do more than simply improve product yields; they can expand the array of products that are tractable to produce biologically. In this review, we explore the application of synthetic biology techniques to next-generation metabolic engineering challenges, as well as the emerging engineering principles for biological design. PMID:22037345

  8. Getting under the skin of epidermal morphogenesis.

    PubMed

    Fuchs, Elaine; Raghavan, Srikala

    2002-03-01

    At the surface of the skin, the epidermis serves as the armour for the body. Scientists are now closer than ever to understanding how the epidermis accomplishes this extraordinary feat, and is able to survive and replenish itself under the harshest conditions that face any tissue. By combining genetic engineering with cell-biological studies and with human genome data analyses, skin biologists are discovering the mechanisms that underlie the development and differentiation of the epidermis and hair follicles of the skin. This explosion of knowledge paves the way for new discoveries into the genetic bases of human skin disorders and for developing new therapeutics.

  9. NASA Space Biology Research Associate Program for the 21st Century

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, Gerald

    2000-01-01

    The Space Biology Research Associate Program for the 21st Century provided a unique opportunity to train individuals to conduct biological research in hypo- and hyper-gravity, and to conduct ground-based research. This grant was developed to maximize the potential for Space Biology as an emerging discipline and to train a cadre of space biologists. The field of gravitational and space biology is rapidly growing, and the future of the field is reflected in the quality and education of its personnel. Our chief objective was to train and develop these scientists rapidly and in a cost-effective manner.

  10. Transient state kinetics tutorial using the kinetics simulation program, KINSIM.

    PubMed Central

    Wachsstock, D H; Pollard, T D

    1994-01-01

    This article provides an introduction to a computer tutorial on transient state kinetics. The tutorial uses our Macintosh version of the computer program, KINSIM, that calculates the time course of reactions. KINSIM is also available for other popular computers. This program allows even those investigators not mathematically inclined to evaluate the rate constants for the transitions between the intermediates in any reaction mechanism. These rate constants are one of the insights that are essential for understanding how biochemical processes work at the molecular level. The approach is applicable not only to enzyme reactions but also to any other type of process of interest to biophysicists, cell biologists, and molecular biologists in which concentrations change with time. In principle, the same methods could be used to characterize time-dependent, large-scale processes in ecology and evolution. Completion of the tutorial takes students 6-10 h. This investment is rewarded by a deep understanding of the principles of chemical kinetics and familiarity with the tools of kinetics simulation as an approach to solve everyday problems in the laboratory. PMID:7811941
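
    KINSIM itself numerically integrates user-supplied reaction mechanisms; as an illustration of the underlying idea only (not KINSIM's actual algorithm), here is a crude fixed-step forward-Euler time course for the classic two-step mechanism A → B → C with first-order rate constants:

```python
def simulate_a_to_b_to_c(k1, k2, a0=1.0, dt=1e-3, t_end=10.0):
    """Forward-Euler time course for A -> B -> C with first-order
    rate constants k1, k2. Returns final concentrations (a, b, c).
    A deliberately simple fixed-step integrator; production kinetics
    software uses adaptive, higher-order methods."""
    a, b, c = a0, 0.0, 0.0
    steps = int(round(t_end / dt))
    for _ in range(steps):
        da = -k1 * a * dt               # d[A]/dt = -k1[A]
        db = (k1 * a - k2 * b) * dt     # d[B]/dt = k1[A] - k2[B]
        dc = k2 * b * dt                # d[C]/dt = k2[B]
        a, b, c = a + da, b + db, c + dc
    return a, b, c
```

Because the three rate terms cancel in sum, total mass is conserved at every step, which is a useful sanity check when fitting rate constants to experimental traces.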

  11. Translating New Science Into the Drug Review Process

    PubMed Central

    Rouse, Rodney; Kruhlak, Naomi; Weaver, James; Burkhart, Keith; Patel, Vikram; Strauss, David G.

    2017-01-01

    In 2011, the US Food and Drug Administration (FDA) developed a strategic plan for regulatory science that focuses on developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of FDA-regulated products. In line with this, the Division of Applied Regulatory Science was created to move new science into the Center for Drug Evaluation and Research (CDER) review process and close the gap between scientific innovation and drug review. The Division, located in the Office of Clinical Pharmacology, is unique in that it performs mission-critical applied research and review across the translational research spectrum including in vitro and in vivo laboratory research, in silico computational modeling and informatics, and integrated clinical research covering clinical pharmacology, experimental medicine, and postmarket analyses. The Division collaborates with Offices throughout CDER, across the FDA, other government agencies, academia, and industry. The Division is able to rapidly form interdisciplinary teams of pharmacologists, biologists, chemists, computational scientists, and clinicians to respond to challenging regulatory questions for specific review issues and for longer-range projects requiring the development of predictive models, tools, and biomarkers to speed the development and regulatory evaluation of safe and effective drugs. This article reviews the Division’s recent work and future directions, highlighting development and validation of biomarkers; novel humanized animal models; translational predictive safety combining in vitro, in silico, and in vivo clinical biomarkers; chemical and biomedical informatics tools for safety predictions; novel approaches to speed the development of complex generic drugs, biosimilars, and antibiotics; and precision medicine. PMID:29568713

  12. Which Melodic Universals Emerge from Repeated Signaling Games? A Note on Lumaca and Baggio (2017) ‡.

    PubMed

    Ravignani, Andrea; Verhoef, Tessa

    2018-01-01

    Music is a peculiar human behavior, yet we still know little as to why and how music emerged. For centuries, the study of music has been the sole prerogative of the humanities. Lately, however, music is being increasingly investigated by psychologists, neuroscientists, biologists, and computer scientists. One approach to studying the origins of music is to empirically test hypotheses about the mechanisms behind this structured behavior. Recent lab experiments show how musical rhythm and melody can emerge via the process of cultural transmission. In particular, Lumaca and Baggio (2017) tested the emergence of a sound system at the boundary between music and language. In this study, participants were given random pairs of signal-meanings; when participants negotiated their meaning and played a "game of telephone" with them, these pairs became more structured and systematic. Over time, the small biases introduced in each artificial transmission step accumulated, displaying quantitative trends, including the emergence, over the course of artificial human generations, of features resembling properties of language and music. In this Note, we highlight the importance of Lumaca and Baggio's experiment, place it in the broader literature on the evolution of language and music, and suggest refinements for future experiments. We conclude that, while psychological evidence for the emergence of proto-musical features is accumulating, complementary work is needed: Mathematical modeling and computer simulations should be used to test the internal consistency of experimentally generated hypotheses and to make new predictions.

  13. Fighting for life: Religion and science in the work of fish and wildlife biologists

    NASA Astrophysics Data System (ADS)

    Geffen, Joel Phillip

    Philosophers, historians, and sociologists of science have argued that it is impossible to separate fact from value. Even so, Americans generally demand that scientists be "objective." No bias is permitted in their work. Religious motivations in particular are widely considered anathema within the halls of science. My dissertation addresses both theoretical and practical aspects concerning objectivity in science through an examination of fish and wildlife biologists. I hypothesized that they use the language of objective science as a tool to convince others to protect habitats and species. Further, I claimed that this "rhetoric of science" is employed either consciously or unconsciously on behalf of personal values, and that religious and/or spiritual values figure significantly among these. Regarding the issue's practical applications, I argued in support of Helen Longino's assertion that while subjective influences exist in science, they do not necessarily indicate that objectivity has been sacrificed. My primary methodology is ethnographic. Thirty-five biologists working in the Pacific Northwest were interviewed during the course of summer 2001. Participant ages ranged from 23 to 78. Both genders were represented, as were various ethnic and cultural backgrounds, including Native American. I used a questionnaire to guide respondents through a consistent set of open-ended queries. I organized their answers under four categories: the true, the good, the beautiful, and the holy. The first three were borrowed from the theoretical writings of philosopher Immanuel Kant. The last came from Rudolf Otto's theological work. These categories provided an excellent analytical framework. I found that the great majority of fish and wildlife biologists strive for objectivity. However, they are also informed by powerful contextual values.
These are derived from environmental ethics, aesthetic preferences pertaining to ecosystem appearance and function, and visceral experiences of connection with nature. These were blended into their practice of science to varying degrees. My hypothesis was affirmed. Science is not value-free, nor can it be. Yet, contextual values do not necessarily undermine scientific objectivity.

  14. At Home in the Cell.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1999-01-01

    Argues that biologists' understanding of the cell has become richer over the past 30 years. Describes how genetic engineering and sophisticated computer technology have provided an increased knowledge of genes, gene products, components of cells, and the structure and function of proteins. (CCM)

  15. Reproducible Bioinformatics Research for Biologists

    USDA-ARS?s Scientific Manuscript database

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  16. Phenex: ontological annotation of phenotypic diversity.

    PubMed

    Balhoff, James P; Dahdul, Wasila M; Kothari, Cartik R; Lapp, Hilmar; Lundberg, John G; Mabee, Paula; Midford, Peter E; Westerfield, Monte; Vision, Todd J

    2010-05-05

    Phenotypic differences among species have long been systematically itemized and described by biologists in the process of investigating phylogenetic relationships and trait evolution. Traditionally, these descriptions have been expressed in natural language within the context of individual journal publications or monographs. As such, this rich store of phenotype data has been largely unavailable for statistical and computational comparisons across studies or integration with other biological knowledge. Here we describe Phenex, a platform-independent desktop application designed to facilitate efficient and consistent annotation of phenotypic similarities and differences using Entity-Quality syntax, drawing on terms from community ontologies for anatomical entities, phenotypic qualities, and taxonomic names. Phenex can be configured to load only those ontologies pertinent to a taxonomic group of interest. The graphical user interface was optimized for evolutionary biologists accustomed to working with lists of taxa, characters, character states, and character-by-taxon matrices. Annotation of phenotypic data using ontologies and globally unique taxonomic identifiers will allow biologists to integrate phenotypic data from different organisms and studies, leveraging decades of work in systematics and comparative morphology.
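
    The Entity-Quality annotations described above pair an anatomical-entity term with a phenotypic-quality term for a given taxon. A minimal sketch of such a record (the field names and example values are ours, not Phenex's actual data model; real annotations use ontology term IDs rather than free text):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EQAnnotation:
    taxon: str    # globally unique taxonomic identifier
    entity: str   # anatomical-entity ontology term (e.g. an ID for "dorsal fin")
    quality: str  # phenotypic-quality ontology term (e.g. an ID for "absent")

# A hypothetical character state, "dorsal fin absent" for some taxon:
ann = EQAnnotation(taxon="TAXON:0001", entity="dorsal fin", quality="absent")
```

Because each field is a controlled-vocabulary term rather than prose, records like these can be compared and aggregated computationally across studies, which is the integration benefit the abstract describes.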

  17. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  18. Leveraging CyVerse Resources for De Novo Comparative Transcriptomics of Underserved (Non-model) Organisms

    PubMed Central

    Joyce, Blake L.; Haug-Baltzell, Asher K.; Hulvey, Jonathan P.; McCarthy, Fiona; Devisetty, Upendra Kumar; Lyons, Eric

    2017-01-01

    This workflow allows novice researchers to leverage advanced computational resources such as cloud computing to carry out pairwise comparative transcriptomics. It also serves as a primer for biologists to develop data scientist computational skills, e.g. executing bash commands, visualization and management of large data sets. All command line code and further explanations of each command or step can be found on the wiki (https://wiki.cyverse.org/wiki/x/dgGtAQ). The Discovery Environment and Atmosphere platforms are connected together through the CyVerse Data Store. As such, once the initial raw sequencing data has been uploaded there is no further need to transfer large data files over an Internet connection, minimizing the amount of time needed to conduct analyses. This protocol is designed to analyze only two experimental treatments or conditions. Differential gene expression analysis is conducted through pairwise comparisons, and will not be suitable to test multiple factors. This workflow is also designed to be manual rather than automated. Each step must be executed and investigated by the user, yielding a better understanding of data and analytical outputs, and therefore better results for the user. Once complete, this protocol will yield de novo assembled transcriptome(s) for underserved (non-model) organisms without the need to map to previously assembled reference genomes (which are usually not available for underserved organisms). These de novo transcriptomes are further used in pairwise differential gene expression analysis to investigate genes differing between two experimental conditions. Differentially expressed genes are then functionally annotated to understand the genetic response organisms have to experimental conditions. In total, the data derived from this protocol is used to test hypotheses about biological responses of underserved organisms. PMID:28518075
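
    Once per-gene expression estimates exist for the two conditions, a pairwise comparison reduces to per-gene statistics such as log2 fold change. A toy illustration of that core calculation only (our simplification; the workflow above uses dedicated differential-expression tools with proper normalization and dispersion modeling):

```python
from math import log2

def log2_fold_changes(control, treatment, pseudocount=1.0):
    """Per-gene log2 fold change (treatment over control) on
    normalized counts. The pseudocount keeps zero counts finite;
    significance testing is left to dedicated DE tools."""
    return {gene: log2((treatment[gene] + pseudocount) /
                       (control[gene] + pseudocount))
            for gene in control}
```

A positive value indicates higher expression under the treatment condition, a negative value lower; genes near zero are unchanged between the two conditions.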

  19. Programmable full-adder computations in communicating three-dimensional cell cultures.

    PubMed

    Ausländer, David; Ausländer, Simon; Pierrat, Xavier; Hellmann, Leon; Rachid, Leila; Fussenegger, Martin

    2018-01-01

    Synthetic biologists have advanced the design of trigger-inducible gene switches and their assembly into input-programmable circuits that enable engineered human cells to perform arithmetic calculations reminiscent of electronic circuits. By designing a versatile plug-and-play molecular-computation platform, we have engineered nine different cell populations with genetic programs, each of which encodes a defined computational instruction. When assembled into 3D cultures, these engineered cell consortia execute programmable multicellular full-adder logics in response to three trigger compounds.
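
    The multicellular full-adder logic realized by these cell consortia computes, in electronic terms, the standard one-bit full adder. For reference, its Boolean definition, which the three trigger compounds play the role of inputs to:

```python
def full_adder(a, b, cin):
    """One-bit full adder: sum = a XOR b XOR cin,
    carry-out = majority(a, b, cin)."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout
```

For every input combination, the pair (carry, sum) is the two-bit binary count of how many inputs are 1, which is exactly the behavior the engineered 3D cultures reproduce with trigger compounds.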

  20. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connect the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug-in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments on new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the climate@home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. 
Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on science aspects, technology aspects, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.

  1. Basic instincts

    NASA Astrophysics Data System (ADS)

    Hutson, Matthew

    2018-05-01

    In their adaptability, young children demonstrate common sense, a kind of intelligence that, so far, computer scientists have struggled to reproduce. Gary Marcus, a developmental cognitive scientist at New York University in New York City, believes the field of artificial intelligence (AI) would do well to learn lessons from young thinkers. Researchers in machine learning argue that computers trained on mountains of data can learn just about anything—including common sense—with few, if any, programmed rules. But Marcus says computer scientists are ignoring decades of work in the cognitive sciences and developmental psychology showing that humans have innate abilities—programmed instincts that appear at birth or in early childhood—that help us think abstractly and flexibly. He believes AI researchers ought to include such instincts in their programs. Yet many computer scientists, riding high on the successes of machine learning, are eagerly exploring the limits of what a naïve AI can do. Computer scientists appreciate simplicity and have an aversion to debugging complex code. Furthermore, big companies such as Facebook and Google are pushing AI in this direction. These companies are most interested in narrowly defined, near-term problems, such as web search and facial recognition, in which blank-slate AI systems can be trained on vast data sets and work remarkably well. But in the longer term, computer scientists expect AIs to take on much tougher tasks that require flexibility and common sense. They want to create chatbots that explain the news, autonomous taxis that can handle chaotic city traffic, and robots that nurse the elderly. Some computer scientists are already trying. Such efforts, researchers hope, will result in AIs that sit somewhere between pure machine learning and pure instinct. They will boot up following some embedded rules, but will also learn as they go.

  2. A computational framework to detect normal and tuberculosis infected lung from H and E-stained whole slide images

    NASA Astrophysics Data System (ADS)

    Niazi, M. Khalid Khan; Beamer, Gillian; Gurcan, Metin N.

    2017-03-01

    Accurate detection and quantification of normal lung tissue in the context of Mycobacterium tuberculosis infection is of interest from a biological perspective. The automatic detection and quantification of normal lung will allow biologists to focus more intensely on regions of interest within normal and infected tissues. We present a computational framework to extract individual tissue sections from whole slide images having multiple tissue sections. It automatically detects the background, red blood cells and handwritten digits to bring efficiency as well as accuracy in quantification of tissue sections. For efficiency, we model our framework with logical and morphological operations as they can be performed in linear time. We further divide these individual tissue sections into normal and infected areas using a deep neural network. The computational framework was trained on 60 whole slide images. The proposed computational framework resulted in an overall accuracy of 99.2% when extracting individual tissue sections from 120 whole slide images in the test dataset. The framework resulted in a relatively higher accuracy (99.7%) while classifying individual lung sections into normal and infected areas. Our preliminary findings suggest that the proposed framework has good agreement with biologists on how to define normal and infected lung areas.

  3. Using creation science to demonstrate evolution: application of a creationist method for visualizing gaps in the fossil record to a phylogenetic study of coelurosaurian dinosaurs.

    PubMed

    Senter, P

    2010-08-01

    It is important to demonstrate evolutionary principles in such a way that they cannot be countered by creation science. One such way is to use creation science itself to demonstrate evolutionary principles. Some creation scientists use classic multidimensional scaling (CMDS) to quantify and visualize morphological gaps or continuity between taxa, accepting gaps as evidence of independent creation and accepting continuity as evidence of genetic relatedness. Here, I apply CMDS to a phylogenetic analysis of coelurosaurian dinosaurs and show that it reveals morphological continuity between Archaeopteryx, other early birds, and a wide range of nonavian coelurosaurs. Creation scientists who use CMDS must therefore accept that these animals are genetically related. Other uses of CMDS for evolutionary biologists include the identification of taxa with much missing evolutionary history and the tracing of the progressive filling of morphological gaps in the fossil record through successive years of discovery.
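
    Classic multidimensional scaling recovers point coordinates from a matrix of pairwise distances by double-centering the squared distances and taking leading eigenvectors. A dependency-free sketch that recovers just the first axis via power iteration (our simplification for illustration; real analyses use a full eigendecomposition and assume the distance data are approximately Euclidean):

```python
def cmds_first_axis(dist, iters=200):
    """First principal coordinate of classical MDS.

    dist: symmetric list-of-lists of pairwise distances.
    Double-centers the squared distances (B = -1/2 * J D^2 J) and
    extracts the dominant eigenvector of B by power iteration."""
    n = len(dist)
    d2 = [[dist[i][j] ** 2 for j in range(n)] for i in range(n)]
    row = [sum(d2[i]) / n for i in range(n)]
    tot = sum(row) / n
    b = [[-0.5 * (d2[i][j] - row[i] - row[j] + tot) for j in range(n)]
         for i in range(n)]
    # Power iteration for the dominant eigenvector of B
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = [sum(b[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * b[i][j] * v[j] for i in range(n) for j in range(n))
    return [lam ** 0.5 * x for x in v]  # coordinates on the first axis
```

Plotting taxa on such axes is what makes morphological gaps or continuity visible: tight clusters separated by empty space suggest gaps, while an unbroken spread of points suggests the continuity the analysis above reports between Archaeopteryx and nonavian coelurosaurs.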

  4. Sociophysiology 25 years ago: early perspectives of an emerging discipline now part of social neuroscience.

    PubMed

    Barchas, Patricia R; Barchas, Jack D

    2011-08-01

    Sociophysiology was a term used early in the history of sociology and then again 25 years ago to describe interactions between the "social" and the "biological" worlds. Social scientists had largely viewed biology and the brain as a "black box" that was not an active aspect of their work or theories. A landmark, unpublished conference in 1986 brought together social scientists and biologists dedicated to the idea that bringing sociological conceptualizations and approaches together with those of physiology might create new ways to understand human behavior. The umbrella question for sociophysiology was dual: how do social processes impact the physiology of the organism, and how does that altered physiology affect future social behavior? This paper summarizes that conference with the goal of providing a glimpse into the early history of social neuroscience and to demonstrate the variety of individuals and interests that were present at the emergence of this new field. The late Patricia R. Barchas organized and chaired the conference. © 2011 New York Academy of Sciences.

  5. The Genome 10K Project: a way forward.

    PubMed

    Koepfli, Klaus-Peter; Paten, Benedict; O'Brien, Stephen J

    2015-01-01

    The Genome 10K Project was established in 2009 by a consortium of biologists and genome scientists determined to facilitate the sequencing and analysis of the complete genomes of 10,000 vertebrate species. Since then, the number of selected species with genomes sequenced or under way with funding has risen from ∼26 to 277, an approximately tenfold increase in five years. Here we summarize the advances and commitments that have occurred by mid-2014 and outline the achievements and present challenges of reaching the 10,000-species goal. We summarize the status of known vertebrate genome projects, recommend standards for pronouncing a genome as sequenced or completed, and provide our present and future vision of the landscape of Genome 10K. The endeavor is ambitious, bold, expensive, and uncertain, but together the Genome 10K Consortium of Scientists and the worldwide genomics community are moving toward their goal of delivering to the coming generation the gift of genome empowerment for many vertebrate species.

  6. Biological and Organic Chemical Decomposition of Silicates. Chapter 7.2

    NASA Technical Reports Server (NTRS)

    Silverman, M. P.

    1979-01-01

    The weathering of silicate rocks and minerals, an important concern of geologists and geochemists for many years, traditionally has been approached from strictly physical and chemical points of view. Biological effects were either unrecognized, ignored, or were mentioned in passing to account for such phenomena as the accumulation of organic matter in sediments or the generation of reducing environments. A major exception occurred in soil science where agricultural scientists, studying the factors important in the development of soils and their ability to nourish and sustain various crops, laid the foundation for much of what is known of the biological breakdown of silicate rocks and minerals. The advent of the space age accelerated the realization that many environmental problems and geochemical processes on Earth can only be understood in terms of ecosystems. This, in turn, spurred renewed interest and activity among modern biologists, geologists and soil scientists attempting to unravel the intimate relations between biology and the weathering of silicate rocks and minerals of the earth's surface.

  7. Live from Antarctica, Volume 4

    NASA Technical Reports Server (NTRS)

    1994-01-01

    In this fourth video of a four part 'Passport to Knowledge Special', hosted by Camille Moody Jennings from Maryland Public Television, children from Maryland and Alaska public schools had the opportunity to directly interact with and ask questions of scientists and researchers from the Antarctic, and learn about the different geological and meteorological research going on in the Antarctic and McMurdo Base at McMurdo Sound. The scientists questioned included: Donal Manahan (biologist from Un. of So. California), who described some of the geological features from Hut Point, the historic hut built by Capt. Scott in 1902; Sridar Anandakrishnan (Penn State Un.) whose research includes ice plate movement of the central ice sheet and earthquakes and how they affect the sheet; and Lt. j.g. Kate McNitt, who spends her winters investigating the trace gases, aerosols, CFC's and ozone levels over the Antarctic area that are affecting the seasonal ozone hole that appears in that region. Historical film footage of Capt. Scott's exploration of the Antarctic is included.

  8. Live from Antarctica, volume 4

    NASA Astrophysics Data System (ADS)

    In this fourth video of a four part 'Passport to Knowledge Special', hosted by Camille Moody Jennings from Maryland Public Television, children from Maryland and Alaska public schools had the opportunity to directly interact with and ask questions of scientists and researchers from the Antarctic, and learn about the different geological and meteorological research going on in the Antarctic and McMurdo Base at McMurdo Sound. The scientists questioned included: Donal Manahan (biologist from Un. of So. California), who described some of the geological features from Hut Point, the historic hut built by Capt. Scott in 1902; Sridar Anandakrishnan (Penn State Un.) whose research includes ice plate movement of the central ice sheet and earthquakes and how they affect the sheet; and Lt. j.g. Kate McNitt, who spends her winters investigating the trace gases, aerosols, CFC's and ozone levels over the Antarctic area that are affecting the seasonal ozone hole that appears in that region. Historical film footage of Capt. Scott's exploration of the Antarctic is included.

  9. A man of his time: thorstein veblen and the university of chicago darwinists.

    PubMed

    Raymer, Emilie J

    2013-01-01

    The Darwinian economic theory that Thorstein Veblen proposed and refined while he served as a professor of Political Economy at the University of Chicago from 1891 to 1906 should be assessed in the context of the community of Darwinian scientists and social scientists with whom Veblen worked and lived at Chicago. It is important to identify Veblen as a member of this broad community of Darwinian-inclined philosophers, physiologists, geologists, astronomers, and biologists at Chicago because Veblen's involvement with this circle suggests that the possible sources of his engagement with Darwinism extend beyond the pragmatists and Continental socialists to whom scholars have typically ascribed Veblen's Darwinian roots. Additionally, that an extensive community continued to use Darwinian evolutionary theory to construct new models of scientific and social scientific analysis at the turn of the twentieth century, a period during which Darwinism was purportedly in decline, suggests that the "eclipse of Darwinism" narrative has been overstated in literature about Darwinism's intellectual arc.

  10. Biological and Organic Chemical Decomposition of Silicates. Chapter 7.2

    NASA Technical Reports Server (NTRS)

    Silverman, M. P.

    1979-01-01

    The weathering of silicate rocks and minerals, an important concern of geologists and geochemists for many years, traditionally has been approached from strictly physical and chemical points of view. Biological effects were either unrecognized, ignored, or were mentioned in passing to account for such phenomena as the accumulation of organic matter in sediments or the generation of reducing environments. A major exception occurred in soil science where agricultural scientists, studying the factors important in the development of soils and their ability to nourish and sustain various crops, laid the foundation for much of what is known of the biological breakdown of silicate rocks and minerals. The advent of the space age accelerated the realization that many environmental problems and geochemical processes on Earth can only be understood in terms of ecosystems. This, in turn, spurred renewed interest and activity among modern biologists, geologists and soil scientists attempting to unravel the intimate relations between biology and the weathering of silicate rocks and minerals of the earth's surface.

  11. The Genome 10K Project: A Way Forward

    PubMed Central

    Koepfli, Klaus-Peter; Paten, Benedict; O’Brien, Stephen J.

    2017-01-01

    The Genome 10K Project was established in 2009 by a consortium of biologists and genome scientists determined to facilitate the sequencing and analysis of the complete genomes of 10,000 vertebrate species. Since then, the number of selected species with genomes sequenced or under way with funding has risen from ~26 to 277, an approximately tenfold increase in five years. Here we summarize the advances and commitments that have occurred by mid-2014 and outline the achievements and present challenges of reaching the 10,000-species goal. We summarize the status of known vertebrate genome projects, recommend standards for pronouncing a genome as sequenced or completed, and provide our present and future vision of the landscape of Genome 10K. The endeavor is ambitious, bold, expensive, and uncertain, but together the Genome 10K Consortium of Scientists and the worldwide genomics community are moving toward their goal of delivering to the coming generation the gift of genome empowerment for many vertebrate species. PMID:25689317

  12. Scientist-teacher collaboration: Integration of real data from a coastal wetland into a high school life science ecology-based research project

    NASA Astrophysics Data System (ADS)

    Hagan, Wendy L.

    Project G.R.O.W. is an ecology-based research project developed for high school biology students. The curriculum was designed based on how students learn and on awareness of the nature of science and scientific practices, so that students would design and carry out scientific investigations using real data from a local coastal wetland. This was a scientist-teacher collaboration between a CSULB biologist and a high school biology teacher. Prior to implementing the three-week research project, students had multiple opportunities to practice building requisite skills via 55 lessons focusing on the nature of science, scientific practices, technology, Common Core State Standards of reading, writing, listening and speaking, and Next Generation Science Standards. Project G.R.O.W. culminated with student-generated research papers and oral presentations. Outcomes reveal that students struggle with constructing explanations and with the use of Excel to create meaningful graphs. They showed gains in data organization, analysis, teamwork and aspects of the nature of science.

  13. Sonoran Desert: Fragile Land of Extremes

    USGS Publications Warehouse

    Produced and Directed by Wessells, Stephen

    2003-01-01

    'Sonoran Desert: Fragile Land of Extremes' shows how biologists with the U.S. Geological Survey work with other scientists in an effort to better understand native plants and animals such as desert tortoises, saguaro cacti, and Gila monsters. Much of the program was shot in and around Saguaro National Park near Tucson, Arizona. Genetic detective work, using DNA, focuses on understanding the lives of tortoises. Studies of saguaros over many decades clarify how these amazing plants reproduce and thrive in the desert. Threats from fire, diseases in tortoises, and a growing human population motivate the scientists. Their work to identify how these organisms live and survive is a crucial step for the sound management of biological resources on public lands. This 28-minute program, USGS Open-File Report 03-305, was shot entirely in high definition video and produced by the USGS Western Ecological Research Center and Southwest Biological Science Center; produced and directed by Stephen Wessells, Western Region Office of Communications.

  14. Self-regulation of recombinant DNA technology in Japan in the 1970s.

    PubMed

    Nagai, Hiroyuki; Nukaga, Yoshio; Saeki, Koji; Akabayashi, Akira

    2009-07-01

    Recombinant DNA technology was developed in the United States in the early 1970s. Leading scientists held an international Asilomar Conference in 1975 to examine the self-regulation of recombinant DNA technology, followed by the U.S. National Institutes of Health drafting the Recombinant DNA Research Guidelines in 1976. The result of this conference significantly affected many nations, including Japan. However, there have been few historical studies on the self-regulation of recombinant technologies conducted by scientists and government officials in Japan. The purpose of this paper is to analyze how the Science Council of Japan, the Ministry of Education, Science and Culture, and the Science and Technology Agency developed self-regulation policies for recombinant DNA technology in Japan in the 1970s. Groups of molecular biologists and geneticists played a key role in establishing guidelines in cooperation with government officials. Our findings suggest that self-regulation policies on recombinant DNA technology have influenced safety management for the life sciences and the establishment of institutions for review in Japan.

  15. Research to knowledge: promoting the training of physician-scientists in the biology of pregnancy.

    PubMed

    Sadovsky, Yoel; Caughey, Aaron B; DiVito, Michelle; D'Alton, Mary E; Murtha, Amy P

    2018-01-01

    Common disorders of pregnancy, such as preeclampsia, preterm birth, and fetal growth abnormalities, continue to challenge perinatal biologists seeking insights into disease pathogenesis that will result in better diagnosis, therapy, and disease prevention. These challenges have recently been intensified with discoveries that associate gestational diseases with long-term maternal and neonatal outcomes. Whereas modern high-throughput investigative tools enable scientists and clinicians to noninvasively probe the maternal-fetal genome, epigenome, and other analytes, their implications for clinical medicine remain uncertain. Bridging these knowledge gaps depends on strengthening the existing pool of scientists with expertise in basic, translational, and clinical tools to address pertinent questions in the biology of pregnancy. Although PhD researchers are critical in this quest, physician-scientists would facilitate the inquiry by bringing together clinical challenges and investigative tools, promoting a culture of intellectual curiosity among clinical providers, and helping transform discoveries into relevant knowledge and clinical solutions. Uncertainties related to future administration of health care, federal support for research, attrition of physician-scientists, and an inadequate supply of new scholars may jeopardize our ability to address these challenges. New initiatives are necessary to attract current scholars and future generations of researchers seeking expertise in the scientific method and to support them, through mentorship and guidance, in pursuing a career that combines scientific investigation with clinical medicine. These efforts will promote breadth and depth of inquiry into the biology of pregnancy and enhance the pace of translation of scientific discoveries into better medicine and disease prevention. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Scanning Electron Microscopy and X-Ray Microanalysis

    NASA Astrophysics Data System (ADS)

    Albee, Arden L.

    This outstanding volume has managed the nearly impossible task of combining the expertise of all six authors in a lucid and homogeneous style of writing. Subtitled ‘A Text for Biologists, Material Scientists and Geologists,’ the book has evolved from a short course taught each summer at Lehigh University. The book provides a basic knowledge of (1) the electron optics for these instruments and their controls, (2) the characteristics of the electron beam-sample interactions, (3) image formation and interpretation, (4) X-ray spectrometry and quantitative X-ray microanalysis, with separate detailed sections on wavelength-dispersive and energy-dispersive techniques, and (5) specimen preparation, especially for biological materials.

  17. Molecular clocks.

    PubMed

    Lee, Michael S Y; Ho, Simon Y W

    2016-05-23

    In the 1960s, several groups of scientists, including Emile Zuckerkandl and Linus Pauling, had noted that proteins experience amino acid replacements at a surprisingly consistent rate across very different species. This presumed single, uniform rate of genetic evolution was subsequently described using the term 'molecular clock'. Biologists quickly realised that such a universal pacemaker could be used as a yardstick for measuring the timescale of evolutionary divergences: estimating the rate of amino acid exchanges per unit of time and applying it to protein differences across a range of organisms would allow deduction of the divergence times of their respective lineages (Figure 1). Copyright © 2016 Elsevier Ltd. All rights reserved.
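    The yardstick logic described above reduces to simple arithmetic: divergence accumulates along both lineages since their split, so time is divergence over twice the per-lineage rate. A toy calculation, with both numbers invented for illustration:

```python
# Invented example values: a clock rate of 1e-9 substitutions per site
# per year (per lineage) and 2% sequence divergence between two species.
rate = 1e-9
divergence = 0.02

# Differences accumulate along BOTH lineages after the split,
# so divergence time t = d / (2 * r).
t = divergence / (2 * rate)
print(t)  # 10 million years since the lineages diverged
```

    Real molecular-clock analyses must additionally account for rate variation among lineages, which is why the "single, uniform rate" assumption was later relaxed.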

  18. Literary study and evolutionary theory : A review essay.

    PubMed

    Carroll, J

    1998-09-01

    Several recent books have claimed to integrate literary study with evolutionary biology. All of the books here considered, except Robert Storey's, adopt conceptions of evolutionary theory that are in some way marginal to the Darwinian adaptationist program. All the works attempt to connect evolutionary study with various other disciplines or methodologies: for example, with cultural anthropology, cognitive psychology, the psychology of emotion, neurobiology, chaos theory, or structuralist linguistics. No empirical paradigm has yet been established for this field, but important steps have been taken, especially by Storey, in formulating basic principles, identifying appropriate disciplinary connections, and marking out lines of inquiry. Reciprocal efforts are needed from biologists and social scientists.

  19. First person - Chih-Wen Chu.

    PubMed

    2018-05-16

    First Person is a series of interviews with the first authors of a selection of papers published in Journal of Cell Science, helping early-career researchers promote themselves alongside their papers. Chih-Wen Chu is the first author on 'The Ajuba family protein Wtip regulates actomyosin contractility during vertebrate neural tube closure', published in Journal of Cell Science. Chih-Wen is an associate scientist in the lab of Sergei Sokol at Icahn School of Medicine at Mount Sinai, New York, USA, investigating apical constriction and planar cell polarity, with a focus on protein dynamics at the cell junctions. © 2018. Published by The Company of Biologists Ltd.

  20. Fibre optic microarrays.

    PubMed

    Walt, David R

    2010-01-01

    This tutorial review describes how fibre optic microarrays can be used to create a variety of sensing and measurement systems. This review covers the basics of optical fibres and arrays, the different microarray architectures, and describes a multitude of applications. Such arrays enable multiplexed sensing for a variety of analytes including nucleic acids, vapours, and biomolecules. Polymer-coated fibre arrays can be used for measuring microscopic chemical phenomena, such as corrosion and localized release of biochemicals from cells. In addition, these microarrays can serve as a substrate for fundamental studies of single molecules and single cells. The review covers topics of interest to chemists, biologists, materials scientists, and engineers.

  1. Rebuilding a broken heart: lessons from developmental and regenerative biology.

    PubMed

    Kuyumcu-Martinez, Muge N; Bressan, Michael C

    2016-11-01

    In May 2016, the annual Weinstein Cardiovascular Development and Regeneration Conference was held in Durham, North Carolina, USA. The meeting assembled leading investigators, junior scientists and trainees from around the world to discuss developmental and regenerative biological approaches to understanding the etiology of congenital heart defects and the repair of diseased cardiac tissue. In this Meeting Review, we present several of the major themes that were discussed throughout the meeting and highlight the depth and range of research currently being performed to uncover the causes of human cardiac diseases and develop potential therapies. © 2016. Published by The Company of Biologists Ltd.

  2. Facilities | Computational Science | NREL

    Science.gov Websites

    NREL's computational science facilities support technology innovation by providing scientists and engineers the ability to tackle energy challenges, and enable them to take full advantage of advanced computing hardware and software resources.

  3. Approximations, idealizations and 'experiments' at the physics-biology interface.

    PubMed

    Rowbottom, Darrell P

    2011-06-01

    This paper, which is based on recent empirical research at the University of Leeds, the University of Edinburgh, and the University of Bristol, presents two difficulties which arise when condensed matter physicists interact with molecular biologists: (1) the former use models which appear to be too coarse-grained, approximate and/or idealized to serve a useful scientific purpose to the latter; and (2) the latter have a rather narrower view of what counts as an experiment, particularly when it comes to computer simulations, than the former. It argues that these findings are related; that computer simulations are considered to be undeserving of experimental status, by molecular biologists, precisely because of the idealizations and approximations that they involve. The complexity of biological systems is a key factor. The paper concludes by critically examining whether the new research programme of 'systems biology' offers a genuine alternative to the modelling strategies used by physicists. It argues that it does not. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. How can we improve problem solving in undergraduate biology? Applying lessons from 30 years of physics education research.

    PubMed

    Hoskinson, A-M; Caballero, M D; Knight, J K

    2013-06-01

    If students are to successfully grapple with authentic, complex biological problems as scientists and citizens, they need practice solving such problems during their undergraduate years. Physics education researchers have investigated student problem solving for the past three decades. Although physics and biology problems differ in structure and content, the instructional purposes align closely: explaining patterns and processes in the natural world and making predictions about physical and biological systems. In this paper, we discuss how research-supported approaches developed by physics education researchers can be adopted by biologists to enhance student problem-solving skills. First, we compare the problems that biology students are typically asked to solve with authentic, complex problems. We then describe the development of research-validated physics curricula emphasizing process skills in problem solving. We show that solving authentic, complex biology problems requires many of the same skills that practicing physicists and biologists use in representing problems, seeking relationships, making predictions, and verifying or checking solutions. We assert that acquiring these skills can help biology students become competent problem solvers. Finally, we propose how biology scholars can apply lessons from physics education in their classrooms and inspire new studies in biology education research.

  5. Meeting report: a hard look at the state of enamel research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Ophir D.; Duverger, Olivier; Shaw, Wendy

    Enamel is a principal component of the dentition, and defects in this hard tissue are associated with a wide variety of diseases. To assess the state of the field of enamel research, the National Institute of Dental and Craniofacial Research (NIDCR) convened the “Encouraging Novel Amelogenesis Models and Ex vivo cell Lines (ENAMEL) Development” workshop at its Bethesda headquarters on 23 June 2017. Enamel formation involves complex developmental stages and cellular differentiation mechanisms that are summarized in Figure 1. The meeting, which was organized by Jason Wan from NIDCR, had three sessions: model organisms, stem cells/cell lines, and tissues/3D cell culture/organoids. In attendance were investigators interested in enamel from a broad range of disciplines as well as NIDCR leadership and staff. The meeting brought together developmental biologists, cell biologists, human geneticists, materials scientists, and clinical researchers from across the United States to discuss recent progress and future challenges in our understanding of the formation and function of enamel. Lively discussions took place throughout the day, and this meeting report highlights some of the major findings and ideas that emerged during the workshop.

  6. Developing a Computational Environment for Coupling MOR Data, Maps, and Models: The Virtual Research Vessel (VRV) Prototype

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.

    2001-12-01

    The East Pacific Rise (EPR) from 9-10deg. N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but also for linkage of disparate data sets (data to data) as well as of data to models in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge.
    Work in progress includes efforts to develop: (1) virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web-query across different marine geology data sets, and an analogous declarative (database available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., ability to run tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.

  7. An Analysis of Computer-Mediated Communication between Middle School Students and Scientist Role Models: A Pilot Study.

    ERIC Educational Resources Information Center

    Murfin, Brian

    1994-01-01

    Reports on a study of the effectiveness of computer-mediated communication (CMC) in providing African American and female middle school students with scientist role models. Quantitative and qualitative data gathered by analyzing messages students and scientists posted on a shared electronic bulletin board showed that CMC could be an effective…

  8. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  9. S-MART, a software toolbox to aid RNA-Seq data analysis.

    PubMed

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs, which have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the entire biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for Gb of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses for their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci.

  10. S-MART, A Software Toolbox to Aid RNA-seq Data Analysis

    PubMed Central

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs, which have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the entire biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for Gb of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses for their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci. PMID:21998740

  11. An accessible method for implementing hierarchical models with spatio-temporal abundance data

    USGS Publications Warehouse

    Ross, Beth E.; Hooten, Melvin B.; Koons, David N.

    2012-01-01

    A common goal in ecology and wildlife management is to determine the causes of variation in population dynamics over long periods of time and across large spatial scales. Several challenges must nevertheless be overcome to make appropriate inferences about spatio-temporal variation in population dynamics, such as autocorrelation among data points, excess zeros, and observation error in count data. To address these issues, many scientists and statisticians have recommended the use of Bayesian hierarchical models. Unfortunately, hierarchical statistical models remain somewhat difficult to use because of the quantitative background needed to implement them, or because of the computational demands of using Markov chain Monte Carlo algorithms to estimate parameters. Fortunately, new tools have recently been developed that make it more feasible for wildlife biologists to fit sophisticated hierarchical Bayesian models (i.e., Integrated Nested Laplace Approximation, ‘INLA’). We present a case study using two important game species in North America, the lesser and greater scaup, to demonstrate how INLA can be used to estimate the parameters in a hierarchical model that decouples observation error from process variation, and accounts for unknown sources of excess zeros as well as spatial and temporal dependence in the data. Ultimately, our goal was to make unbiased inference about spatial variation in population trends over time.
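    The excess-zeros issue this record's hierarchical model addresses can be illustrated with a minimal sketch. This is not the authors' INLA implementation, and the model form below is an assumption chosen for illustration: a zero-inflated Poisson likelihood that mixes structural zeros with an underlying abundance process.

```python
import numpy as np
from scipy.stats import poisson

def zip_log_likelihood(counts, lam, pi):
    """Log-likelihood of a zero-inflated Poisson (ZIP) model.

    pi  -- probability of a "structural" zero (e.g., an unoccupied site)
    lam -- Poisson mean of the underlying abundance process
    """
    counts = np.asarray(counts)
    # A zero can come either from the structural-zero state or from
    # the Poisson process itself, so the two probabilities are mixed.
    log_p_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))
    # Non-zero counts must come from the Poisson component.
    ll = np.where(counts == 0,
                  log_p_zero,
                  np.log(1.0 - pi) + poisson.logpmf(counts, lam))
    return float(ll.sum())

print(zip_log_likelihood([0, 0, 3, 1, 0], lam=1.2, pi=0.3))
```

    A full fit in the spirit of the record would couple an observation model like this with latent spatial and temporal random effects; the sketch shows only the observation-level idea.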

  12. Role of passive deformation on propulsion through a lumped torsional flexibility model

    NASA Astrophysics Data System (ADS)

    Arora, Nipun; Gupta, Amit

    2016-11-01

    Scientists and biologists have been engaged in a deeper examination of insect flight to develop an improved understanding of the role of flexibility in aerodynamic performance. Here, we mimic a flapping wing through a fluid-structure interaction framework based upon a lumped torsional flexibility model. The developed fluid and structural solvers together determine the aerodynamic forces and wing deformation, respectively. An analytical solution to the simplified single-spring structural dynamics equation is established to substantiate the simulations. It is revealed that the dynamics of structural deformation is governed by the balance between inertia, stiffness, and aerodynamics, where the former two oscillate at the plunging frequency and the latter oscillates at twice the plunging frequency. We demonstrate that an induced phase difference between plunging and passive pitching is responsible for a higher thrust coefficient. This phase difference is also shown to depend on the aerodynamics-to-inertia and natural-to-plunging frequency ratios. For inertia-dominated flows, pitching and plunging always remain in phase. As aerodynamics dominates, a large phase difference is induced, which accounts for a large passive deformation and higher thrust. The authors acknowledge the financial support received from the Aeronautics Research and Development Board (ARDB) under SIGMA Project No. 1705 and thank the IIT Delhi HPC facility for computational resources.

  13. An improved Pearson's correlation proximity-based hierarchical clustering for mining biological association between genes.

    PubMed

    Booma, P M; Prabhakaran, S; Dhanalakshmi, R

    2014-01-01

    Microarray gene expression datasets have attracted great interest among molecular biologists, statisticians, and computer scientists. Data mining that extracts hidden and useful information from datasets often fails to identify the most significant biological associations between genes. A heuristic search for standard biological processes measures only the gene expression level, threshold, and response time. Heuristic search identifies and mines the best biological solution, but the association process is not efficiently addressed. To monitor higher rates of expression between genes, a hierarchical clustering model is proposed in which the biological association between genes is measured simultaneously using a proximity measure of improved Pearson's correlation (PCPHC). Additionally, the Seed Augment algorithm adopts average-linkage methods on rows and columns in order to expand a seed PCPHC model into a maximal global PCPHC (GL-PCPHC) model and to identify associations between the clusters. Moreover, GL-PCPHC applies a pattern-growing method to mine the PCPHC patterns. Compared to existing gene expression analyses, the PCPHC model achieves better performance. Experimental evaluations are conducted for the GL-PCPHC model with standard benchmark gene expression datasets extracted from the UCI repository and the GenBank database in terms of execution time, pattern size, significance level, biological association efficiency, and pattern quality.
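    The building blocks this record names, Pearson's correlation as a gene-to-gene proximity measure and average-linkage hierarchical clustering, are standard and can be sketched with SciPy. This is not the PCPHC/GL-PCPHC algorithm itself, only the baseline it extends, run on hypothetical toy data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy expression matrix: rows are genes, columns are conditions.
# Genes 0-2 follow one profile, genes 3-5 the anticorrelated profile.
rng = np.random.default_rng(0)
base = rng.normal(size=6)
genes = np.vstack(
    [base + rng.normal(scale=0.1, size=6) for _ in range(3)]
    + [-base + rng.normal(scale=0.1, size=6) for _ in range(3)])

# SciPy's 'correlation' metric is 1 - Pearson correlation, so
# co-expressed genes sit close and anticorrelated genes sit far apart.
dist = pdist(genes, metric="correlation")

# Average linkage on rows, as in the record's Seed Augment step.
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)
```

    Cutting the tree into two clusters separates the co-expressed group from the anticorrelated one, which is the association signal a Pearson-proximity clustering is meant to recover.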

  14. An Improved Pearson's Correlation Proximity-Based Hierarchical Clustering for Mining Biological Association between Genes

    PubMed Central

    Booma, P. M.; Prabhakaran, S.; Dhanalakshmi, R.

    2014-01-01

    Microarray gene expression datasets have attracted great interest among molecular biologists, statisticians, and computer scientists. Data mining that extracts hidden and useful information from datasets often fails to identify the most significant biological associations between genes. A heuristic search for standard biological processes measures only the gene expression level, threshold, and response time. Heuristic search identifies and mines the best biological solution, but the association process is not efficiently addressed. To monitor higher rates of expression between genes, a hierarchical clustering model is proposed in which the biological association between genes is measured simultaneously using a proximity measure of improved Pearson's correlation (PCPHC). Additionally, the Seed Augment algorithm adopts average-linkage methods on rows and columns in order to expand a seed PCPHC model into a maximal global PCPHC (GL-PCPHC) model and to identify associations between the clusters. Moreover, GL-PCPHC applies a pattern-growing method to mine the PCPHC patterns. Compared to existing gene expression analyses, the PCPHC model achieves better performance. Experimental evaluations are conducted for the GL-PCPHC model with standard benchmark gene expression datasets extracted from the UCI repository and the GenBank database in terms of execution time, pattern size, significance level, biological association efficiency, and pattern quality. PMID:25136661

  15. The Bio-Logic and machinery of plant morphogenesis.

    PubMed

    Niklas, Karl J

    2003-04-01

    Morphogenesis (the development of organic form) requires signal-trafficking and cross-talking across all levels of organization to coordinate the operation of metabolic and genomic networked systems. Many biologists are currently converging on the pictorial conventions of computer scientists to render biological signaling as logic circuits supervising the operation of one or more signal-activated metabolic or gene networks. This approach can redact and simplify complex morphogenetic phenomena and allows for their aggregation into diagrams of larger, more "global" networked systems. This conceptualization is discussed in terms of how logic circuits and signal-activated subsystems work, and it is illustrated for examples of increasingly more complex morphogenetic phenomena, e.g., auxin-mediated cell expansion, entry into the mitotic cell cycle phases, and polar/lateral intercellular auxin transport. For each of these phenomena, a posited circuit/subsystem diagram draws rapid attention to missing components, either in the logic circuit or in the subsystem it supervises. These components must be identified experimentally if each of these basic phenomena is to be fully understood. Importantly, the power of the circuit/subsystem approach to modeling developmental phenomena resides not in its pictorial appeal but in the mathematical tools that are sufficiently strong to reveal and quantify the synergistics of networked systems and thus foster a better understanding of morphogenesis.

  16. High-speed multiple sequence alignment on a reconfigurable platform.

    PubMed

    Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf

    2006-01-01

    Progressive alignment is a widely used approach to compute multiple sequence alignments (MSAs). However, aligning several hundred sequences by popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to gain high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
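    The pairwise distance computations this record accelerates rest on the classic alignment dynamic-programming recurrence. As a minimal sketch (plain edit distance rather than the FPGA design's exact scoring scheme, which the record does not specify), the table the systolic array evaluates looks like this:

```python
def edit_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming distance.

    Cell (i, j) depends only on (i-1, j), (i, j-1), and (i-1, j-1),
    so every cell on one anti-diagonal is independent; that is the
    parallelism a linear systolic array exploits in hardware.
    """
    m, n = len(a), len(b)
    prev = list(range(n + 1))            # row 0: distance to empty prefix
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # match/mismatch
        prev = curr
    return prev[n]

print(edit_distance("GATTACA", "GCATGCU"))
```

    A progressive aligner runs a recurrence like this for every sequence pair to build the distance matrix that guides the alignment order, which is why the pairwise stage dominates the runtime and is the natural target for hardware acceleration.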

  17. Math and Data Exploration

    ERIC Educational Resources Information Center

    Liu, Dennis

    2010-01-01

    Biology is well suited for mathematical description, from the perfect geometry of viruses, to equations that describe the flux of ions across cellular membranes, to computationally intensive models for protein folding. For this short Web review, however, the author focuses on how mathematics helps biologists sort, evaluate, and draw conclusions…

  18. Graduate Training at the Interface of Computational and Experimental Biology: An Outcome Report from a Partnership of Volunteers between a University and a National Laboratory.

    PubMed

    von Arnim, Albrecht G; Missra, Anamika

    2017-01-01

    Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of Genome Science and Technology in partnership with the nearby Oak Ridge National Laboratory. Here, we report outcome data that attest to the program's effectiveness in graduating computationally enabled biologists for diverse careers. Among 77 PhD graduates since 2003, the majority came with traditional degrees in the biological sciences, yet two-thirds moved into computational or hybrid (computational-experimental) positions. We describe the curriculum of the program and how it has changed. We also summarize how the program seeks to establish cohesion between computational and experimental biologists. This type of program can respond flexibly and dynamically to unmet training needs. In conclusion, this study from a flagship, state-supported university may serve as a reference point for creating a stable, degree-granting, interdepartmental graduate program in computational biology and allied areas. © 2017 A. G. von Arnim and A. Missra. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  19. From Both Sides, Now: Librarians Team up with Computer Scientist to Deliver Virtual Computer-Information Literacy Instruction

    ERIC Educational Resources Information Center

    Loesch, Martha Fallahay

    2011-01-01

    Two members of the library faculty at Seton Hall University teamed up with a respected professor of mathematics and computer science to create an online course that introduces information literacy from the perspectives of both the computer scientist and the instruction librarian. This collaboration is unique in that it addresses the…

  20. VISIONET: intuitive visualisation of overlapping transcription factor networks, with applications in cardiogenic gene discovery.

    PubMed

    Nim, Hieu T; Furtado, Milena B; Costa, Mauro W; Rosenthal, Nadia A; Kitano, Hiroaki; Boyd, Sarah E

    2015-05-01

    Existing de novo software platforms have largely overlooked a valuable resource: the expertise of their intended biologist users. Typical data representations, such as long gene lists or highly dense and overlapping transcription factor networks, often hinder biologists from relating these results to their expertise. VISIONET, a streamlined visualisation tool built from experimental needs, enables biologists to transform large, dense, overlapping transcription factor networks into sparse human-readable graphs by numerical filtering. The VISIONET interface allows users without a computing background to interactively explore and filter their data, and empowers them to apply their specialist knowledge to far more complex and substantial data sets than is currently possible. Applying VISIONET to the Tbx20-Gata4 transcription factor network led to the discovery and validation of Aldh1a2, an essential developmental gene associated with various important cardiac disorders, as a healthy adult cardiac fibroblast gene co-regulated by the cardiogenic transcription factors Gata4 and Tbx20. We demonstrate, with experimental validation, the utility of VISIONET for expertise-driven gene discovery that opens new experimental directions that would not otherwise have been identified.
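    Numeric edge filtering, the sparsification idea this record describes, reduces to keeping only edges whose weight passes a user-chosen threshold. A toy sketch with hypothetical weights (GeneX, GeneY, and all numbers are invented; only the Gata4, Tbx20, and Aldh1a2 names come from the record):

```python
# Hypothetical transcription-factor/target edge weights, not
# VISIONET's actual data format or scoring.
edges = {
    ("Gata4", "Aldh1a2"): 0.92,
    ("Tbx20", "Aldh1a2"): 0.87,
    ("Gata4", "GeneX"): 0.11,   # weak edge, filtered out below
    ("Tbx20", "GeneY"): 0.05,   # weak edge, filtered out below
}

def sparsify(edges, threshold):
    """Keep only edges whose weight passes the numeric threshold."""
    return {pair: w for pair, w in edges.items() if w >= threshold}

print(sorted(sparsify(edges, 0.5)))
```

    Raising the threshold interactively prunes the dense network until only the strongest co-regulation edges remain, which is what makes the resulting graph human-readable.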

  1. Synthetic Analog and Digital Circuits for Cellular Computation and Memory

    PubMed Central

    Purcell, Oliver; Lu, Timothy K.

    2014-01-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536

  2. A Comprehensive Infrastructure for Big Data in Cancer Research: Accelerating Cancer Research and Precision Medicine

    PubMed Central

    Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.

    2017-01-01

    Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483

  3. Why Machine-Information Metaphors Are Bad for Science and Science Education

    ERIC Educational Resources Information Center

    Pigliucci, Massimo; Boudry, Maarten

    2011-01-01

    Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of "blueprints" for the construction of organisms. Likewise, cells are often characterized as "factories" and organisms themselves become analogous to machines. Accordingly, when the…

  4. WOW! Mathematics Convention: A Community Connection

    ERIC Educational Resources Information Center

    Cavazos, Rebecca R.

    2014-01-01

    This article details how certain mathematical "discoveries" that Cavazos' fourth graders made were recorded throughout the year. Cavazos invited a math professor, a biologist, a literacy professor, a chemist, a statistician, and an engineering student, as well as their school principal and computer lab technician, both of whom are…

  5. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  6. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    PubMed

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians, and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphical workflow design system Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be performed effectively in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, the paper demonstrates the usefulness of a graphical user interface system for designing and executing complex distributed workflows for large-scale phyloinformatics studies of virus genes, and in particular the utility of the Quascade platform for deploying distributed and parallelized versions of a variety of computationally intensive phylogenetic algorithms. Secondly, the analysis of the H5N1 neuraminidase datasets at macro and micro levels provides valuable insights into this virus's tendency toward clustering based on geographical distribution rather than temporal or host-range-based clustering, and shows the importance of glycan sites in its molecular evolution. The current study thus demonstrates the efficiency and utility of workflow systems in providing a biologist-friendly approach to complex biological dataset analysis using high-performance computing.

  7. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is often contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "Can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototype system to realize this vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure has been established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  8. Living with cracks: Damage and repair in human bone

    NASA Astrophysics Data System (ADS)

    Taylor, David; Hazenberg, Jan G.; Lee, T. Clive

    2007-04-01

    Our bones are full of cracks, which form and grow as a result of daily loading activities. Bone is the major structural material in our bodies. Although weaker than many engineering materials, it has one trick that keeps it ahead - it can repair itself. Small cracks, which grow under cyclic stresses by the mechanism of fatigue, can be detected and removed before they become long enough to be dangerous. This article reviews the work that has been done to understand how cracks form and grow in bone, and how they can be detected and repaired in a timely manner. This is truly an interdisciplinary research field, requiring the close cooperation of materials scientists, biologists and engineers.

  9. Jalal A. Aliyev (1928-2016): a great scientist, a great teacher and a great human being.

    PubMed

    Huseynova, Irada M; Allakhverdiev, Suleyman I; Govindjee

    2016-06-01

    Jalal A. Aliyev was a distinguished and respected plant biologist of our time, a great teacher, and great human being. He was a pioneer of photosynthesis research in Azerbaijan. Almost up to the end of his life, he was deeply engaged in research. His work on the productivity of wheat, and biochemistry, genetics and molecular biology of gram (chick pea) are some of his important legacies. He left us on February 1, 2016, but many around the world remember him as he was engaged in international dialog on solving global issues, and in supporting international conferences on ''Photosynthesis Research for Sustainability" in 2011 and 2013.

  10. The Scientist as Illustrator.

    PubMed

    Iwasa, Janet H

    2016-04-01

    Proficiency in art and illustration was once considered an essential skill for biologists, because text alone often could not suffice to describe observations of biological systems. With modern imaging technology, it is no longer necessary to illustrate what we can see by eye. However, in molecular and cellular biology, our understanding of biological processes is dependent on our ability to synthesize diverse data to generate a hypothesis. Creating visual models of these hypotheses is important for generating new ideas and for communicating to our peers and to the public. Here, I discuss the benefits of creating visual models in molecular and cellular biology and consider steps to enable researchers to become more effective visual communicators. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Health: The No-Man's-Land Between Physics and Biology.

    PubMed

    Mansfield, Peter J

    2015-10-01

    Health as a positive attribute is poorly understood because understanding requires concepts from physics, of which physicians and other life scientists have a very poor grasp. This paper reviews the physics that bears on biology, in particular complex quaternions and scalar fields, relates these to the morphogenetic fields proposed by biologists, and defines health as an attribute of living action within these fields. The distinction of quality, as juxtaposed with quantity, proves essential. Its basic properties are set out, but a science and mathematics of quality are awaited. The implications of this model are discussed, particularly as proper health enhancement could set a natural limit to demand for, and therefore the cost of, medical services.

  12. Center for computation and visualization of geometric structures. Final report, 1992 - 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report describes the overall goals and the accomplishments of the Geometry Center of the University of Minnesota, whose mission is to develop, support, and promote computational tools for visualizing geometric structures, for facilitating communication among mathematical and computer scientists and between these scientists and the public at large, and for stimulating research in geometry.

  13. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly owing to the boom in internet and information technology services. The bioinformatics sector of India has been transforming rapidly, creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating in projects such as database development, sequence analysis, genomic prospecting, and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, and conferences and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  14. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    PubMed

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences, or with small circles of the interested, has value in almost any aspect of social life. But what is the value, exactly, for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts, and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information-sharing practices of a scientific community. Additionally, we aim to provide a deeper understanding of the role and impact of altmetrics in computer science and to glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link-sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists' style of usage leans toward the information-oriented side and, to some degree, professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.

  15. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  16. Human computers: the first pioneers of the information age.

    PubMed

    Grier, D A

    2001-03-01

    Before computers were machines, they were people. They were men and women, young and old, well educated and common. They were the workers who convinced scientists that large-scale calculation had value. Long before Presper Eckert and John Mauchly built the ENIAC at the Moore School of Electrical Engineering in Philadelphia, or Maurice Wilkes designed the EDSAC at Cambridge University, human computers had created the discipline of computation. They developed numerical methodologies and proved them on practical problems. These human computers were not savants or calculating geniuses. Some knew little more than basic arithmetic. A few were near equals of the scientists they served and, in a different time or place, might have become practicing scientists had they not been barred from a scientific career by their class, education, gender, or ethnicity.

  18. Newspaper Coverage of Biological Subissues in the Spotted Owl Debate, 1989-1993.

    ERIC Educational Resources Information Center

    Furlow, F. Bryant

    1994-01-01

    Computer archives for 27 U.S. daily newspapers were accessed to evaluate the levels of coverage for 5 biological subissues and 5 ecological concepts in articles on the spotted owl debate. The author identified a correlation between biologist/reporter contact and increased biological subissue and ecological concepts coverage. (LZ)

  19. Building place-based collaborations to develop high school students' groundwater systems knowledge and decision-making capacity

    NASA Astrophysics Data System (ADS)

    Podrasky, A.; Covitt, B. A.; Woessner, W.

    2017-12-01

    The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data-based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data-based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among university scientists, researchers and educators, high school teachers, and agency and industry scientists and engineers.
In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data-based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real-world issues such as groundwater contamination. These characteristics are that science education experiences be real, responsive/accessible, and rigorous.

  20. Are ecological and evolutionary theories scientific?

    PubMed

    Murray, B G

    2001-05-01

    Scientists observe nature, search for generalizations, and provide explanations for why the world is as it is. Generalizations are of two kinds. The first are descriptive and inductive, such as Boyle's Law. They are derived from observations and therefore refer to observables (in this case, pressure and volume). The second are often imaginative and form the axioms of a deductive theory, such as Newton's Laws of Motion. They often refer to unobservables (e.g. inertia and gravitation). Biology has many inductive generalizations (e.g. Bergmann's Rule and 'all cells arise from preexisting cells') but few, if any, recognized universal laws and virtually no deductive theory. Many biologists and philosophers of biology have agreed that predictive theory is inappropriate in biology, which is said to be more complex than physics, and that one can have nonpredictive explanations, such as the neo-Darwinian Theory of Evolution by Natural Selection. Other philosophers dismiss nonpredictive, explanatory theories, including evolutionary 'theory', as metaphysics. Most biologists do not think of themselves as philosophers or give much thought to the philosophical basis of their research. Nevertheless, their philosophy shows in the way they do research. The plethora of ad hoc (i.e. not universal) hypotheses indicates that biologists are reluctant inductivists in that the search for generalization does not have a high priority. Biologists test their hypotheses by verification. Theoretical physicists, in contrast, are deductive unifiers and test their explanatory hypotheses by falsification. I argue that theoretical biology (concerned with unobservables, such as fitness and natural selection) is not scientific because it lacks universal laws and predictive theory. In order to make this argument, I review the differences between verificationism and falsificationism, induction and deduction, and descriptive and explanatory laws. 
I show how these differ with a specific example of a successful and still useful (even if now superseded as explanatory) deductive theory, Newton's Theory of Motion. I also review some of the philosophical views expressed on these topics because philosophers seem to be even more divided than biologists, which is not at all helpful. The fact that biology does not have predictive theories does not constitute irrefutable evidence that it cannot have them. The only way to falsify this philosophical hypothesis, however, is to produce a predictive theory with universal biological laws. I have proposed such a theory, but it has been presented piecemeal. At the end of this paper, I bring the pieces together into a deductive theory on the evolution of life history traits (e.g. clutch size, mating relationships, sexual size dimorphism).

  1. Synthetic analog and digital circuits for cellular computation and memory.

    PubMed

    Purcell, Oliver; Lu, Timothy K

    2014-10-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
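The cellular memory this review describes is often illustrated with the genetic toggle switch: two repressors that mutually inhibit each other's expression, giving two stable states that store one bit. As a hedged sketch (the equations are the textbook mutual-repression model with illustrative parameters, not values taken from this review), a minimal simulation might look like:

```python
# Minimal sketch of a two-gene toggle switch, the classic synthetic-biology
# "1-bit memory" element. Each repressor shuts off the other's promoter
# (Hill kinetics), so the circuit settles into one of two stable states
# depending on which side starts high. Parameters are illustrative.

def simulate_toggle(u0, v0, alpha=10.0, n=2.0, dt=0.01, steps=5000):
    """Euler-integrate du/dt = alpha/(1+v^n) - u, dv/dt = alpha/(1+u^n) - v."""
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1.0 + v ** n) - u
        dv = alpha / (1.0 + u ** n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

# The circuit "remembers" whichever repressor started high.
u_hi, v_lo = simulate_toggle(u0=5.0, v0=0.0)   # settles with u high, v low
u_lo, v_hi = simulate_toggle(u0=0.0, v0=5.0)   # settles with v high, u low
print(u_hi > v_lo, v_hi > u_lo)
```

Flipping the stored bit corresponds to transiently inducing the currently repressed gene, which is how such switches are set and reset experimentally.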

  2. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  3. Northern Prairie Wildlife Research Center

    USGS Publications Warehouse

    ,

    2009-01-01

    The Northern Prairie Wildlife Research Center (NPWRC) conducts integrated research to fulfill the Department of the Interior's responsibilities to the Nation's natural resources. Located on 600 acres along the James River Valley near Jamestown, North Dakota, the NPWRC develops and disseminates scientific information needed to understand, conserve, and wisely manage the Nation's biological resources. Research emphasis is primarily on midcontinental plant and animal species and ecosystems of the United States. During the center's 40-year history, its scientists have earned an international reputation for leadership and expertise on the biology of waterfowl and grassland birds, wetland ecology and classification, mammalian behavior and ecology, grassland ecosystems, and application of statistics and geographic information systems. To address current science challenges, NPWRC scientists collaborate with researchers from other U.S. Geological Survey centers and disciplines (Biology, Geography, Geology, and Water) and with biologists and managers in the Department of the Interior (DOI), other Federal agencies, State agencies, universities, and nongovernmental organizations. Expanding upon its scientific expertise and leadership, the NPWRC is moving in new directions, including invasive plant species, restoration of native habitats, carbon sequestration and marketing, and ungulate management on DOI lands.

  4. Are there ecological limits to population?

    PubMed Central

    Keyfitz, N

    1993-01-01

    Policy on population and environment in the United States and abroad has been vacillating, unsure of its course; it would be more decisive if the several disciplines could agree on the nature of the problems and their urgency. The two disciplines principally concerned are biology and economics, and the contribution of this paper is to identify eight of the many axes or directions on which the methods and traditions of the two are different. For example, the first of the axes runs between contingency and orderly progress, with biology tending to seek out the former and economics the latter; thus biologists can more easily comprehend catastrophes, such as the demise of the dinosaurs or widespread desertification. The third axis concerns indefinite market-driven substitutability, seen by economists as resulting from scientific discovery; natural scientists, including biologists, whose discoveries make possible the substitutions, are skeptical. Axis 7 results from the fact that economics concentrates on goods that are on the market, and so deals with a truncated part of the commodity cycle, while ecology aims at the whole; because goods disappear from economic statistics once they pass into the hands of consumers many of their ecological effects are invisible. I believe that from similar further study of the two disciplines a common set of policy recommendations will ultimately emerge. PMID:8346195

  5. How Can We Improve Problem Solving in Undergraduate Biology? Applying Lessons from 30 Years of Physics Education Research

    PubMed Central

    Hoskinson, A.-M.; Caballero, M. D.; Knight, J. K.

    2013-01-01

    If students are to successfully grapple with authentic, complex biological problems as scientists and citizens, they need practice solving such problems during their undergraduate years. Physics education researchers have investigated student problem solving for the past three decades. Although physics and biology problems differ in structure and content, the instructional purposes align closely: explaining patterns and processes in the natural world and making predictions about physical and biological systems. In this paper, we discuss how research-supported approaches developed by physics education researchers can be adopted by biologists to enhance student problem-solving skills. First, we compare the problems that biology students are typically asked to solve with authentic, complex problems. We then describe the development of research-validated physics curricula emphasizing process skills in problem solving. We show that solving authentic, complex biology problems requires many of the same skills that practicing physicists and biologists use in representing problems, seeking relationships, making predictions, and verifying or checking solutions. We assert that acquiring these skills can help biology students become competent problem solvers. Finally, we propose how biology scholars can apply lessons from physics education in their classrooms and inspire new studies in biology education research. PMID:23737623

  6. Assemble worldwide biologists in a network construct a web services based architecture for bioinformatics.

    PubMed

    Tao, Yuan; Liu, Juan

    2005-01-01

    The Internet has shrunk our working and living world into a very small scope, giving rise to the concept of the Earth Village, in which people can communicate and work together though thousands of miles apart. This paper describes a prototype, an Earth Lab for bioinformatics, built on a Web services framework to provide a network architecture for bioinformatics research, one that lets biologists worldwide easily carry out enormous, complex processes and effectively share and access computing resources and data, regardless of how heterogeneous the data formats are and how decentralized and distributed these resources are around the world. A small, simplified example scenario is then given to illustrate the prototype.

  7. Message from the ISCB: 2015 ISCB Accomplishment by a Senior Scientist Award: Cyrus Chothia.

    PubMed

    Fogg, Christiana N; Kovats, Diane E

    2015-07-01

    The International Society for Computational Biology (ISCB; http://www.iscb.org) honors a senior scientist annually for his or her outstanding achievements with the ISCB Accomplishment by a Senior Scientist Award. This award recognizes a leader in the field of computational biology for his or her significant contributions to the community through research, service and education. Cyrus Chothia, an emeritus scientist at the Medical Research Council Laboratory of Molecular Biology and emeritus fellow of Wolfson College at Cambridge University, England, is the 2015 ISCB Accomplishment by a Senior Scientist Award winner. Chothia was selected by the Awards Committee, which is chaired by Dr Bonnie Berger of the Massachusetts Institute of Technology. He will receive his award and deliver a keynote presentation at the 2015 Intelligent Systems for Molecular Biology/European Conference on Computational Biology in Dublin, Ireland, in July 2015. © The Author 2015. Published by Oxford University Press. All rights reserved.

  8. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly related to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists research often are limited by computing power, the NCCS continually pursues the latest computing, mass storage, and networking technologies. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  9. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  10. CREASE 6.0 Catalog of Resources for Education in Ada and Software Engineering

    DTIC Science & Technology

    1992-02-01

    Programming, Software Engineering, Strong Typing, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J., Programming in Ada, 3rd ed., Addison-Wesley...Ada. Concept: Abstract Data Types, Management Overview, Package, Real-Time Programming, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J

  11. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
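Many of the genomic workloads the review surveys are embarrassingly parallel: each read or record can be scored independently, which is what lets HPC environments cut weeks of workstation time down to hours. A minimal sketch (the reads and the gc_content helper are illustrative stand-ins, not from the article) using Python's standard multiprocessing pool:

```python
# Sketch of data-parallel per-read analysis: each read is an independent
# unit of work, so a process pool can scatter reads across cores and
# gather the results.
from multiprocessing import Pool

def gc_content(seq):
    """GC fraction of one DNA read."""
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    reads = ["ATGCGC", "ATATAT", "GGGCCC", "ATGGCA"]  # stand-in for millions of reads
    with Pool(processes=4) as pool:                   # one worker process per core
        fractions = pool.map(gc_content, reads)       # scatter reads, gather results
    print(fractions)
```

On a real data set the same pattern scales by streaming reads in chunks (e.g. with `pool.imap_unordered`) so the whole file never has to sit in memory at once.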

  13. Homo Sapiens as Geological Agents

    NASA Astrophysics Data System (ADS)

    Holloway, T.; Bedsworth, L. W.; Caldeira, K.; Rosenzweig, C.; Kelley, G.; Purdy, J. S.; Vince, G.; Syvitski, J. A.; Bondre, N. R.; Kelly, J.; Seto, K. C.; Steffen, W.; Oreskes, N.

    2015-12-01

    In the 18th and 19th centuries, earth scientists came to understand the magnitude and power of geological and geophysical processes. In comparison, the activities of humans seemed paltry if not insignificant. With the development of radiometric dating in the 20th century, scientists realized that human history was but a miniscule part of Earth history. Metaphors to this effect abounded, and filled textbooks: If Earth history were a 24-hour day, human history would not occupy even the final second. If Earth history were a yardstick, the human portion would not even be visible to the naked eye. Generations of scientists were taught that one of the principal contributions of geology, qua science, was the demonstration of our insignificance. The Anthropocene concept disrupts this. To affirm its existence is to insist that human activities compete in scale and significance with other Earth processes, and may threaten to overwhelm them. It also inverts our relation to normative claims. For more than a century earth scientists and evolutionary biologists insisted that their theories were descriptive and not normative—that there was no moral conclusion to be drawn from either planetary or human evolution. Now, we confront the suggestion that there is a moral component to our new paradigm: we can scarcely claim that humans are disrupting the climate, destroying biodiversity, and acidifying the oceans without implying that there is something troubling about these developments. Thus, the Anthropocene concept suggests both a radical redefinition of the scope of Earth science, and a radical reconsideration of the place of normative judgments in scientific work.

  14. An economic and financial exploratory

    NASA Astrophysics Data System (ADS)

    Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.

    2012-11-01

    This paper describes the vision of a European Exploratory for economics and finance using an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers, who will combine their expertise to address the enormous challenges of the 21st century. This academic public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities, and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists who will collaborate in investigating major issues in economics and finance. It is also intended as a cradle for training and for collaboration with the private sector, to spur spin-offs and job creation in Europe in the finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of the economies of countries in their various components, (ii) use, extend and develop a large variety of methods, including data mining, process mining, computational and artificial intelligence and other computer science and complexity science techniques, coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analyses, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.

  15. Award-Winning Animation Helps Scientists See Nature at Work | News | NREL

    Science.gov Websites

    Scientists See Nature at Work August 8, 2008 A computer-aided image combines a photo of a man with a three-dimensional, computer-generated image. The man has long brown hair and a long beard. He is wearing a blue - simultaneously. "It is very difficult to parallelize the process to run even on a huge computer,"

  16. From Years of Work in Psychology and Computer Science, Scientists Build Theories of Thinking and Learning.

    ERIC Educational Resources Information Center

    Wheeler, David L.

    1988-01-01

    Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is lacking or not yet available. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  18. Audubon Wildlife Adventures. Grizzly Guidebook. School Edition.

    ERIC Educational Resources Information Center

    National Audubon Society, Washington, DC.

    This program introduces the young computer players to the world of the grizzly bear, the largest land carnivore in North America. Through a series of four interactive stories, players learn of the bear's habits and human activities that have brought it close to extinction. Playing the part of a park ranger, a research biologist or a natural…

  19. From Glass Flowers to Computer Games: Examining the Emergent Media Practices of Plant Biologists

    ERIC Educational Resources Information Center

    Reitmeyer, Morgan

    2011-01-01

    The goal of this project is to begin investigating the emergent media practices of current academic disciplines. This dissertation posits that Writing Across the Curriculum (WAC) scholars have investigated new media use in undergraduate pedagogy, and to some extent the practices of graduate students. However, WAC scholars have yet to try to…

  20. Semantic knowledge for histopathological image analysis: from ontologies to processing portals and deep learning

    NASA Astrophysics Data System (ADS)

    Kergosien, Yannick L.; Racoceanu, Daniel

    2017-11-01

    This article presents our vision of the next generation of challenges in computational/digital pathology. The key role of the domain ontology, developed in a sustainable manner (i.e., using reference checklists and protocols as living semantic repositories), opens the way to effective, sustainable traceability and relevance feedback for existing machine learning algorithms, which have proven very performant in the latest digital pathology challenges (i.e., convolutional neural networks). The ability to work in an accessible web-service environment, with strictly controlled handling of intellectual property (image and data processing/analysis algorithms) and medical data/image confidentiality, is essential for the future. Among the web services involved in the proposed approach, the living yellow pages in the area of computational pathology seem particularly important for reaching operational awareness, validation, and feasibility. This represents a very promising path to the next generation of tools, able to bring more guidance to computer scientists and more confidence to pathologists, towards effective, efficient daily use. Moreover, consistent feedback and insights are likely to emerge in the near future from these sophisticated machine learning tools back to the pathologists, strengthening the interaction between the different actors of a sustainable biomedical ecosystem (patients, clinicians, biologists, engineers, scientists, etc.). Besides going digital/computational, with virtual slide technology demanding new workflows, pathology must prepare for another coming revolution: semantic web technologies now enable the knowledge of experts to be stored in databases, shared through the Internet, and accessed by machines. Traceability, disambiguation of reports, quality monitoring, and interoperability between health centers are some of the associated benefits that pathologists have been seeking.
However, major changes are also to be expected in the relation of human diagnosis to machine-based procedures. Improving on a former imaging platform that used a local knowledge base and a reasoning engine to combine image processing modules into higher-level tasks, we propose a framework in which different actors of the histopathology imaging world can cooperate using web services, exchanging knowledge as well as imaging services, and in which the results of such collaborations on diagnostic tasks can be evaluated in international challenges such as those recently organized for mitosis detection, nuclear atypia, or tissue architecture in the context of cancer grading. This framework is likely to offer effective context guidance and traceability to deep learning approaches, with a promising perspective given by the multi-task learning (MTL) paradigm, distinguished by its applicability to several different learning algorithms, its non-reliance on specialized architectures, and the promising results demonstrated, in particular on the problem of weak supervision, an issue that arises when direct links from pathology terms in reports to corresponding regions within images are missing.

  1. "Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2

    ScienceCinema

    Catlett, Charlie

    2018-02-14

    A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!

  3. Taming the BEAST—A Community Teaching Material Resource for BEAST 2

    PubMed Central

    Barido-Sottani, Joëlle; Bošková, Veronika; Plessis, Louis Du; Kühnert, Denise; Magnus, Carsten; Mitov, Venelin; Müller, Nicola F.; Pečerska, Jūlija; Rasmussen, David A.; Zhang, Chi; Drummond, Alexei J.; Heath, Tracy A.; Pybus, Oliver G.; Vaughan, Timothy G.; Stadler, Tanja

    2018-01-01

    Abstract Phylogenetics and phylodynamics are central topics in modern evolutionary biology. Phylogenetic methods reconstruct the evolutionary relationships among organisms, whereas phylodynamic approaches reveal the underlying diversification processes that lead to the observed relationships. These two fields have many practical applications in disciplines as diverse as epidemiology, developmental biology, palaeontology, ecology, and linguistics. The combination of increasingly large genetic data sets and increases in computing power is facilitating the development of more sophisticated phylogenetic and phylodynamic methods. Big data sets allow us to answer complex questions. However, since the required analyses are highly specific to the particular data set and question, a black-box method is not sufficient anymore. Instead, biologists are required to be actively involved with modeling decisions during data analysis. The modular design of the Bayesian phylogenetic software package BEAST 2 enables, and in fact enforces, this involvement. At the same time, the modular design enables computational biology groups to develop new methods at a rapid rate. A thorough understanding of the models and algorithms used by inference software is a critical prerequisite for successful hypothesis formulation and assessment. In particular, there is a need for more readily available resources aimed at helping interested scientists equip themselves with the skills to confidently use cutting-edge phylogenetic analysis software. These resources will also benefit researchers who do not have access to similar courses or training at their home institutions. Here, we introduce the “Taming the Beast” (https://taming-the-beast.github.io/) resource, which was developed as part of a workshop series bearing the same name, to facilitate the usage of the Bayesian phylogenetic software package BEAST 2. PMID:28673048

  4. From Lived Experiences to Game Creation: How Scaffolding Supports Elementary School Students Learning Computer Science Principles in an After School Setting

    ERIC Educational Resources Information Center

    Her Many Horses, Ian

    2016-01-01

    The world, and especially our own country, is in dire need of a larger and more diverse population of computer scientists. While many organizations have approached this problem of too few computer scientists in various ways, a promising, and I believe necessary, path is to expose elementary students to authentic practices of the discipline.…

  5. Maine Tidal Power Initiative: Environmental Impact Protocols For Tidal Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Michael Leroy; Zydlewski, Gayle Barbin; Xue, Huijie

    2014-02-02

    The Maine Tidal Power Initiative (MTPI), an interdisciplinary group of engineers, biologists, oceanographers, and social scientists, has been conducting research to evaluate tidal energy resources and better understand the potential effects and impacts of marine hydro-kinetic (MHK) development on the environment and local community. Project efforts include: 1) resource assessment, 2) development of initial device design parameters using scale model tests, 3) baseline environmental studies and monitoring, and 4) human and community responses. This work included in-situ measurement of the environmental and social response to the pre-commercial Turbine Generator Unit (TGU®) developed by Ocean Renewable Power Company (ORPC) as well as considering the path forward for smaller community scale projects.

  6. Physical Biology of the Materials-Microorganism Interface.

    PubMed

    Sakimoto, Kelsey K; Kornienko, Nikolay; Cestellos-Blanco, Stefano; Lim, Jongwoo; Liu, Chong; Yang, Peidong

    2018-02-14

    Future solar-to-chemical production will rely upon a deep understanding of the material-microorganism interface. Hybrid technologies, which combine inorganic semiconductor light harvesters with biological catalysis to transform light, air, and water into chemicals, already demonstrate a wide product scope and energy efficiencies surpassing that of natural photosynthesis. But optimization to economic competitiveness and fundamental curiosity beg for answers to two basic questions: (1) how do materials transfer energy and charge to microorganisms, and (2) how do we design for bio- and chemocompatibility between these seemingly unnatural partners? This Perspective highlights the state-of-the-art and outlines future research paths to inform the cadre of spectroscopists, electrochemists, bioinorganic chemists, material scientists, and biologists who will ultimately solve these mysteries.

  7. From bench to patient: model systems in drug discovery.

    PubMed

    Breyer, Matthew D; Look, A Thomas; Cifra, Alessandra

    2015-10-01

    Model systems, including laboratory animals, microorganisms, and cell- and tissue-based systems, are central to the discovery and development of new and better drugs for the treatment of human disease. In this issue, Disease Models & Mechanisms launches a Special Collection that illustrates the contribution of model systems to drug discovery and optimisation across multiple disease areas. This collection includes reviews, Editorials, interviews with leading scientists with a foot in both academia and industry, and original research articles reporting new and important insights into disease therapeutics. This Editorial provides a summary of the collection's current contents, highlighting the impact of multiple model systems in moving new discoveries from the laboratory bench to the patients' bedsides. © 2015. Published by The Company of Biologists Ltd.

  8. A global approach to analysis and interpretation of metabolic data for plant natural product discovery.

    PubMed

    Hur, Manhoi; Campbell, Alexis Ann; Almeida-de-Macedo, Marcia; Li, Ling; Ransom, Nick; Jose, Adarsh; Crispin, Matt; Nikolau, Basil J; Wurtele, Eve Syrkin

    2013-04-01

    Discovering molecular components and their functionality is key to the development of hypotheses concerning the organization and regulation of metabolic networks. The iterative experimental testing of such hypotheses is the trajectory that can ultimately enable accurate computational modelling and prediction of metabolic outcomes. This information can be particularly important for understanding the biology of natural products, whose metabolism itself is often only poorly defined. Here, we describe factors that must be in place to optimize the use of metabolomics in predictive biology. A key to achieving this vision is a collection of accurate time-resolved and spatially defined metabolite abundance data and associated metadata. One formidable challenge associated with metabolite profiling is the complexity and analytical limits associated with comprehensively determining the metabolome of an organism. Further, for metabolomics data to be efficiently used by the research community, it must be curated in publicly available metabolomics databases. Such databases require clear, consistent formats, easy access to data and metadata, data download, and accessible computational tools to integrate genome system-scale datasets. Although transcriptomics and proteomics integrate the linear predictive power of the genome, the metabolome represents the nonlinear, final biochemical products of the genome, which results from the intricate system(s) that regulate genome expression. For example, the relationship of metabolomics data to the metabolic network is confounded by redundant connections between metabolites and gene-products. However, connections among metabolites are predictable through the rules of chemistry. Therefore, enhancing the ability to integrate the metabolome with anchor-points in the transcriptome and proteome will enhance the predictive power of genomics data. 
We detail a public database repository for metabolomics, tools and approaches for statistical analysis of metabolomics data, and methods for integrating these datasets with transcriptomic data to create hypotheses concerning specialized metabolisms that generate the diversity in natural product chemistry. We discuss the importance of close collaborations among biologists, chemists, computer scientists and statisticians throughout the development of such integrated metabolism-centric databases and software.

  9. A global approach to analysis and interpretation of metabolic data for plant natural product discovery†

    PubMed Central

    Hur, Manhoi; Campbell, Alexis Ann; Almeida-de-Macedo, Marcia; Li, Ling; Ransom, Nick; Jose, Adarsh; Crispin, Matt; Nikolau, Basil J.

    2013-01-01

    Discovering molecular components and their functionality is key to the development of hypotheses concerning the organization and regulation of metabolic networks. The iterative experimental testing of such hypotheses is the trajectory that can ultimately enable accurate computational modelling and prediction of metabolic outcomes. This information can be particularly important for understanding the biology of natural products, whose metabolism itself is often only poorly defined. Here, we describe factors that must be in place to optimize the use of metabolomics in predictive biology. A key to achieving this vision is a collection of accurate time-resolved and spatially defined metabolite abundance data and associated metadata. One formidable challenge associated with metabolite profiling is the complexity and analytical limits associated with comprehensively determining the metabolome of an organism. Further, for metabolomics data to be efficiently used by the research community, it must be curated in publicly available metabolomics databases. Such databases require clear, consistent formats, easy access to data and metadata, data download, and accessible computational tools to integrate genome system-scale datasets. Although transcriptomics and proteomics integrate the linear predictive power of the genome, the metabolome represents the nonlinear, final biochemical products of the genome, which results from the intricate system(s) that regulate genome expression. For example, the relationship of metabolomics data to the metabolic network is confounded by redundant connections between metabolites and gene-products. However, connections among metabolites are predictable through the rules of chemistry. Therefore, enhancing the ability to integrate the metabolome with anchor-points in the transcriptome and proteome will enhance the predictive power of genomics data. 
We detail a public database repository for metabolomics, tools and approaches for statistical analysis of metabolomics data, and methods for integrating these datasets with transcriptomic data to create hypotheses concerning specialized metabolism that generates the diversity in natural product chemistry. We discuss the importance of close collaborations among biologists, chemists, computer scientists and statisticians throughout the development of such integrated metabolism-centric databases and software. PMID:23447050

  10. PURSUING AN EXCITING CAREER AS A WILDLIFE BIOLOGIST

    EPA Science Inventory

    Many people assume a career as a biologist is similar to what TV stars such as Jeff Corwin do. Although biologists do get to do exciting things like those viewers see on TV, being a biologist involves much more. I will talk about my career as a biologist, discuss experiences that you...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayer, Vidya M.; Miguez, Sheila; Toby, Brian H.

    Scientists have been central to the historical development of the computer industry, but the importance of software only continues to grow for all areas of scientific research, and in particular for powder diffraction. Knowing how to program a computer is a basic and useful skill for scientists. The article introduces the three types of programming languages and explains why scripting languages are now preferred by scientists. Of these, the authors assert, Python is the most useful and the easiest to learn. Python is introduced, along with an overview of a few of the many add-on packages available to extend the capabilities of Python, for example for numerical computations, scientific graphics, and graphical user interface programming.
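
    As a flavor of what such add-on packages enable, here is a minimal NumPy sketch; the simulated diffraction pattern and every number in it are invented purely for illustration, not taken from the article:

```python
import numpy as np

# Simulated 1-D powder diffraction pattern: one Gaussian peak on a flat background.
two_theta = np.linspace(10.0, 60.0, 501)          # scattering angle (degrees)
background = 3.0
pattern = background + 40.0 * np.exp(-((two_theta - 32.0) / 0.4) ** 2)

# Vectorized analysis, no explicit Python loops:
peak_pos = two_theta[np.argmax(pattern)]          # angle of maximum intensity
step = two_theta[1] - two_theta[0]
area = float(np.sum(pattern - background)) * step  # integrated peak intensity
```

    Plotting the same arrays with a scientific graphics package such as matplotlib takes only a couple more lines, which is exactly the convenience the authors highlight.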

  12. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves capturing data digitally, direct from instruments, processing it by computer, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software, and compute resources to be linked by seamless workflows, is opening the processing of high volumes of data to an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These technology developments require that scientists become more skilled in data management and/or more computer literate. Almost every science discipline now has an X-informatics branch and a computational X branch (e.g., geoinformatics and computational geoscience): both require a new breed of researcher with skills in the science fundamentals as well as knowledge of ICT (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. 
Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those tackling current grand challenges at an international level on issues such as climate change, hazard prediction, and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains, they are likely to migrate to either a science role or an ICT role as their careers advance. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who do persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  13. Pixel Perfect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perrine, Kenneth A.; Hopkins, Derek F.; Lamarche, Brian L.

    2005-09-01

    Biologists and computer engineers at Pacific Northwest National Laboratory have specified, designed, and implemented a hardware/software system for performing real-time, multispectral image processing on a confocal microscope. This solution is intended to extend the capabilities of the microscope, enabling scientists to conduct advanced experiments on cell signaling and other kinds of protein interactions. FRET (fluorescence resonance energy transfer) techniques are used to locate and monitor protein activity. In FRET, it is critical that spectral images be precisely aligned with each other despite disturbances in the physical imaging path caused by imperfections in lenses and cameras, and expansion and contraction of materials due to temperature changes. The central importance of this work is therefore automatic image registration. This runs in a framework that guarantees real-time performance (processing pairs of 1024x1024, 8-bit images at 15 frames per second) and enables the addition of other types of advanced image processing algorithms such as image feature characterization. The supporting system architecture consists of a Visual Basic front-end containing a series of on-screen interfaces for controlling various aspects of the microscope and a script engine for automation. One of the controls is an ActiveX component written in C++ for handling the control and transfer of images. This component interfaces with a pair of LVDS image capture boards and a PCI board containing a 6-million gate Xilinx Virtex-II FPGA. Several types of image processing are performed on the FPGA in a pipelined fashion, including the image registration. The FPGA offloads work that would otherwise need to be performed by the main CPU and has a guaranteed real-time throughput. Image registration is performed in the FPGA by applying a cubic warp on one image to precisely align it with the other image. 
Before each experiment, an automated calibration procedure is run in order to set up the cubic warp. During image acquisitions, the cubic warp is evaluated by way of forward differencing. Unwanted pixelation artifacts are minimized by bilinear sampling. The resulting system is state-of-the-art for biological imaging. Precisely registered images enable the reliable use of FRET techniques. In addition, real-time image processing performance allows computed images to be fed back and displayed to scientists immediately, and the pipelined nature of the FPGA allows additional image processing algorithms to be incorporated into the system without slowing throughput.
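
    The forward-differencing trick mentioned in the abstract can be sketched in plain Python. This is a 1-D illustrative reduction under my own assumptions (the actual system evaluates 2-D cubic warps on image coordinates in FPGA hardware, and the function name here is invented):

```python
def cubic_forward_difference(a3, a2, a1, a0, n, h=1.0):
    """Evaluate p(x) = a3*x^3 + a2*x^2 + a1*x + a0 at x = 0, h, 2h, ...
    using only additions per sample (forward differencing)."""
    # Initial value and forward differences of p at x = 0 for step size h.
    p = a0
    d1 = a3 * h**3 + a2 * h**2 + a1 * h   # first difference: p(h) - p(0)
    d2 = 6 * a3 * h**3 + 2 * a2 * h**2    # second difference
    d3 = 6 * a3 * h**3                    # third difference (constant for a cubic)
    out = []
    for _ in range(n):
        out.append(p)
        p += d1      # advance the polynomial value
        d1 += d2     # advance the running differences
        d2 += d3
    return out
```

    Each output sample costs three additions and no multiplications, which is why the technique suits a pipelined FPGA implementation.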

  14. How to Cloud for Earth Scientists: An Introduction

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2018-01-01

    This presentation is a tutorial on getting started with cloud computing for the purposes of Earth Observation datasets. We first discuss some of the main advantages that cloud computing can provide for the Earth scientist: copious processing power, immense and affordable data storage, and rapid startup time. We also talk about some of the challenges of getting the most out of cloud computing: re-organizing the way data are analyzed, handling node failures and attending.

  15. A Tale of Two scientists and their Involvement in Education & Outreach

    NASA Astrophysics Data System (ADS)

    McDonnell, J.

    2004-12-01

    Many scientists, when faced with developing an education and outreach plan for their research proposals, are unclear on what kinds of impacts they can have on broader, non-scientist audiences. Many feel their only options are to develop a website or to invite a teacher to join their sampling or research cruises. Scientists, who are constrained by time and resources, are often unaware of the range of education and outreach options available to them and of the great value their involvement can bring to the public. In a recent survey at the National Science Foundation-sponsored ORION conference (January 2004), respondents stated that the greatest public benefits of having scientists involved in public education are (1) that they can present the benefits and relevance of research (26%), (2) focus awareness on environmental issues (26%), (3) serve as models for teachers and motivators for children (25%), and (4) increase public understanding, awareness, and appreciation of science (about 22%). As a member of the Mid-Atlantic Center for Ocean Sciences Education Excellence (MACOSEE), the Institute of Marine & Coastal Sciences (IMCS) at Rutgers University is dedicated to helping scientists and educators realize the benefits of working together to advance ocean discovery and make known the vital role of the ocean in our lives. A website called "Scientist Connection" (www.macosee.net) was developed to help busy scientists choose a role in education and outreach that will make the most of their talent and time. The goal of the web site is to help scientists produce a worthwhile education project that complements and enriches their research. In this session, the author will present two case studies that demonstrate very different but effective approaches to scientists' involvement in education and outreach projects. 
In the first case, we will chronicle how a team of biologists and oceanographers in the Rutgers University Coastal Ocean Observation Laboratory (or COOLroom) developed the education and outreach capacity to serve thousands of boaters, fishermen, and tourists daily with real-time data products from experimental coastal observing systems. We also will touch on how scientists and educators at IMCS leveraged additional grants to support the translation of data and information from the coastal observatories into an instructional product called COOL Classroom, usable by educators and the public. This case study will show how MACOSEE is striving to use observing systems to provide the scientific backbone for an integrated program of science and education that improves user access to, and understanding of, modern ocean science and how it affects our daily lives. In the second case, we will show how Rutgers scientists are working with print media to support education and outreach. We will tell the story of how a small newspaper pilot project grew into a university-wide mechanism for scientists to reach half a million newspaper readers at minimal cost and time investment to the scientist.

  16. Bayesian Inference: with ecological applications

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management, and environmental studies, as well as to students in advanced undergraduate statistics. It opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
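
    As a minimal illustration of the kind of updating Bayesian inference performs (this toy conjugate Beta-binomial survival example and its numbers are my own invention, far simpler than the hierarchical models the book evaluates):

```python
# Conjugate Beta-binomial update: a Beta(a, b) prior on a probability,
# combined with k successes in n trials, gives a Beta(a + k, b + n - k) posterior.
def beta_binomial_posterior(a, b, k, n):
    a_post = a + k
    b_post = b + (n - k)
    mean = a_post / (a_post + b_post)   # posterior mean of the probability
    return a_post, b_post, mean

# Hypothetical scenario: flat Beta(1, 1) prior on a survival probability,
# 18 survivors observed out of 24 tagged animals.
a_post, b_post, mean = beta_binomial_posterior(1, 1, 18, 24)
# posterior is Beta(19, 7) with mean 19/26
```

    Realistic ecological models rarely admit such closed forms, which is why the software the book discusses relies on general-purpose computational methods instead.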

  17. Plant metabolic modeling: achieving new insight into metabolism and metabolic engineering.

    PubMed

    Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk

    2014-10-01

    Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. © 2014 American Society of Plant Biologists. All rights reserved.

  18. Identification and restoration in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, Alain; Xu, Chengqi; Haeberle, Olivier; Hueber, Nicolas; Malfara, R.; Colicchio, B.; Jacquey, Serge

    2004-06-01

    3-D optical fluorescence microscopy has become an efficient tool for volumetric investigation of living biological samples. The 3-D data can be acquired by optical sectioning microscopy, performed by axial stepping of the object relative to the objective. For any instrument, each recorded image can be described by a convolution equation between the original object and the point spread function (PSF) of the acquisition system. To assess performance and ensure data reproducibility, as for any 3-D quantitative analysis, system identification is mandatory. The PSF characterizes the image acquisition system; it can be computed or measured experimentally. Statistical tools and Zernike moments are shown to be appropriate and complementary for describing a 3-D system PSF and for quantifying the variation of the PSF as a function of the optical parameters. Some critical experimental parameters can be identified with these tools, which helps biologists define an acquisition protocol that optimizes the use of the system. Reducing out-of-focus light is the central task of 3-D microscopy; it is carried out computationally by deconvolution. Pre-filtering the images improves the stability of deconvolution results, making them less dependent on the regularization parameter; this helps biologists use the restoration process.
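
    The convolution model of image formation can be sketched in a toy 1-D form (the Gaussian PSF, the point sources, and all sizes here are illustrative assumptions; real acquisitions are 3-D stacks):

```python
import numpy as np

# Toy 1-D imaging model: recorded image = true object convolved with the PSF.
obj = np.zeros(64)
obj[20] = 1.0                        # a bright point source
obj[40] = 0.5                        # a dimmer point source

x = np.arange(-5, 6)
psf = np.exp(-(x / 1.5) ** 2)        # Gaussian kernel standing in for the system PSF
psf /= psf.sum()                     # normalize so total intensity is preserved

image = np.convolve(obj, psf, mode="same")  # what the microscope would record
```

    Deconvolution is the inverse problem: recovering an estimate of `obj` from `image` given the PSF, which is ill-posed and hence needs the regularization the abstract refers to.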

  19. The IT in Secondary Science Book. A Compendium of Ideas for Using Computers and Teaching Science.

    ERIC Educational Resources Information Center

    Frost, Roger

    Scientists need to measure and communicate, to handle information, and to model ideas. In essence, they need to process information. Young scientists have the same needs. Computers have become a tremendously important aid to the processing of information through database use, graphing, and modeling, and also in the collection of information…

  20. 75 FR 64996 - Takes of Marine Mammals Incidental to Specified Activities; Marine Geophysical Survey in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... cruises. A laptop computer is located on the observer platform for ease of data entry. The computer is... lines, the receiving systems will receive the returning acoustic signals. The study (e.g., equipment...-board assistance by the scientists who have proposed the study. The Chief Scientist is Dr. Franco...

  1. Life Beyond Earth and the Evolutionary Synthesis

    NASA Astrophysics Data System (ADS)

    Vakoch, Douglas A.

    For many astronomers, the progressive development of life has been seen as a natural occurrence given proper environmental conditions on a planet: even though such beings would not be identical to humans, there would be significant parallels. A striking contrast is seen in writings of nonphysical scientists, who have held more widely differing views. But within this diversity, reasons for differences become more apparent when we see how views about extraterrestrials can be related to the differential emphasis placed on modern evolutionary theory by scientists of various disciplines. One clue to understanding the differences between the biologists, paleontologists, and anthropologists who speculated on extraterrestrials is suggested by noting who wrote on the subject. Given the relatively small number of commentators on the topic, it seems more than coincidental that four of the major contributors to the evolutionary synthesis in the 1930s and 1940s are among them. Upon closer examination it is evident that the exobiological arguments of Theodosius Dobzhansky and George Gaylord Simpson and, less directly, of H. J. Muller and Ernst Mayr are all related to their earlier work in formulating synthetic evolution. By examining the variety of views held by nonphysical scientists, we can see that there were significant disagreements between them about evolution into the 1960s. By the mid-1980s, many believed that "higher" life, particularly intelligent life, probably occurs quite infrequently in the universe; nevertheless, some held out the possibility that convergence of intelligence could occur across worlds. Regardless of the final conclusions these scientists reached about the likely prevalence of extraterrestrial intelligence, the use of evolutionary arguments to support their positions became increasingly common.

  2. Marine molecular biology: an emerging field of biological sciences.

    PubMed

    Thakur, Narsinh L; Jain, Roopesh; Natalio, Filipe; Hamer, Bojan; Thakur, Archana N; Müller, Werner E G

    2008-01-01

    An appreciation of the potential applications of molecular biology is of growing importance in many areas of the life sciences, including marine biology. During the past two decades, the development of sophisticated molecular technologies and instruments for biomedical research has resulted in significant advances in the biological sciences. However, the value of molecular techniques for addressing problems in marine biology has only recently begun to be appreciated. It has been proven that the exploitation of molecular biological techniques will allow difficult research questions about marine organisms and ocean processes to be addressed. Marine molecular biology is a discipline that strives to define and solve the problems regarding the sustainable exploration of marine life for human health and welfare, through cooperation between scientists working in the marine biology, molecular biology, microbiology, and chemistry disciplines. Several success stories of the application of molecular techniques in the field of marine biology are guiding further research in this area. This review discusses different molecular techniques that have applications in marine microbiology, marine invertebrate biology, marine ecology, marine natural products, material sciences, fisheries, conservation, and bio-invasion. In summary, if marine biologists and molecular biologists continue to work towards a strong partnership during the next decade and recognize the intellectual and technological advantages and benefits of such a partnership, an exciting new frontier of marine molecular biology will emerge.

  3. COV2HTML: a visualization and analysis tool of bacterial next generation sequencing (NGS) data for postgenomics life scientists.

    PubMed

    Monot, Marc; Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-03-01

    COV2HTML is an interactive web interface, addressed to biologists, that allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first was an RNA-seq analysis of Campylobacter jejuni based on a comparison of two conditions with two replicates; using COV2HTML, we were able to recover 26 of the 27 genes highlighted in the publication. The second comprised stranded TSS and RNA-seq data sets on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is that it makes NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/ . This website is free and open to users without any login requirement.
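
    The conversion step such a tool performs can be illustrated with a minimal sketch: collapsing a heavy per-base coverage track into light per-gene summaries. The function name and file layout below are illustrative assumptions, not COV2HTML's actual code or format.

    ```python
    # Hypothetical sketch of reducing a per-base coverage track to
    # per-gene mean depths, the kind of "light" summary described above.

    def mean_gene_coverage(coverage, genes):
        """coverage: list of per-base read depths (0-based positions).
        genes: dict mapping gene name -> (start, end), half-open interval.
        Returns dict mapping gene name -> mean depth over its interval."""
        summary = {}
        for name, (start, end) in genes.items():
            depths = coverage[start:end]
            summary[name] = sum(depths) / len(depths) if depths else 0.0
        return summary

    # Example: a 10 bp "genome" with two annotated genes.
    depth_track = [5, 5, 5, 10, 10, 10, 0, 0, 2, 2]
    annotation = {"geneA": (0, 3), "geneB": (3, 6)}
    print(mean_gene_coverage(depth_track, annotation))
    # geneA averages 5.0, geneB averages 10.0
    ```

    Summaries like this are orders of magnitude smaller than the raw mapping files, which is what makes browser-based, real-time visualization feasible.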

  4. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

    PubMed Central

    Schmitt, Marco

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences, or with small circles of the interested, has some value in almost any aspect of social life. But what is the value, exactly, for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts, and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information-sharing practices of a scientific community. Additionally, we aim to provide a deeper understanding of the role and impact of altmetrics in computer science and take a look at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link-sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists' style of usage leans towards the information-oriented side and, to some degree, towards professional usage. Altmetrics are therefore of considerable use in analyzing computer science. PMID:28636619

  5. Analysis of severe storm data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1983-01-01

    The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System, developed by Atsuko Computing International (ACI) on the MASS HP-1000 computer system within the Systems Dynamics Laboratory of the Marshall Space Flight Center, is described. The MASS Data Management and Analysis System was successfully implemented and is utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. Scientists can interactively process various types of atmospheric data (sounding, single-level, grid, and image) using the MASS (AVE80) software, which shares common data and user inputs, thereby reducing overhead, optimizing execution time, and enhancing the flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 computer system, significantly enhancing the overall research environment.

  6. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 19: Computer and information technology and aerospace knowledge diffusion

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.

    1992-01-01

    To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.

  7. Introduction to the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, J. L. (Editor); Peters, D. J. (Editor)

    1985-01-01

    The Space Physics Analysis Network (SPAN) is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication with co-investigators and colleagues, and access to space science databases and computational facilities. SPAN utilizes up-to-date hardware and software for computer-to-computer communications, allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is not discipline or mission dependent, with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use is provided. SPAN is anticipated to grow rapidly over the next few years, not only in the number of network nodes but also in capability, as scientists become more proficient in the use of telescience and place greater demands on the network.

  8. Most Social Scientists Shun Free Use of Supercomputers.

    ERIC Educational Resources Information Center

    Kiernan, Vincent

    1998-01-01

    Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…

  9. The Unified English Braille Code: Examination by Science, Mathematics, and Computer Science Technical Expert Braille Readers

    ERIC Educational Resources Information Center

    Holbrook, M. Cay; MacCuspie, P. Ann

    2010-01-01

    Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…

  10. Text-mining and information-retrieval services for molecular biology

    PubMed Central

    Krallinger, Martin; Valencia, Alfonso

    2005-01-01

    Text-mining in molecular biology - defined as the automatic extraction of information about genes, proteins and their functional relationships from text documents - has emerged as a hybrid discipline on the edges of the fields of information science, bioinformatics and computational linguistics. A range of text-mining applications have been developed recently that will improve access to knowledge for biologists and database annotators. PMID:15998455
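    The entry point for most such applications is recognizing gene and protein mentions in text. As a minimal sketch, and not any system reviewed here, the simplest approach is dictionary-based matching against a gene lexicon; real tools layer normalization, disambiguation, and relationship extraction on top of this.

    ```python
    # Naive dictionary-based gene mention extraction: exact,
    # word-boundary matches of lexicon entries in an abstract.
    import re

    def find_gene_mentions(text, gene_lexicon):
        """Return (gene, character offset) pairs for lexicon hits,
        sorted by position in the text."""
        mentions = []
        for gene in gene_lexicon:
            for m in re.finditer(r"\b" + re.escape(gene) + r"\b", text):
                mentions.append((gene, m.start()))
        return sorted(mentions, key=lambda x: x[1])

    abstract = "TP53 interacts with MDM2 to regulate apoptosis."
    print(find_gene_mentions(abstract, {"TP53", "MDM2", "BRCA1"}))
    # [('TP53', 0), ('MDM2', 20)]
    ```

    Exact matching like this misses synonyms and typographic variants (e.g. "p53" vs "TP53"), which is precisely the gap the computational-linguistics side of the field addresses.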

  11. Conceptual Barriers to Progress Within Evolutionary Biology

    PubMed Central

    Laland, Kevin N.; Odling-Smee, John; Feldman, Marcus W.; Kendal, Jeremy

    2011-01-01

    In spite of its success, Neo-Darwinism is faced with major conceptual barriers to further progress, deriving directly from its metaphysical foundations. Most importantly, neo-Darwinism fails to recognize a fundamental cause of evolutionary change, “niche construction”. This failure restricts the generality of evolutionary theory, and introduces inaccuracies. It also hinders the integration of evolutionary biology with neighbouring disciplines, including ecosystem ecology, developmental biology, and the human sciences. Ecology is forced to become a divided discipline, developmental biology is stubbornly difficult to reconcile with evolutionary theory, and the majority of biologists and social scientists are still unhappy with evolutionary accounts of human behaviour. The incorporation of niche construction as both a cause and a product of evolution removes these disciplinary boundaries while greatly generalizing the explanatory power of evolutionary theory. PMID:21572912

  12. Science and Sentiment: Grinnell's Fact-Based Philosophy of Biodiversity Conservation.

    PubMed

    Shavit, Ayelet; Griesemer, James R

    2018-06-01

    At the beginning of the twentieth century, the biologist Joseph Grinnell made a distinction between science and sentiment for producing fact-based generalizations on how to conserve biodiversity. We are inspired by Grinnellian science, which successfully produced a century-long impact on studying and conserving biodiversity that runs orthogonal to some familiar philosophical distinctions such as fact versus value, emotion versus reason and basic versus applied science. According to Grinnell, unlike sentiment-based generalizations, a fact-based generalization traces its diverse commitments and thus becomes tractable for its audience. We argue that foregrounding tractability better explains Grinnell's practice in the context of his time as well as in the context of current discourse among scientists over the political "biases" of biodiversity research and its problem of "reproducibility."

  13. Conceptual Barriers to Progress Within Evolutionary Biology.

    PubMed

    Laland, Kevin N; Odling-Smee, John; Feldman, Marcus W; Kendal, Jeremy

    2009-08-01

    In spite of its success, Neo-Darwinism is faced with major conceptual barriers to further progress, deriving directly from its metaphysical foundations. Most importantly, neo-Darwinism fails to recognize a fundamental cause of evolutionary change, "niche construction". This failure restricts the generality of evolutionary theory, and introduces inaccuracies. It also hinders the integration of evolutionary biology with neighbouring disciplines, including ecosystem ecology, developmental biology, and the human sciences. Ecology is forced to become a divided discipline, developmental biology is stubbornly difficult to reconcile with evolutionary theory, and the majority of biologists and social scientists are still unhappy with evolutionary accounts of human behaviour. The incorporation of niche construction as both a cause and a product of evolution removes these disciplinary boundaries while greatly generalizing the explanatory power of evolutionary theory.

  14. Interactive text mining with Pipeline Pilot: a bibliographic web-based tool for PubMed.

    PubMed

    Vellay, S G P; Latimer, N E Miller; Paillard, G

    2009-06-01

    Text mining has become an integral part of all research in the medical field. Many text analysis software platforms support particular use cases and only those. We show an example of a bibliographic tool that can be used to support virtually any use case in an agile manner. Here we focus on a Pipeline Pilot web-based application that interactively analyzes and reports on PubMed search results. This will be of interest to any scientist to help identify the most relevant papers in a topical area more quickly and to evaluate the results of query refinement. Links with Entrez databases help both the biologist and the chemist alike. We illustrate this application with Leishmaniasis, a neglected tropical disease, as a case study.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This symposium included five sessions. Session I dealt with the technology for contending with harmful effluents, primarily from coal conversion processes. Session II addressed the need for the systematic application of existing capabilities to the collection and characterization of materials of importance to the life scientists. Session III had the underlying theme of health effects research: biologists, chemists, and technologists working together to confront the problems of the emerging industries. Session IV provided the most recent data in the areas of atmospheric, solid, and liquid releases. Session V dealt with effects on humans and on those people who may potentially be affected by the toxic materials produced. In summary, the sessions were: technology, chemical characterization, biological effects, environmental and ecological effects, and occupational health effects. 29 pages were included.

  16. Meet EPA Scientist Valerie Zartarian, Ph.D.

    EPA Pesticide Factsheets

    Senior exposure scientist and research environmental engineer Valerie Zartarian, Ph.D. helps build computer models and other tools that advance our understanding of how people interact with chemicals.

  17. Hot, Hot, Hot Computer Careers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1988-01-01

    Discusses the increasing need for electrical, electronic, and computer engineers; and scientists. Provides current status of the computer industry and average salaries. Considers computer chip manufacture and the current chip shortage. (MVL)

  18. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
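
    The end product of such a workflow can be sketched independently of the grid machinery: merging per-sample gene counts into one matrix with genes as rows and samples as columns. This is an illustrative sketch of the data structure, not OSG-GEM's actual code.

    ```python
    # Assemble a gene expression matrix (GEM) from per-sample count
    # dictionaries, filling genes absent from a sample with 0.

    def build_gem(per_sample_counts):
        """per_sample_counts: dict mapping sample name -> {gene: count}.
        Returns (header, rows) for a tab-delimited matrix."""
        samples = sorted(per_sample_counts)
        genes = sorted({g for counts in per_sample_counts.values() for g in counts})
        header = ["gene"] + samples
        rows = [[g] + [per_sample_counts[s].get(g, 0) for s in samples]
                for g in genes]
        return header, rows

    counts = {
        "sample1": {"geneA": 12, "geneB": 3},
        "sample2": {"geneA": 7, "geneC": 1},
    }
    header, rows = build_gem(counts)
    print(header)  # ['gene', 'sample1', 'sample2']
    ```

    The computationally heavy part, of course, is upstream: mapping millions of paired-end reads per sample to the reference genome to obtain the counts in the first place, which is what the OSG's distributed resources are used for.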

  19. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  1. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, which provide orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently): a true computing jungle.

  2. Hope or Hype? What is Next for Biofuels? (LBNL Science at the Theater)

    ScienceCinema

    Keasling, Jay; Bristow, Jim; Tringe, Susannah Green

    2017-12-09

    Science at the Theater: From the sun to your gas tank: A new breed of biofuels may help solve the global energy challenge and reduce the impact of fossil fuels on global warming. KTVU Channel 2 health and science editor John Fowler will moderate a panel of Lawrence Berkeley National Laboratory scientists who are developing ways to convert the solar energy stored in plants into liquid fuels. Jay Keasling is one of the foremost authorities in the field of synthetic biology. He is applying this research toward the production of advanced carbon-neutral biofuels that can replace gasoline on a gallon-for-gallon basis. Keasling is Berkeley Lab's Acting Deputy Director and the Chief Executive Officer of the U.S. Department of Energy's Joint BioEnergy Institute. Jim Bristow is deputy director of programs for the U.S. Department of Energy Joint Genome Institute (JGI), a national user facility in Walnut Creek, CA. He developed and implemented JGI's Community Sequencing Program, which provides large-scale DNA sequencing and analysis to advance genomics related to bioenergy and environmental characterization and cleanup. Susannah Green Tringe is a computational biologist with the U.S. Department of Energy Joint Genome Institute (JGI). She helped pioneer the field of metagenomics, a strategy for isolating, sequencing, and characterizing DNA extracted directly from environmental samples, such as the contents of the termite gut, which yielded enzymes responsible for the breakdown of wood into fuel.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lapidus, Alla L.

    From the time its role in heredity was discovered, DNA has generated interest among scientists from different fields of knowledge: physicists have studied the three-dimensional structure of the DNA molecule, biologists have tried to decode the secrets of life hidden within these long molecules, and technologists have invented and improved methods of DNA analysis. The analysis of the nucleotide sequence of DNA occupies a special place among the methods developed. Thanks to the variety of sequencing technologies available, the process of decoding the sequence of genomic DNA (or whole genome sequencing) has become robust and inexpensive. Meanwhile, the assembly of whole genome sequences remains a challenging task. In addition to the need to assemble millions of DNA fragments of different lengths (from 35 bp (Solexa) to 800 bp (Sanger)), great interest in the analysis of microbial communities (metagenomes) of different complexities raises new problems and pushes new requirements for sequence assembly tools to the forefront. The genome assembly process can be divided into two steps: draft assembly and assembly improvement (finishing). Despite the fact that automatically performed assembly (or draft assembly) is capable of covering up to 98% of the genome, in most cases it still contains incorrectly assembled reads. The error rate of the consensus sequence produced at this stage is about 1 error/2,000 bp. A finished genome represents a genome assembly of much higher accuracy (with no gaps or incorrectly assembled regions) and quality (~1 error/10,000 bp), validated through a number of computer and laboratory experiments.
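
    The error rates quoted above imply very different expected error counts at whole-genome scale. A quick back-of-the-envelope check (the rates are from the abstract; the genome size is an arbitrary example, not from the source):

    ```python
    # Expected consensus errors = genome length x per-base error rate.

    def expected_errors(genome_bp, error_rate_per_bp):
        return genome_bp * error_rate_per_bp

    genome = 4_000_000  # a typical ~4 Mbp bacterial genome (example value)
    draft = expected_errors(genome, 1 / 2_000)      # draft: ~1 error/2,000 bp
    finished = expected_errors(genome, 1 / 10_000)  # finished: ~1 error/10,000 bp
    print(draft, finished)  # 2000.0 vs 400.0 expected errors
    ```

    That is thousands of residual errors in a draft assembly versus hundreds after finishing, which is why the finishing step, despite its cost, remains part of the process for reference-quality genomes.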

  4. Microscopic Observation of Self-Propagation of Calcifying Nanoparticles (Nanobacteria)

    NASA Technical Reports Server (NTRS)

    Mathew, Grace; McKay, David S.; Ciftcioglu, Neva

    2007-01-01

    Biologists typically define living organisms as carbon- and water-based cellular forms with "self-replication" as the fundamental trait of the life process. However, this standard dictionary definition of life does not help scientists to categorize self-replicators like viruses, prions, proteons, and artificial life. CNP, also named nanobacteria, were discovered in the early 1990s as roughly 100-nanometer-sized bacteria-like particles with unique apatite mineral shells around them, and were found to be associated with diseases related to pathological calcification. Although CNP have been isolated and cultured from mammalian blood and diseased calcified tissues, and their biomineralizing properties are well established, their biological nature and self-replicating capability have always been severely challenged. The terms "self-replication", "self-assembly", and "self-propagation" have been widely used for all systems, including nanomachines, crystals, computer viruses, and memes. In a simple taxonomy, all biological and non-biological "self-replicators" have been classified as "living" or "nonliving" based on the properties of the systems and the amount of support they require to self-replicate. To enhance our understanding of the self-replicating nature of CNP, we investigated their growth in specific culture conditions using a conventional inverted light microscope and BioStation IM, Nikon's latest time-lapse imaging system. Their morphological structure was examined using scanning (SEM) and transmission (TEM) electron microscopy. The present study, in conjunction with previous findings of metabolic activity, antibiotic sensitivity, antibody specificity, morphological aspects, and infectivity, concomitantly validates CNP as living self-replicators.

  5. Bat Biology, Genomes, and the Bat1K Project: To Generate Chromosome-Level Genomes for All Living Bat Species.

    PubMed

    Teeling, Emma C; Vernes, Sonja C; Dávalos, Liliana M; Ray, David A; Gilbert, M Thomas P; Myers, Eugene

    2018-02-15

    Bats are unique among mammals, possessing some of the rarest mammalian adaptations, including true self-powered flight, laryngeal echolocation, exceptional longevity, unique immunity, contracted genomes, and vocal learning. They provide key ecosystem services, pollinating tropical plants, dispersing seeds, and controlling insect pest populations, thus driving healthy ecosystems. They account for more than 20% of all living mammalian diversity, and their crown-group evolutionary history dates back to the Eocene. Despite their great numbers and diversity, many species are threatened and endangered. Here we announce Bat1K, an initiative to sequence the genomes of all living bat species (n∼1,300) to chromosome-level assembly. The Bat1K genome consortium unites bat biologists (>148 members as of writing), computational scientists, conservation organizations, genome technologists, and any interested individuals committed to a better understanding of the genetic and evolutionary mechanisms that underlie the unique adaptations of bats. Our aim is to catalog the unique genetic diversity present in all living bats to better understand the molecular basis of their unique adaptations; uncover their evolutionary history; link genotype with phenotype; and ultimately better understand, promote, and conserve bats. Here we review the unique adaptations of bats and highlight how chromosome-level genome assemblies can uncover the molecular basis of these traits. We present a novel sequencing and assembly strategy and review the striking societal and scientific benefits that will result from the Bat1K initiative.

  6. Of the Helmholtz Club, South-Californian seedbed for visual and cognitive neuroscience, and its patron Francis Crick

    PubMed Central

    Aicardi, Christine

    2014-01-01

    Taking up the view that semi-institutional gatherings such as clubs, societies, research schools, have been instrumental in creating sheltered spaces from which many a 20th-century project-driven interdisciplinary research programme could develop and become established within the institutions of science, the paper explores the history of one such gathering from its inception in the early 1980s into the 2000s, the Helmholtz Club, which brought together scientists from such various research fields as neuroanatomy, neurophysiology, psychophysics, computer science and engineering, who all had an interest in the study of the visual system and of higher cognitive functions relying on visual perception such as visual consciousness. It argues that British molecular biologist turned South Californian neuroscientist Francis Crick had an early and lasting influence over the Helmholtz Club of which he was a founding pillar, and that from its inception, the club served as a constitutive element in his long-term plans for a neuroscience of vision and of cognition. Further, it argues that in this role, the Helmholtz Club served many purposes, the primary of which was to be a social forum for interdisciplinary discussion, where ‘discussion’ was not mere talk but was imbued with an epistemic value and as such, carefully cultivated. Finally, it questions what counts as ‘doing science’ and in turn, definitions of success and failure—and provides some material evidence towards re-appraising the successfulness of Crick’s contribution to the neurosciences. PMID:24384229

  7. What Neural Substrates Trigger the Adept Scientific Pattern Discovery by Biologists?

    ERIC Educational Resources Information Center

    Lee, Jun-Ki; Kwon, Yong-Ju

    2011-01-01

    This study investigated the neural correlates of experts and novices during biological object pattern detection using an fMRI approach in order to reveal the neural correlates of a biologist's superior pattern discovery ability. Sixteen healthy male participants (8 biologists and 8 non-biologists) volunteered for the study. Participants were shown…

  8. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  9. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  10. A signaling visualization toolkit to support rational design of combination therapies and biomarker discovery: SiViT.

    PubMed

    Bown, James L; Shovman, Mark; Robertson, Paul; Boiko, Andrei; Goltsov, Alexey; Mullen, Peter; Harrison, David J

    2017-05-02

    Targeted cancer therapy aims to disrupt aberrant cellular signalling pathways. Biomarkers are surrogates of pathway state, but there is limited success in translating candidate biomarkers to clinical practice due to the intrinsic complexity of pathway networks. Systems biology approaches afford better understanding of complex, dynamical interactions in signalling pathways targeted by anticancer drugs. However, adoption of dynamical modelling by clinicians and biologists is impeded by model inaccessibility. Drawing on computer games technology, we present a novel visualization toolkit, SiViT, that converts systems biology models of cancer cell signalling into interactive simulations that can be used without specialist computational expertise. SiViT allows clinicians and biologists to directly introduce for example loss of function mutations and specific inhibitors. SiViT animates the effects of these introductions on pathway dynamics, suggesting further experiments and assessing candidate biomarker effectiveness. In a systems biology model of Her2 signalling we experimentally validated predictions using SiViT, revealing the dynamics of biomarkers of drug resistance and highlighting the role of pathway crosstalk. No model is ever complete: the iteration of real data and simulation facilitates continued evolution of more accurate, useful models. SiViT will make accessible libraries of models to support preclinical research, combinatorial strategy design and biomarker discovery.
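
    The kind of pathway dynamics such a tool animates can be sketched in a few lines. This toy model is purely illustrative and is not SiViT or the Her2 model from the paper: a single species is activated at a rate reduced by an inhibitor, and we compare steady-state activity with and without treatment.

    ```python
    # Toy one-species signalling model integrated with Euler's method:
    #   dA/dt = k_act * (1 - inhibitor) * (1 - A) - k_deact * A
    # where A is the active fraction of the species.

    def simulate(k_act, k_deact, inhibitor, steps=1000, dt=0.01):
        """Return the (near) steady-state active fraction A in [0, 1]."""
        a = 0.0
        k = k_act * (1.0 - inhibitor)
        for _ in range(steps):
            a += dt * (k * (1.0 - a) - k_deact * a)
        return a

    baseline = simulate(k_act=1.0, k_deact=0.5, inhibitor=0.0)  # ~0.67 active
    treated = simulate(k_act=1.0, k_deact=0.5, inhibitor=0.8)   # ~0.29 active
    ```

    Even this toy version shows the qualitative behavior a biomarker reports on, namely how much residual pathway activity survives inhibition; real models couple many such equations, which is where interactive visualization earns its keep.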

  11. Algorithms in nature: the convergence of systems biology and computational thinking

    PubMed Central

    Navlakha, Saket; Bar-Joseph, Ziv

    2011-01-01

    Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future. PMID:22068329

  12. The impact of landsat satellite monitoring on conservation biology.

    PubMed

    Leimgruber, Peter; Christen, Catherine A; Laborderie, Alison

    2005-07-01

    Landsat 7's recent malfunctioning will result in significant gaps in long-term satellite monitoring of Earth, affecting not only the research of the Earth science community but also conservation users of these data. To determine whether or how important Landsat monitoring is for conservation and natural resource management, we reviewed the Landsat program's history with special emphasis on the development of user groups. We also conducted a bibliographic search to determine the extent to which conservation research has been based on Landsat data. Conservation biologists were not an early user group of Landsat data because a) biologists lacked technical capacity--computers and software--to analyze these data; b) Landsat's 1980s commercialization rendered images too costly for biologists' budgets; and c) the broad-scale disciplines of conservation biology and landscape ecology did not develop until the mid-to-late 1980s. All these conditions had changed by the 1990s and Landsat imagery became an important tool for conservation biology. Satellite monitoring and Landsat continuity are mandated by the Land Remote Sensing Act of 1992. This legislation leaves open commercial options. However, past experiments with commercial operations were neither viable nor economical, and severely reduced the quality of monitoring, archiving and data access for academia and the public. Future satellite monitoring programs are essential for conservation and natural resource management, must provide continuity with Landsat, and should be government operated.

  13. NUCLEAR ESPIONAGE: Report Details Spying on Touring Scientists.

    PubMed

    Malakoff, D

    2000-06-30

    A congressional report released this week details dozens of sometimes clumsy attempts by foreign agents to obtain nuclear secrets from U.S. nuclear scientists traveling abroad, ranging from offering scientists prostitutes to prying off the backs of their laptop computers. The report highlights the need to better prepare traveling researchers to safeguard secrets and resist such temptations, say the two lawmakers who requested the report and officials at the Department of Energy, which employs the scientists.

  14. Keeping an Eye on the Prize

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A U

    2007-02-06

    Setting performance goals is part of the business plan for almost every company. The same is true in the world of supercomputers. Ten years ago, the Department of Energy (DOE) launched the Accelerated Strategic Computing Initiative (ASCI) to help ensure the safety and reliability of the nation's nuclear weapons stockpile without nuclear testing. ASCI, which is now called the Advanced Simulation and Computing (ASC) Program and is managed by DOE's National Nuclear Security Administration (NNSA), set an initial 10-year goal to obtain computers that could process up to 100 trillion floating-point operations per second (teraflops). Many computer experts thought the goal was overly ambitious, but the program's results have proved them wrong. Last November, a Livermore-IBM team received the 2005 Gordon Bell Prize for achieving more than 100 teraflops while modeling the pressure-induced solidification of molten metal. The prestigious prize, which is named for a founding father of supercomputing, is awarded each year at the Supercomputing Conference to innovators who advance high-performance computing. Recipients for the 2005 prize included six Livermore scientists--physicists Fred Streitz, James Glosli, and Mehul Patel and computer scientists Bor Chan, Robert Yates, and Bronis de Supinski--as well as IBM researchers James Sexton and John Gunnels. This team produced the first atomic-scale model of metal solidification from the liquid phase with results that were independent of system size. The record-setting calculation used Livermore's domain decomposition molecular-dynamics (ddcMD) code running on BlueGene/L, a supercomputer developed by IBM in partnership with the ASC Program. BlueGene/L reached 280.6 teraflops on the Linpack benchmark, the industry standard used to measure computing speed. As a result, it ranks first on the list of Top500 Supercomputer Sites released in November 2005.
To evaluate the performance of nuclear weapons systems, scientists must understand how materials behave under extreme conditions. Because experiments at high pressures and temperatures are often difficult or impossible to conduct, scientists rely on computer models that have been validated with obtainable data. Of particular interest to weapons scientists is the solidification of metals. "To predict the performance of aging nuclear weapons, we need detailed information on a material's phase transitions," says Streitz, who leads the Livermore-IBM team. For example, scientists want to know what happens to a metal as it changes from molten liquid to a solid and how that transition affects the material's characteristics, such as its strength.

  15. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  16. Characterization of real-time computers

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.

    1984-01-01

    A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.
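    As a hedged illustration of the idea, not the report's exact formulation, a response-time-based performance measure can be sketched as a cost function of controller delay that diverges at the hard deadline of the controlled process. The quadratic cost model and all numbers below are assumptions.

```python
# Illustrative sketch: comparing real-time computers by the control cost
# their response times induce, rather than by raw throughput.

def control_cost(response_time_s, deadline_s):
    """Extra control cost incurred by controller delay.

    Cost grows quadratically with delay and is treated as catastrophic
    (infinite) past the hard deadline of the controlled process.
    """
    if response_time_s >= deadline_s:
        return float("inf")  # dynamic failure: deadline missed
    return (response_time_s / deadline_s) ** 2

def expected_cost(response_times, deadline_s):
    """Average cost over measured response times: a single figure of merit
    that permits an objective comparison of rival computer systems."""
    costs = [control_cost(t, deadline_s) for t in response_times]
    return sum(costs) / len(costs)

# Two hypothetical controllers with a 10 ms hard deadline:
fast = [0.002, 0.003, 0.004]    # seconds; always well inside the deadline
bursty = [0.001, 0.001, 0.012]  # occasionally misses the deadline
print(expected_cost(fast, 0.010))    # small finite cost
print(expected_cost(bursty, 0.010))  # inf: dynamic failure dominates
```

    The point of such a measure, echoing the abstract, is that it is congruent to the application (it diverges exactly when the controlled process fails) and is experimentally determinable from measured response times.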

  17. A semantic problem solving environment for integrative parasite research: identification of intervention targets for Trypanosoma cruzi.

    PubMed

    Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P

    2012-01-01

    Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Framework (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results.
The SPSE helps parasitologists leverage the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques, while keeping the increase in their workload minimal.
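    The flavor of such a unified query can be sketched without any Semantic Web machinery. The toy below merges triples from an "internal" and an "external" source and asks a knockout-target-style question over the combined graph; all gene names, predicates, and selection criteria are hypothetical, and real SPSE queries run over OWL/RDF data via Cuebee rather than Python lists.

```python
# Toy triple store illustrating a unified query across integrated sources.

INTERNAL = [  # e.g. parsed from lab spreadsheets or flat files
    ("geneA", "expressed_in_stage", "amastigote"),
    ("geneB", "expressed_in_stage", "epimastigote"),
]
EXTERNAL = [  # e.g. fetched from a public parasite database
    ("geneA", "has_ortholog_in_human", "no"),
    ("geneB", "has_ortholog_in_human", "yes"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# The integration step merges both sources into one queryable graph.
graph = INTERNAL + EXTERNAL

# Hypothetical knockout criteria: expressed in the disease-relevant life
# stage AND lacking a human ortholog (less risk of cross-reactivity).
stage_hits = {s for s, _, _ in
              query(graph, predicate="expressed_in_stage", obj="amastigote")}
safe = {s for s, _, _ in
        query(graph, predicate="has_ortholog_in_human", obj="no")}
candidates = sorted(stage_hits & safe)
print(candidates)  # ['geneA']
```

    The design point mirrors the abstract: once heterogeneous sources are reconciled into one graph, a single declarative query can span data that previously lived in separate spreadsheets and databases.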

  18. The equally wonderful field: Ernst Mayr and organismic biology.

    PubMed

    Milam, Erika Lorraine

    2010-01-01

    Biologists in the 1960s witnessed a period of intense intra-disciplinary negotiations, especially the positioning of organismic biologists relative to molecular biologists. The perceived valorization of the physical sciences by "molecular" biologists became a catalyst creating a unified front of "organismic" biology that incorporated not just evolutionary biologists, but also students of animal behavior, ecology, systematics, botany - in short, almost any biological community that predominantly conducted their research in the field or museum and whose practitioners felt the pinch of the prestige and funding accruing to molecular biologists and biochemists. Ernst Mayr, Theodosius Dobzhansky, and George Gaylord Simpson took leading roles in defending alternatives to what they categorized as the mechanistic approach of chemistry and physics applied to living systems - the "equally wonderful field of organismic biology." Thus, it was through increasingly tense relations with molecular biology that organismic biologists cohered into a distinct community, with their own philosophical grounding, institutional security, and historical identity. Because this identity was based in large part on a fundamental rejection of the physical sciences as a desirable model within biology, organismic biologists succeeded in protecting the future of their field by emphasizing deep divisions that ran through the biological sciences as a whole.

  19. Comparative phylogeography clarifies the complexity and problems of continental distribution that drove A. R. Wallace to favor islands

    PubMed Central

    Riddle, Brett R.

    2016-01-01

    Deciphering the geographic context of diversification and distributional dynamics in continental biotas has long been an interest of biogeographers, ecologists, and evolutionary biologists. Thirty years ago, the approach now known as comparative phylogeography was introduced in a landmark study of a continental biota. Here, I use a set of 455 studies to explore the current scope of continental comparative phylogeography, including geographic, conceptual, temporal, ecological, and genomic attributes. Geographically, studies are more frequent in the northern hemisphere, but the south is catching up. Most studies focus on a Quaternary timeframe, but the Neogene is well represented. As such, explanations for geographic structure and history include geological and climatic events in Earth history, and responses include vicariance, dispersal, and range contraction-expansion into and out of refugia. Focal taxa are biased toward terrestrial or semiterrestrial vertebrates, although plants and invertebrates are well represented in some regions. The use of various kinds of nuclear DNA markers is increasing, as are multiple locus studies, but use of organelle DNA is not decreasing. Species distribution models are not yet widely incorporated into studies. In the future, continental comparative phylogeographers will continue to contribute to erosion of the simple vicariance vs. dispersal paradigm, including exposure of the widespread nature of temporal pseudocongruence and its implications for models of diversification; provide new templates for addressing a variety of ecological and evolutionary traits; and develop closer working relationships with earth scientists and biologists in a variety of disciplines. PMID:27432953

  20. Comparative phylogeography clarifies the complexity and problems of continental distribution that drove A. R. Wallace to favor islands.

    PubMed

    Riddle, Brett R

    2016-07-19

    Deciphering the geographic context of diversification and distributional dynamics in continental biotas has long been an interest of biogeographers, ecologists, and evolutionary biologists. Thirty years ago, the approach now known as comparative phylogeography was introduced in a landmark study of a continental biota. Here, I use a set of 455 studies to explore the current scope of continental comparative phylogeography, including geographic, conceptual, temporal, ecological, and genomic attributes. Geographically, studies are more frequent in the northern hemisphere, but the south is catching up. Most studies focus on a Quaternary timeframe, but the Neogene is well represented. As such, explanations for geographic structure and history include geological and climatic events in Earth history, and responses include vicariance, dispersal, and range contraction-expansion into and out of refugia. Focal taxa are biased toward terrestrial or semiterrestrial vertebrates, although plants and invertebrates are well represented in some regions. The use of various kinds of nuclear DNA markers is increasing, as are multiple locus studies, but use of organelle DNA is not decreasing. Species distribution models are not yet widely incorporated into studies. In the future, continental comparative phylogeographers will continue to contribute to erosion of the simple vicariance vs. dispersal paradigm, including exposure of the widespread nature of temporal pseudocongruence and its implications for models of diversification; provide new templates for addressing a variety of ecological and evolutionary traits; and develop closer working relationships with earth scientists and biologists in a variety of disciplines.

  1. Race in biology and anthropology: A study of college texts and professors

    NASA Astrophysics Data System (ADS)

    Lieberman, Leonard; Hampton, Raymond E.; Littlefield, Alice; Hallead, Glen

    Information about social issues is underemphasized in college science education. This article takes the race concept as an example of this neglect. We review the history of the race concept and report the current status of the concept in textbooks and among professors. Responses to surveys of faculty at Ph.D.-granting departments indicate that 67% of biologists accept the concept of biological races in the species Homo sapiens, while only 50% of physical anthropologists do so. Content analysis of college textbooks indicates a significant degree of change over time (1936-1984) in physical anthropology but a lesser degree in biology. We suggest several reasons for the dissimilarity in the two disciplines. We propose continued use of the concept for some infrahuman species, while abandoning its application to Homo sapiens. For those biologists and anthropologists who continue to use the concept, scientific accuracy can be achieved by the presentation in lecture and text of the following ideas: first, consensus among scientists on the race concept's utility and accuracy does not exist; second, there is more variation within than between so-called races; third, discordant gradations due to natural selection, drift, and interbreeding make consistent racial boundary lines impossible to identify; fourth, past use of the race concept has had harmful consequences; fifth, the most precise study of human hereditary variation maps one trait at a time; and sixth, racial labels are misleading, especially as most populations have a cultural designation.

  2. New protection initiatives announced for coral reefs

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    Off the coasts of some of the South Pacific's most idyllic-sounding atolls, Austin Bowden-Kerby has seen first-hand the heavy damage to coral reefs from dynamite and cyanide fishing. For instance, while snorkeling near Chuuk, an island in Micronesia, he has observed craters and rubble beds of coral, which locals have told him date to World War II ordnance.A marine biologist and project scientist for the Coral Gardens Initiative of the Foundation for the Peoples of the South Pacific, Bowden-Kerby has also identified what he says are some public health effects related to destroyed coral reefs and their dying fisheries. These problems include protein and vitamin A deficiency and blindness, all of which may—in some instances—be linked to poor nutrition resulting from lower reef fish consumption by islanders, according to Bowden-Kerby.

  3. The American Oystercatcher (Haematopus palliatus) Working Group: 15 years of collaborative focal species research and management

    USGS Publications Warehouse

    Simons, Theodore R.

    2017-01-01

    The American Oystercatcher (Haematopus palliatus) Working Group formed spontaneously in 2001 as coastal waterbird biologists recognized the potential for American Oystercatchers to serve as focal species for collaborative research and management. Accomplishments over the past 15 years include the establishment of rangewide surveys, color-banding protocols, mark-resight studies, a revision of the Birds of North America species account, and new mechanisms for sharing ideas and data. Collaborations among State, Federal, and private sector scientists, natural resource managers, and dedicated volunteers have provided insights into the biology and conservation of American Oystercatchers in the United States and abroad that would not have been possible without the relationships formed through the Working Group. These accomplishments illustrate how broad collaborative approaches and the engagement of the public are key elements of effective shorebird conservation programs.

  4. The recent evolution of the question "What is life"?

    PubMed

    Morange, Michel

    2012-01-01

    The question "What is life?" is absent from the writings of present-day biologists and scientists. However, an answer to this question, even if only partial, is needed for successful completion of projects in astrobiology and synthetic biology. The reasons for this absence are metaphysical, epistemological, and historical. No one has a full answer to this question, but there are many good reasons to keep posing it. Answers are no longer sought in the existence of forces or mechanisms specific to life. The secret of life has been unveiled and it is nothing other than physical chemistry. What remains to be understood is the way the characteristics of organisms have emerged and been combined within one unique "object." The answer to the question "What is life?" is now looked for in the scenario that generated life.

  5. Straight talk with... Martin Stratmann.

    PubMed

    Stratmann, Martin

    2014-07-01

    The 83 institutes and research facilities of the Max Planck Society, established in 1948, include some of the world's leading scholars in the life sciences, including 17 Nobel Prize winners, and publish 15,000 research papers annually. For the past 18 years, biologists have stood at the helm of the prestigious German organization. But last month, an electrochemist and materials scientist, Martin Stratmann, began a six-year term as president of the Munich-based society. Stratmann, who is 60, has served as director of the Max Planck Institute for Iron Research in Düsseldorf since 2000, where he helped develop self-healing coatings that can protect steels and other metals from rust. Stratmann spoke with David Levine about his vision for the Max Planck Society and about what the change of guard will mean for biomedical research. The conversation has been edited for clarity.

  6. Automating the application of smart materials for protein crystallization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khurshid, Sahir; Govada, Lata; EL-Sharif, Hazim F.

    2015-03-01

    The first semi-liquid, non-protein nucleating agent for automated protein crystallization trials is described. This ‘smart material’ is demonstrated to induce crystal growth and will provide a simple, cost-effective tool for scientists in academia and industry. The fabrication and validation of the first semi-liquid non-protein nucleating agent to be administered automatically to crystallization trials is reported. This research builds upon prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as ‘smart materials’) for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.

  7. Even conservation rules are made to be broken: implications for biodiversity.

    PubMed

    Robbins, Paul; McSweeney, Kendra; Waite, Thomas; Rice, Jennifer

    2006-02-01

    Despite efforts to enclose and control conservation zones around the world, direct human impacts in conservation areas continue, often resulting from clandestine violations of conservation rules through outright poaching, strategic agricultural encroachment, or noncompliance. Nevertheless, next to nothing is actually known about the spatially and temporally explicit patterns of anthropogenic disturbance resulting from such noncompliance. This article reviews current understandings of ecological disturbance and conservation noncompliance, concluding that differing forms of noncompliance hold differing implications for diversity. The authors suggest that forms of anthropogenic patchy disturbance resulting from violation may maintain, if not enhance, floral diversity. They therefore argue for extended empirical investigation of such activities and call for conservation biologists to work with social scientists to assess this conservation reality by analyzing how and when incomplete enforcement and rule-breaking drive ecological change.

  8. Materials for stem cell factories of the future

    NASA Astrophysics Data System (ADS)

    Celiz, Adam D.; Smith, James G. W.; Langer, Robert; Anderson, Daniel G.; Winkler, David A.; Barrett, David A.; Davies, Martyn C.; Young, Lorraine E.; Denning, Chris; Alexander, Morgan R.

    2014-06-01

    Polymeric substrates are being identified that could permit translation of human pluripotent stem cells from laboratory-based research to industrial-scale biomedicine. Well-defined materials are required to allow cell banking and to provide the raw material for reproducible differentiation into lineages for large-scale drug-screening programs and clinical use. Yet more than 1 billion cells for each patient are needed to replace losses during heart attack, multiple sclerosis and diabetes. Producing this number of cells is challenging, and a rethink of the current predominant cell-derived substrates is needed to provide technology that can be scaled to meet the needs of millions of patients a year. In this Review, we consider the role of materials discovery, an emerging area of materials chemistry that is in large part driven by the challenges posed by biologists to materials scientists.

  9. P2S--Coupled simulation with the Precipitation-Runoff Modeling System (PRMS) and the Stream Temperature Network (SNTemp) Models

    USGS Publications Warehouse

    Markstrom, Steven L.

    2012-01-01

    A software program, called P2S, has been developed which couples the daily stream temperature simulation capabilities of the U.S. Geological Survey Stream Network Temperature model with the watershed hydrology simulation capabilities of the U.S. Geological Survey Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a modular, deterministic, distributed-parameter, physical-process watershed model that simulates hydrologic response to various combinations of climate and land use. Stream Network Temperature was developed to help aquatic biologists and engineers predict the effects that changes in hydrology and energy have on water temperatures. P2S will allow scientists and watershed managers to evaluate the effects of historical climate and projected climate change, landscape evolution, and resource management scenarios on watershed hydrology and in-stream water temperature.
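    The coupling pattern P2S implements can be sketched as a daily loop in which the hydrology model's output feeds the stream-temperature model. The two step functions below are crude stand-ins for PRMS and SNTemp, with made-up coefficients; only the shape of the daily coupling is the point.

```python
# Toy sketch of a daily coupled hydrology/stream-temperature loop.

def hydrology_step(precip_mm):
    """Stand-in for a PRMS daily step: precipitation -> streamflow.
    Assumes (arbitrarily) that 60% of precipitation becomes runoff."""
    return 0.6 * precip_mm

def stream_temp_step(flow, air_temp_c):
    """Stand-in for an SNTemp daily step: higher flow damps the approach
    of water temperature toward air temperature (10 C baseflow assumed)."""
    damping = 1.0 / (1.0 + 0.05 * flow)
    return air_temp_c * (1.0 - damping) + 10.0 * damping

# Coupled daily loop, the essence of what a driver like P2S does:
precip = [5.0, 0.0, 20.0]   # mm/day
air = [18.0, 22.0, 15.0]    # deg C
temps = []
for p, a in zip(precip, air):
    flow = hydrology_step(p)           # hydrology first...
    temps.append(stream_temp_step(flow, a))  # ...then temperature uses it
print([round(t, 2) for t in temps])
```

    In the real system each model is far richer, but the one-way daily handoff (hydrology output as stream-temperature input) is the coupling the abstract describes.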

  10. Developing state-of-the-art Cosmology courses for undergraduate non-science students

    NASA Astrophysics Data System (ADS)

    Lopez-Aleman, Ramon

    2007-04-01

    All undergraduate students at the University of Puerto Rico, Rio Piedras are required to take a General Studies interdisciplinary science course as a requisite for graduation. We have successfully developed a new course for non-science majors that deals with current topics of interest, including Big Bang cosmology, the uses and misuses of the anthropic principle as a philosophical guide for scientists, dark energy and accelerated expansion, string theory and quantum gravity, and the current controversy of Intelligent Design vs Evolution by Natural Selection as explanations for the origins of life on Earth, intelligence and free will in sentient beings. The course was designed with help of philosophers, neuroscientists, biologists and physicists to present science as interesting, exciting, and socially useful sets of "stories" to people who usually dislike and misunderstand traditional science courses.

  11. Overcoming immunological barriers in regenerative medicine.

    PubMed

    Zakrzewski, Johannes L; van den Brink, Marcel R M; Hubbell, Jeffrey A

    2014-08-01

    Regenerative therapies that use allogeneic cells are likely to encounter immunological barriers similar to those that occur with transplantation of solid organs and allogeneic hematopoietic stem cells (HSCs). Decades of experience in clinical transplantation hold valuable lessons for regenerative medicine, offering approaches for developing tolerance-induction treatments relevant to cell therapies. Outside the field of solid-organ and allogeneic HSC transplantation, new strategies are emerging for controlling the immune response, such as methods based on biomaterials or mimicry of antigen-specific peripheral tolerance. Novel biomaterials can alter the behavior of cells in tissue-engineered constructs and can blunt host immune responses to cells and biomaterial scaffolds. Approaches to suppress autoreactive immune cells may also be useful in regenerative medicine. The most innovative solutions will be developed through closer collaboration among stem cell biologists, transplantation immunologists and materials scientists.

  12. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.

  14. Nuclear energy in the service of biomedicine: the U.S. Atomic Energy Commission's radioisotope program, 1946-1950.

    PubMed

    Creager, Angela N H

    2006-01-01

    The widespread adoption of radioisotopes as tools in biomedical research and therapy became one of the major consequences of the "physicists' war" for postwar life science. Scientists in the Manhattan Project, as part of their efforts to advocate for civilian uses of atomic energy after the war, proposed using infrastructure from the wartime bomb project to develop a government-run radioisotope distribution program. After the Atomic Energy Bill was passed and before the Atomic Energy Commission (AEC) was formally established, the Manhattan Project began shipping isotopes from Oak Ridge. Scientists and physicians put these reactor-produced isotopes to many of the same uses that had been pioneered with cyclotron-generated radioisotopes in the 1930s and early 1940s. The majority of early AEC shipments were radioiodine and radiophosphorus, employed to evaluate thyroid function, diagnose medical disorders, and irradiate tumors. Both researchers and politicians lauded radioisotopes publicly for their potential in curing diseases, particularly cancer. However, isotopes proved less successful than anticipated in treating cancer and more successful in medical diagnostics. On the research side, reactor-generated radioisotopes equipped biologists with new tools to trace molecular transformations from metabolic pathways to ecosystems. The U.S. government's production and promotion of isotopes stimulated their consumption by scientists and physicians (both domestic and abroad), such that in the postwar period isotopes became routine elements of laboratory and clinical use. In the early postwar years, radioisotopes signified the government's commitment to harness the atom for peace, particularly through contributions to biology, medicine, and agriculture.

  15. Quantifying avian nest survival along an urbanization gradient using citizen- and scientist-generated data.

    PubMed

    Ryder, Thomas B; Reitsma, Robert; Evans, Brian; Marra, Peter P

    2010-03-01

    Despite the increasing pace of urbanization, little is known about the factors that limit bird populations (i.e., population-level processes) within the urban/suburban land-use matrix. Here, we report rates of nest survival within the matrix of an urban land-use gradient in the greater Washington, D.C., USA, area for five common songbirds using data collected by scientists and citizens as part of a project called Neighborhood Nestwatch. Using program MARK, we modeled the effects of species, urbanization at multiple spatial scales (canopy cover and impervious surface), and observer (citizen vs. scientist) on nest survival of four open-cup and one cavity-nesting species. In addition, artificial nests were used to determine the relative impacts of specific predators along the land-use gradient. Our results suggest that predation on nests within the land-use matrix declines with urbanization but that there are species-specific differences. Moreover, variation in nest survival among species was best explained by urbanization metrics measured at larger "neighborhood" spatial scales (e.g., 1000 m). Trends were supported by data from artificial nests and suggest that variable predator communities (avian vs. mammalian) are one possible mechanism to explain differential nest survival. In addition, we assessed the quality of citizen science data and show that citizens had no negative effect on nest survival and provided estimates of nest survival comparable to those of Smithsonian biologists. Although birds nesting within the urban matrix experienced higher nest survival, individuals also faced a multitude of other challenges such as contaminants and invasive species, all of which could reduce adult survival.
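The daily-nest-survival modeling described above is done in program MARK with far more elaborate covariate structures, but the underlying quantity can be illustrated with a simple Mayfield-style estimator. The exposure days, failure count, and nest period below are invented for illustration:

```python
# Mayfield estimator sketch: daily survival rate (DSR) = 1 - failures / exposure-days,
# and whole-period nest survival = DSR ** nest_period.
# All numbers are hypothetical, not from the study.
exposure_days = 500.0   # summed days that nests were under observation
failures = 25           # nests lost during those exposure days
nest_period = 26        # days from first egg to fledging (assumed)

dsr = 1 - failures / exposure_days
period_survival = dsr ** nest_period
print(f"DSR={dsr:.3f}, nest survival={period_survival:.3f}")
```

Raising the daily rate to the power of the nest period is what makes apparently small differences in DSR translate into large differences in overall nest success, which is why studies like this one model DSR rather than raw success rates.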

  16. Essential Autonomous Science Inference on Rovers (EASIR)

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; Shipman, Mark; Morris, Robert; Gazis, Paul; Pedersen, Liam

    2003-01-01

    Existing constraints on time, computational, and communication resources associated with Mars rover missions suggest that on-board science evaluation of sensor data can contribute to decreasing human-directed operational planning, optimizing returned science data volumes, and recognizing unique or novel data, all of which increase the scientific return from a mission. Many different levels of science autonomy exist, and each impacts the data collected and returned by, and the activities of, rovers. Several computational algorithms, designed to recognize objects of interest to geologists and biologists, are discussed. The algorithms represent various functions that produce scientific opinions, and several scenarios illustrate how these opinions can be used.

  17. Development and Evaluation of Strong-Campbell Interest Inventory Scales to Measure Interests of Military Occupational Specialties of the Marine Corps.

    DTIC Science & Technology

    1982-08-01

    though the two groups were different in terms of SCII scientific interests and academic orientation scores (the aviation supply sample scored higher on...51 Chemists/Physicists 50 MARINE OFFICERS-COMMUNICATION 49 MARINE OFFICERS-DATA SYSTEMS 48 Engineers 47 Biologists 46 Systems Analysts/Computer...Base (Scientific and Technical Information Office) Commander, Air Force Human Resources Laboratory, Lowry Air Force Base (Technical Training Branch

  18. The Fermilab Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    More than 4,000 scientists in 53 countries use Fermilab and its particle accelerators, detectors and computers for their research. That includes about 2,500 scientists from 223 U.S. institutions in 42 states, plus the District of Columbia and Puerto Rico.

  19. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaborative, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  20. Entropy and the Magic Flute

    NASA Astrophysics Data System (ADS)

    Morowitz, Harold J.

    1996-10-01

    Harold Morowitz has long been highly regarded both as an eminent scientist and as an accomplished science writer. The essays in The Wine of Life , his first collection, were hailed by C.P. Snow as "some of the wisest, wittiest and best informed I have ever read," and Carl Sagan called them "a delight to read." In later volumes he established a reputation for a wide-ranging intellect, an ability to see unexpected connections and draw striking parallels, and a talent for communicating scientific ideas with optimism and wit. With Entropy and the Magic Flute , Morowitz once again offers an appealing mix of brief reflections on everything from litmus paper to the hippopotamus to the sociology of Palo Alto coffee shops. Many of these pieces are appreciations of scientists that Morowitz holds in high regard, while others focus on health issues, such as America's obsession with cheese toppings. There is also a fascinating piece on the American Type Culture Collection, a zoo or warehouse for microbes that houses some 11,800 strains of bacteria, and over 3,000 specimens of protozoa, algae, plasmids, and oncogenes. Here then are over forty light, graceful essays in which one of our wisest experimental biologists comments on issues of science, technology, society, philosophy, and the arts.

  1. What concept analysis in philosophy of science should be (and why competing philosophical analyses of gene concepts cannot be tested by polling scientists).

    PubMed

    Waters, C Kenneth

    2004-01-01

    What should philosophers of science accomplish when they analyze scientific concepts and interpret scientific knowledge? What is concept analysis if it is not a description of the way scientists actually think? I investigate these questions by using Hans Reichenbach's account of the descriptive, critical, and advisory tasks of philosophy of science to examine Karola Stotz and Paul Griffiths' idea that poll-based methodologies can test philosophical analyses of scientific concepts. Using Reichenbach's account as a point of departure, I argue that philosophy of science should identify and clarify epistemic virtues and describe scientific knowledge in relation to these virtues. The role of concept analysis is to articulate scientific concepts in ways that help reveal epistemic virtues and limitations of particular sciences. This means an analysis of the gene concept(s) should help clarify the explanatory power and limitations of gene-based explanations, and should help account for the investigative utility and biases of gene-centered sciences. I argue that a philosophical analysis of gene concept(s) that helps achieve these critical aims should not be rejected on the basis of poll-based studies even if such studies could show that professional biologists don't actually use gene terminology in precise ways corresponding to the philosophical analysis.

  2. Meeting after meeting: 20 years of discoveries by the members of the Exocytosis-Endocytosis Club.

    PubMed

    Niedergang, Florence; Gasman, Stéphane; Vitale, Nicolas; Desnos, Claire; Lamaze, Christophe

    2017-09-01

    Twenty years ago, a group of French cell biologists merged two scientific clubs with the aim of bringing together researchers in the fields of Endocytosis and Exocytosis. The Exocytosis Club was founded in 1997 and held its first annual meeting in 1998. The Endocytosis Club held quarterly meetings from its founding in 1999. The first joint annual meeting of the Exocytosis-Endocytosis Club took place in Paris in April, 2001. What started as a modest gathering of enthusiastic scientists working in the field of cell trafficking has gone from strength to strength, rapidly becoming an unmissable yearly meeting, vividly demonstrating the high quality of science performed in our community and beyond. On the occasion of the 20th meeting of our club, we want to provide historic insight into the fields of exocytosis and endocytosis, and by extension, to subcellular trafficking, highlighting how French scientists have contributed to major advances in these fields. Today, the Exocytosis-Endocytosis Club represents a vibrant and friendly community that will hold its 20th meeting at the Presqu'Ile de Giens, near Toulon in the South of France, on May 11-13, 2017. © 2017 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.

  3. Evaluation of the instream flow incremental methodology by U.S. Fish and Wildlife Service field users

    USGS Publications Warehouse

    Armour, Carl L.; Taylor, Jonathan G.

    1991-01-01

    This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted usable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.

  4. Linking Advanced Visualization and MATLAB for the Analysis of 3D Gene Expression Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver; Keranen, Soile V.E.; Biggin, Mark

    Three-dimensional gene expression PointCloud data generated by the Berkeley Drosophila Transcription Network Project (BDTNP) provides quantitative information about the spatial and temporal expression of genes in early Drosophila embryos at cellular resolution. The BDTNP team visualizes and analyzes PointCloud data using the software application PointCloudXplore (PCX). To maximize the impact of novel, complex data sets, such as PointClouds, the data needs to be accessible to biologists and comprehensible to developers of analysis functions. We address this challenge by linking PCX and Matlab via a dedicated interface, thereby providing biologists seamless access to advanced data analysis functions and giving bioinformatics researchers the opportunity to integrate their analysis directly into the visualization application. To demonstrate the usefulness of this approach, we computationally model parts of the expression pattern of the gene even skipped using a genetic algorithm implemented in Matlab and integrated into PCX via our Matlab interface.
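The BDTNP fitting above was done with a genetic algorithm in Matlab via the PCX interface. As a language-neutral sketch of the general technique (not the BDTNP code), a toy GA that evolves a bit pattern toward a fixed target might look like this; the target, population size, and rates are all invented:

```python
import random

# Toy genetic algorithm: evolve a bit string toward TARGET,
# loosely analogous to fitting a spatial expression pattern.
# All parameters are illustrative assumptions.
random.seed(0)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(ind):
    # number of positions matching the target pattern
    return sum(a == b for a, b in zip(ind, TARGET))

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                      # elitist selection
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(parents, 2)
        cut = random.randrange(1, len(TARGET))
        child = p1[:cut] + p2[cut:]         # one-point crossover
        i = random.randrange(len(TARGET))
        if random.random() < 0.2:
            child[i] ^= 1                   # point mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))
```

Real applications replace the bit-match fitness with a distance between simulated and measured expression values, but the select/crossover/mutate loop is the same.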

  5. Metavisitor, a Suite of Galaxy Tools for Simple and Rapid Detection and Discovery of Viruses in Deep Sequence Data

    PubMed Central

    Vernick, Kenneth D.

    2017-01-01

    Metavisitor is a software package that allows biologists and clinicians without specialized bioinformatics expertise to detect and assemble viral genomes from deep sequence datasets. The package is composed of a set of modular bioinformatic tools and workflows that are implemented in the Galaxy framework. Using the graphical Galaxy workflow editor, users with minimal computational skills can use existing Metavisitor workflows or adapt them to suit specific needs by adding or modifying analysis modules. Metavisitor works with DNA, RNA or small RNA sequencing data over a range of read lengths and can use a combination of de novo and guided approaches to assemble genomes from sequencing reads. We show that the software has the potential for quick diagnosis as well as discovery of viruses from a vast array of organisms. Importantly, we provide here executable Metavisitor use cases, which increase the accessibility and transparency of the software, ultimately enabling biologists or clinicians to focus on biological or medical questions. PMID:28045932

  6. The BioCyc collection of microbial genomes and metabolic pathways.

    PubMed

    Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi

    2017-08-17

    BioCyc.org is a microbial genome Web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services and analysis software. Recent advances in BioCyc include an expansion in the content of BioCyc in terms of both the number of genomes and the types of information available for each genome; an expansion in the amount of curated content within BioCyc; and new developments in the BioCyc software tools including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. The RCSB protein data bank: integrative view of protein, gene and 3D structural information

    PubMed Central

    Rose, Peter W.; Prlić, Andreas; Altunkaya, Ali; Bi, Chunxiao; Bradley, Anthony R.; Christie, Cole H.; Costanzo, Luigi Di; Duarte, Jose M.; Dutta, Shuchismita; Feng, Zukang; Green, Rachel Kramer; Goodsell, David S.; Hudson, Brian; Kalro, Tara; Lowe, Robert; Peisach, Ezra; Randle, Christopher; Rose, Alexander S.; Shao, Chenghua; Tao, Yi-Ping; Valasatava, Yana; Voigt, Maria; Westbrook, John D.; Woo, Jesse; Yang, Huangwang; Young, Jasmine Y.; Zardecki, Christine; Berman, Helen M.; Burley, Stephen K.

    2017-01-01

    The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB, http://rcsb.org), the US data center for the global PDB archive, makes PDB data freely available to all users, from structural biologists to computational biologists and beyond. New tools and resources have been added to the RCSB PDB web portal in support of a ‘Structural View of Biology.’ Recent developments have improved the User experience, including the high-speed NGL Viewer that provides 3D molecular visualization in any web browser, improved support for data file download and enhanced organization of website pages for query, reporting and individual structure exploration. Structure validation information is now visible for all archival entries. PDB data have been integrated with external biological resources, including chromosomal position within the human genome; protein modifications; and metabolic pathways. PDB-101 educational materials have been reorganized into a searchable website and expanded to include new features such as the Geis Digital Archive. PMID:27794042

  8. DOSE: an R/Bioconductor package for disease ontology semantic and enrichment analysis.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Yan, Guang-Rong; He, Qing-Yu

    2015-02-15

    Disease ontology (DO) annotates human genes in the context of disease. DO is an important annotation resource for translating molecular findings from high-throughput data to clinical relevance. DOSE is an R package providing semantic similarity computations among DO terms and genes, allowing biologists to explore the similarities of diseases and of gene functions from a disease perspective. Enrichment analyses, including the hypergeometric model and gene set enrichment analysis, are also implemented to support discovering disease associations in high-throughput biological data. This allows biologists to verify disease relevance in a biological experiment and identify unexpected disease associations. Comparison among gene clusters is also supported. DOSE is released under the Artistic-2.0 License. The source code and documents are freely available through Bioconductor (http://www.bioconductor.org/packages/release/bioc/html/DOSE.html). Supplementary data are available at Bioinformatics online. gcyu@connect.hku.hk or tqyhe@jnu.edu.cn. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
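DOSE itself is an R/Bioconductor package; the hypergeometric enrichment model it mentions can be sketched independently of that API. In this sketch the background size, annotation count, and list sizes are invented numbers:

```python
from math import comb

# Hypergeometric enrichment sketch (all gene counts are invented):
# N genes in the background, K annotated to a disease term,
# n genes in the user's gene list, k of them carrying the annotation.
def enrichment_pvalue(k, N, K, n):
    """P(X >= k): chance of >= k annotated genes in a random list of n."""
    denom = comb(N, n)
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / denom

p = enrichment_pvalue(k=10, N=20000, K=100, n=500)
print(f"{p:.2e}")
```

With an expected overlap of only 500 * 100 / 20000 = 2.5 genes, observing 10 annotated genes yields a small p-value, which is what an enrichment analysis reports as evidence of a disease association.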

  9. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  10. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.

  11. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  12. Educational background and professional participation by federal wildlife biologists: Implications for science, management, and The Wildlife Society

    USGS Publications Warehouse

    Schmutz, Joel A.

    2002-01-01

    Over 2,000 people are employed in wildlife biology in the United States federal government. The size of this constituency motivated me to examine the amount of formal education federal biologists have received and the extent of continuing education they undertake by reading journals or attending scientific meetings. Most federal biologists who are members of The Wildlife Society (TWS) have a graduate degree. However, one-third have only a Bachelor of Science degree, despite the current trend toward hiring people with graduate degrees. Most federal biologists are not research biologists. Numbers of journals subscribed to was positively related to educational level. Less than one-third of all wildlife biologists employed by the United States Fish and Wildlife Service are members of TWS or subscribe to any of its journals. In contrast, the majority of presenters at the TWS 2000 Annual Conference were research biologists and members of TWS. The failure of many federal wildlife biologists to read scientific literature or attend professional meetings indicates a failure to promote the importance of continuing education in the federal workplace. I identify 2 potential adverse impacts of this failing: an inability to recognize important and relevant scientific contributions and an ineffectiveness in carrying out adaptive management.

  13. Visual Data Exploration and Analysis - Report on the Visualization Breakout Session of the SCaLeS Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Frank, Randy; Fulcomer, Sam

    Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas, from computer science to cognitive psychology and even art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--the common theme is the ability to present information visually that is rapidly assimilated by human observers, and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens or use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data that can be at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to ''see the unseeable.'' Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its ''own language,'' verbal and visual, used for communication. The visual language for depicting electrical circuits is much different than the visual language for depicting theoretical molecules or trends in the stock market. 
There is no ''one visualization tool'' that can serve as a panacea for all science disciplines. Instead, visualization researchers work hand in hand with domain scientists as part of the scientific research process to define, create, adapt and refine software that ''speaks the visual language'' of each scientific domain.

  14. Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.

  15. Boom. Bust. Build.

    ERIC Educational Resources Information Center

    Kite, Vance; Park, Soonhye

    2018-01-01

    In 2006 Jeanette Wing, a professor of computer science at Carnegie Mellon University, proposed computational thinking (CT) as a literacy just as important as reading, writing, and mathematics. Wing defined CT as a set of skills and strategies computer scientists use to solve complex, computational problems (Wing 2006). The computer science and…

  16. Workforce Retention Study in Support of the U.S. Army Aberdeen Test Center Human Capital Management Strategy

    DTIC Science & Technology

    2016-09-01

    Sciences Group 6% 1550s Computer Scientists Group 5% Other 1500s ORSA, Mathematics, & Statistics Group 3% 1600s Equipment & Facilities Group 4...Employee removal based on misconduct, delinquency, suitability, unsatisfactory performance, or failure to qualify for conversion to a career appointment...average of 10.4% in many areas, but over double the average for the 1550s (Computer Scientists) and other 1500s (ORSA, Mathematics, and Statistics). Also

  17. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  18. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  19. The philosophy of modelling or does the philosophy of biology have any use?

    PubMed

    Orzack, Steven Hecht

    2012-01-19

    Biologists in search of answers to real-world issues such as the ecological consequences of global warming, the design of species' conservation plans, understanding landscape dynamics and understanding gene expression make decisions constantly that are based on a 'philosophical' stance as to how to create and test explanations of an observed phenomenon. For better or for worse, some kind of philosophy is an integral part of the doing of biology. Given this, it is more important than ever to undertake a practical assessment of what philosophy does mean and should mean to biologists. Here, I address three questions: should biologists pay any attention to 'philosophy'; should biologists pay any attention to 'philosophy of biology'; and should biologists pay any attention to the philosophy of biology literature on modelling? I describe why the last question is easily answered affirmatively, with the proviso that the practical benefits to be gained by biologists from this literature will be directly proportional to the extent to which biologists understand 'philosophy' to be a part of biology, not apart from biology.

  20. The philosophy of modelling or does the philosophy of biology have any use?

    PubMed Central

    Orzack, Steven Hecht

    2012-01-01

    Biologists in search of answers to real-world issues such as the ecological consequences of global warming, the design of species' conservation plans, understanding landscape dynamics and understanding gene expression make decisions constantly that are based on a ‘philosophical’ stance as to how to create and test explanations of an observed phenomenon. For better or for worse, some kind of philosophy is an integral part of the doing of biology. Given this, it is more important than ever to undertake a practical assessment of what philosophy does mean and should mean to biologists. Here, I address three questions: should biologists pay any attention to ‘philosophy’; should biologists pay any attention to ‘philosophy of biology’; and should biologists pay any attention to the philosophy of biology literature on modelling? I describe why the last question is easily answered affirmatively, with the proviso that the practical benefits to be gained by biologists from this literature will be directly proportional to the extent to which biologists understand ‘philosophy’ to be a part of biology, not apart from biology. PMID:22144380

  1. Recent Advances and Issues in Computers. Oryx Frontiers of Science Series.

    ERIC Educational Resources Information Center

    Gay, Martin K.

    Discussing recent issues in computer science, this book contains 11 chapters covering: (1) developments that have the potential for changing the way computers operate, including microprocessors, mass storage systems, and computing environments; (2) the national computational grid for high-bandwidth, high-speed collaboration among scientists, and…

  2. The Development of a Biologist

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    2004-01-01

    John Tyler Bonner is a biologist whose writings remind the reader of how his perspective differs from that of many biologists. Bonner's views, his books, his areas of interest and research, and his life, much of which was spent studying slime molds, are described.

  3. Electronic Ecosystem.

    ERIC Educational Resources Information Center

    Travis, John

    1991-01-01

    A discipline in which scientists seek to simulate and synthesize lifelike behaviors within computers, chemical mixtures, and other media is discussed. A computer program with self-replicating digital "organisms" that evolve as they compete for computer time and memory is described. (KR)

  4. Chemistry Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Philip Morris research center scientists use a computer program called CECTRP, for Chemical Equilibrium Composition and Transport Properties, to gain insight into the behavior of atoms as they progress along the reaction pathway. Use of the program lets the scientist accurately predict the behavior of a given molecule or group of molecules. Computer-generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time, since experiments need be run only to validate the computer's prediction. Philip Morris estimates that, had CECTRP not been available, at least two man-years would have been required to develop a program to perform similar free-energy calculations.
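
    CECTRP itself is not publicly documented, but the free-energy calculations the record describes rest on a standard thermodynamic relation: the equilibrium constant of a reaction follows from its standard Gibbs free-energy change, K = exp(-ΔG°/RT). A minimal sketch of that relation (the function name and example values below are illustrative, not taken from CECTRP):

```python
import math

R = 8.314  # universal gas constant, J/(mol·K)

def equilibrium_constant(delta_g, temperature):
    """Equilibrium constant from the standard Gibbs free-energy change
    of reaction (delta_g in J/mol, temperature in K): K = exp(-ΔG°/(R·T))."""
    return math.exp(-delta_g / (R * temperature))

# A reaction with ΔG° = -20 kJ/mol at 298.15 K strongly favors products (K >> 1).
K = equilibrium_constant(-20_000.0, 298.15)
```

A negative ΔG° gives K > 1 (products favored), a positive ΔG° gives K < 1, and ΔG° = 0 gives exactly K = 1; an equilibrium-composition code applies this kind of relation, or the equivalent Gibbs-energy minimization, across all species in the mixture.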

  5. Developing an online programme in computational biology.

    PubMed

    Vincent, Heather M; Page, Christopher

    2013-11-01

    Much has been written about the need for continuing education and training to enable life scientists and computer scientists to manage and exploit the different types of biological data now becoming available. Here we describe the development of an online programme that combines short training courses, so that those who require an educational programme can progress to complete a formal qualification. Although this flexible approach fits the needs of course participants, it does not fit easily within the organizational structures of a campus-based university.

  6. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    PubMed Central

    Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J.; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Lötstedt, Per; Petzold, Linda R.

    2016-01-01

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy-to-use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity. PMID:27930676
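
    StochSS's own simulation engines are not reproduced here, but the discrete stochastic simulation such tools implement is classically Gillespie's direct method (the stochastic simulation algorithm, SSA). A minimal sketch for a two-channel birth-death process, with all names and rate values illustrative:

```python
import math
import random

def gillespie_ssa(x0, rates, t_end, seed=0):
    """Gillespie's direct method for a birth-death process:
    ∅ -> X at constant rate k_birth; X -> ∅ at rate k_death * x."""
    rng = random.Random(seed)
    k_birth, k_death = rates
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a_birth = k_birth          # propensity of the birth channel
        a_death = k_death * x      # propensity of the death channel
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        # Waiting time to the next reaction is exponentially distributed.
        t += -math.log(1.0 - rng.random()) / a_total
        # Choose which channel fires, with probability proportional to its propensity.
        if rng.random() * a_total < a_birth:
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

traj = gillespie_ssa(x0=0, rates=(10.0, 1.0), t_end=50.0)
```

Each run yields one exact sample path of the underlying Markov jump process; production tools like the engines described above add spatial resolution, many reaction channels, and approximate accelerations (e.g. tau-leaping) on top of this core loop.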

  7. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    DOE PAGES

    Drawert, Brian; Hellander, Andreas; Bales, Ben; ...

    2016-12-08

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy-to-use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We also demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.

  8. Computational resources for ribosome profiling: from database to Web server and software.

    PubMed

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Towards improved socio-economic assessments of ocean acidification's impacts.

    PubMed

    Hilmi, Nathalie; Allemand, Denis; Dupont, Sam; Safa, Alain; Haraldsson, Gunnar; Nunes, Paulo A L D; Moore, Chris; Hattam, Caroline; Reynaud, Stéphanie; Hall-Spencer, Jason M; Fine, Maoz; Turley, Carol; Jeffree, Ross; Orr, James; Munday, Philip L; Cooley, Sarah R

    2013-01-01

    Ocean acidification is increasingly recognized as a component of global change that could have a wide range of impacts on marine organisms, the ecosystems they live in, and the goods and services they provide humankind. Assessment of these potential socio-economic impacts requires integrated efforts between biologists, chemists, oceanographers, economists and social scientists. But because ocean acidification is a new research area, significant knowledge gaps are preventing economists from estimating its welfare impacts. For instance, economic data on the impact of ocean acidification on significant markets such as fisheries, aquaculture and tourism are very limited (if not non-existent), and non-market valuation studies on this topic are not yet available. Our paper summarizes the current understanding of future OA impacts and sets out what further information is required for economists to assess socio-economic impacts of ocean acidification. Our aim is to provide clear directions for multidisciplinary collaborative research.

  10. The Science Teaching Fellows Program: A Model for Online Faculty Development of Early Career Scientists Interested in Teaching.

    PubMed

    Brancaccio-Taras, Loretta; Gull, Kelly A; Ratti, Claudia

    2016-12-01

    The American Society for Microbiology (ASM) has a history of providing a wide range of faculty development opportunities. Recently, ASM developed the Science Teaching Fellows Program (STF) for early career biologists and postdoctoral students to explore student-centered teaching and develop the skills needed to succeed in positions that have a significant teaching component. Participants were selected to STF through a competitive application process. The STF program consisted of a series of six webinars. In preparation for each webinar, participants completed a pre-webinar assignment. After each webinar, fellows practiced what they learned by completing a post-webinar assignment. In a survey used to assess the impact of STF, participants reported greater knowledge of the webinar-based instructional topics and a sense of being part of an educational community and were more confident about varied teaching methods.

  11. Pharmaceutical Perspectives of Spices and Condiments as Alternative Antimicrobial Remedy

    PubMed Central

    D’Souza, Savita P.; Chavannavar, Suvarna V.; Kanchanashri, B.; Niveditha, S. B.

    2017-01-01

    Medicinal values of spices and condiments are being revived by biologists through in vitro and in vivo trials providing evidence for their antimicrobial activities. The essential oils and extracts of spices like black pepper, cloves, cinnamon, and nutmeg contain active compounds like piperine, eugenol, cinnamaldehyde, and lignans. Similarly, condiments like coriander, black cumin, turmeric, garlic, and ginger are recognized for constituents like linalool, thymoquinones, curcumin, allicin, and geranial, respectively. These act as natural preventive components against several diseases and as antioxidants in body cells. Scientists still have to investigate the biochemical nature, mode of action, and minimum effective concentration of the active ingredients. This review reports findings of recent research carried out across South Asian and Middle Eastern countries, where spices and condiments form chief flavoring components of traditional foods. It narrates the history, myths, and facts people believe in these regions; some of these beliefs may lack a scientific explanation but reflect remedies trusted for centuries. PMID:28449595

  12. Biologist Edwin Grant Conklin and the idea of the religious direction of human evolution in the early 1920s.

    PubMed

    Pavuk, Alexander

    2017-01-01

    Edwin Grant Conklin, renowned US embryologist and evolutionary popularizer, publicly advocated a social vision of evolution that intertwined science and modernist Protestant theology in the early 1920s. The moral prestige of professional science in American culture - along with Conklin's own elite scientific status - diverted attention from the frequency with which his work crossed boundaries between natural science, religion and philosophy. Writing for broad audiences, Conklin was one of the most significant of the religious and modernist biological scientists whose rhetoric went well beyond simply claiming that certain kinds of religion were amenable to evolutionary science; he instead incorporated religion itself into evolution's broadest workings. A sampling of Conklin's widely-resonant discourse suggests that there was substantially more to the religion-evolution story in the 1920s US than many creationist-centred narratives of the era imply.

  13. Using a commodity high-definition television for collaborative structural biology

    PubMed Central

    Yennamalli, Ragothaman; Arangarasan, Raj; Bryden, Aaron; Gleicher, Michael; Phillips, George N.

    2014-01-01

    Visualization of protein structures using stereoscopic systems is frequently needed by structural biologists working to understand a protein’s structure–function relationships. Often several scientists are working as a team and need simultaneous interaction with each other and the graphics representations. Most existing molecular visualization tools support single-user tasks, which are not suitable for a collaborative group. Expensive caves, domes or geowalls have been developed, but the availability and low cost of high-definition televisions (HDTVs) and game controllers in the commodity entertainment market provide an economically attractive option to achieve a collaborative environment. This paper describes a low-cost environment, using standard consumer game controllers and commercially available stereoscopic HDTV monitors with appropriate signal converters for structural biology collaborations employing existing binary distributions of commonly used software packages like Coot, PyMOL, Chimera, VMD, O, Olex2 and others. PMID:24904249

  14. Epidemiological review of toxoplasmosis in humans and animals in Romania.

    PubMed

    Dubey, J P; Hotea, I; Olariu, T R; Jones, J L; Dărăbuş, G

    2014-03-01

    Infections by the protozoan parasite Toxoplasma gondii are widely prevalent in humans and other animals worldwide. However, information from eastern European countries is sketchy. In many eastern European countries, including Romania, it has been assumed that chronic T. gondii infection is a common cause of infertility and abortion. For this reason, many women in Romania with these problems were needlessly tested for T. gondii infection. Most papers on toxoplasmosis in Romania were published in Romanian in local journals and are often not available to scientists in other countries. Currently, the rate of congenital infection in Romania is largely unknown. In addition, there is little information on genetic characteristics of T. gondii or prevalence in animals and humans in Romania. In the present paper we review prevalence, clinical spectrum and epidemiology of T. gondii in humans and animals in Romania. This knowledge should be useful to biologists, public health workers, veterinarians and physicians.

  15. The Science Teaching Fellows Program: A Model for Online Faculty Development of Early Career Scientists Interested in Teaching†

    PubMed Central

    Brancaccio-Taras, Loretta; Gull, Kelly A.; Ratti, Claudia

    2016-01-01

    The American Society for Microbiology (ASM) has a history of providing a wide range of faculty development opportunities. Recently, ASM developed the Science Teaching Fellows Program (STF) for early career biologists and postdoctoral students to explore student-centered teaching and develop the skills needed to succeed in positions that have a significant teaching component. Participants were selected to STF through a competitive application process. The STF program consisted of a series of six webinars. In preparation for each webinar, participants completed a pre-webinar assignment. After each webinar, fellows practiced what they learned by completing a post-webinar assignment. In a survey used to assess the impact of STF, participants reported greater knowledge of the webinar-based instructional topics and a sense of being part of an educational community and were more confident about varied teaching methods. PMID:28101259

  16. Current Applications of Chromatographic Methods in the Study of Human Body Fluids for Diagnosing Disorders.

    PubMed

    Jóźwik, Jagoda; Kałużna-Czaplińska, Joanna

    2016-01-01

    Currently, analysis of various human body fluids is one of the most essential and promising approaches to enable the discovery of biomarkers or pathophysiological mechanisms for disorders and diseases. Analysis of these fluids is challenging due to their complex composition and unique characteristics. Development of new analytical methods in this field has made it possible to analyze body fluids with higher selectivity, sensitivity, and precision. The composition and concentration of analytes in body fluids are most often determined by chromatography-based techniques. There is no doubt that proper use of knowledge that comes from a better understanding of the role of body fluids requires the cooperation of scientists of diverse specializations, including analytical chemists, biologists, and physicians. This article summarizes current knowledge about the application of different chromatographic methods in analyses of a wide range of compounds in human body fluids in order to diagnose certain diseases and disorders.

  17. Cell scientist to watch - Sandra Rieger.

    PubMed

    2018-06-19

    Sandra Rieger studied at the University of Applied Sciences at Fulda, Germany, and wrote her diploma thesis in collaboration with Zyomyx, Inc. (San Francisco, USA). She then joined the laboratory of Reinhard Koester at the Helmholtz Center in Munich to complete her PhD in developmental neurobiology in 2008. For her postdoctoral studies, Sandra moved to the University of California, Los Angeles to work with Alvaro Sagasti on axon regeneration in zebrafish. Since 2011, she has been Assistant Professor for regenerative biology and medicine at the MDI Biological Laboratory in Maine, USA. In the summer of 2018, Sandra will establish a laboratory at the University of Miami, Florida, to become a tenure-track Associate Professor at the Department of Biology. The Rieger laboratory studies cellular communication mechanisms between sensory neurons and injured epidermal cells, leading to wound healing, nerve regeneration and degeneration after injury or exposure to chemotherapeutic agents. © 2018. Published by The Company of Biologists Ltd.

  18. Approaches and species in the history of vertebrate embryology.

    PubMed

    Hopwood, Nick

    2011-01-01

    Recent debates about model organisms echo far into the past; taking a longer view adds perspective to present concerns. The major approaches in the history of research on vertebrate embryos have tended to exploit different species, though there are long-term continuities too. Early nineteenth-century embryologists worked on surrogates for humans and began to explore the range of vertebrate embryogenesis; late nineteenth-century Darwinists hunted exotic ontogenies; around 1900 experimentalists favored living embryos in which they could easily intervene; reproductive scientists tackled farm animals and human beings; after World War II developmental biologists increasingly engineered species for laboratory life; and proponents of evo-devo have recently challenged the resulting dominance of a few models. Decisions about species have depended on research questions, biological properties, supply lines, and, not least, on methods. Nor are species simply chosen; embryology has transformed them even as they have profoundly shaped the science.

  19. Lamarck, Evolution, and the Inheritance of Acquired Characters

    PubMed Central

    Burkhardt, Richard W.

    2013-01-01

    Scientists are not always remembered for the ideas they cherished most. In the case of the French biologist Jean-Baptiste Lamarck, his name since the end of the nineteenth century has been tightly linked to the idea of the inheritance of acquired characters. This was indeed an idea that he endorsed, but he did not claim it as his own nor did he give it much thought. He took pride instead in advancing the ideas that (1) nature produced successively all the different forms of life on earth, and (2) environmentally induced behavioral changes lead the way in species change. This article surveys Lamarck’s ideas about organic change, identifies several ironies with respect to how his name is commonly remembered, and suggests that some historical justice might be done by using the adjective “Lamarckian” to denote something more (or other) than a belief in the inheritance of acquired characters. PMID:23908372

  20. A call to insect scientists: Challenges and opportunities of managing insect communities under climate change

    USGS Publications Warehouse

    Hellmann, Jessica J.; Grundel, Ralph; Hoving, Chris; Schuurman, Gregor W.

    2016-01-01

    As climate change moves insect systems into uncharted territory, more knowledge about insect dynamics and the factors that drive them could enable us to better manage and conserve insect communities. Climate change may also require us to revisit insect management goals and strategies and may lead to a new kind of scientific engagement in management decision-making. Here we make five key points about the role of insect science in aiding and crafting management decisions, and we illustrate those points with the monarch butterfly and the Karner blue butterfly, two species undergoing considerable change and facing new management dilemmas. Insect biology has a strong history of engagement in applied problems, and as the impacts of climate change increase, a reimagined ethic of entomology in service of broader society may emerge. We hope to motivate insect biologists to contribute time and effort toward solving the challenges of climate change.

  1. Plague bacterium as a transformer species in prairie dogs and the grasslands of western North America

    USGS Publications Warehouse

    Eads, David A.; Biggins, Dean E.

    2015-01-01

    Invasive transformer species change the character, condition, form, or nature of ecosystems and deserve considerable attention from conservation scientists. We applied the transformer species concept to the plague bacterium Yersinia pestis in western North America, where the pathogen was introduced around 1900. Y. pestis transforms grassland ecosystems by severely depleting the abundance of prairie dogs (Cynomys spp.) and thereby causing declines in native species abundance and diversity, including threatened and endangered species; altering food web connections; altering the import and export of nutrients; causing a loss of ecosystem resilience to encroaching invasive plants; and modifying prairie dog burrows. Y. pestis poses an important challenge to conservation biologists because it causes trophic-level perturbations that affect the stability of ecosystems. Unfortunately, understanding of the effects of Y. pestis on ecosystems is rudimentary, highlighting an acute need for continued research.

  2. Plague bacterium as a transformer species in prairie dogs and the grasslands of western North America.

    PubMed

    Eads, David A; Biggins, Dean E

    2015-08-01

    Invasive transformer species change the character, condition, form, or nature of ecosystems and deserve considerable attention from conservation scientists. We applied the transformer species concept to the plague bacterium Yersinia pestis in western North America, where the pathogen was introduced around 1900. Y. pestis transforms grassland ecosystems by severely depleting the abundance of prairie dogs (Cynomys spp.) and thereby causing declines in native species abundance and diversity, including threatened and endangered species; altering food web connections; altering the import and export of nutrients; causing a loss of ecosystem resilience to encroaching invasive plants; and modifying prairie dog burrows. Y. pestis poses an important challenge to conservation biologists because it causes trophic-level perturbations that affect the stability of ecosystems. Unfortunately, understanding of the effects of Y. pestis on ecosystems is rudimentary, highlighting an acute need for continued research. © 2015 Society for Conservation Biology.

  3. Innovative Treatments for Cancer:. The Impact of Delivering siRNAs, Chemotherapies, and Preventative Agents Using Nanoformulations

    NASA Astrophysics Data System (ADS)

    Hook, Sara S.; Farrell, Dorothy; Hinkal, George W.; Ptak, Krzystzof; Grodzinski, Piotr; Panaro, Nicholas J.

    2013-09-01

    A multi-disciplinary approach to research, epitomized by the emerging field of cancer nanotechnology, can catalyze scientific developments and enable clinical translation beyond what we currently achieve. Engineers, chemists, and physical scientists are teaming up with cancer biologists and clinical oncologists to attack the vast array of cancer malignancies using materials at the nanoscale. We discuss how nanoformulations are enabling the targeted, efficient delivery not only of genetic therapies such as silencing RNAs, but also of conventional cytotoxic agents and small molecules, resulting in decreased systemic toxicity and an improved therapeutic index. On the preventative side, various imaging agents and devices are being developed for screening purposes, as well as new formulations of sunscreens, nutraceuticals, and cancer vaccines. The goal of incorporating nanotechnology into clinical applications is to achieve new and more effective ways of diagnosing, treating, and preventing cancer, to ultimately change the lives of patients worldwide.

  4. Current progress in 3D printing for cardiovascular tissue engineering.

    PubMed

    Mosadegh, Bobak; Xiong, Guanglei; Dunham, Simon; Min, James K

    2015-03-16

    3D printing is a technology that allows the fabrication of structures with arbitrary geometries and heterogeneous material properties. The application of this technology to biological structures that match the complexity of native tissue is of great interest to researchers. This mini-review highlights the current progress of 3D printing for fabricating artificial tissues of the cardiovascular system, specifically the myocardium, heart valves, and coronary arteries. In addition, how 3D printed sensors and actuators can play a role in tissue engineering is discussed. To date, all the work on building 3D cardiac tissues has been proof-of-principle demonstrations, and in most cases has yielded products less effective than other traditional tissue engineering strategies. However, this technology is in its infancy, and therefore there is much promise that, through collaboration between biologists, engineers and material scientists, 3D bioprinting can make a significant impact on the field of cardiovascular tissue engineering.

  5. Waterfowl Management Handbook

    USGS Publications Warehouse

    Cross, Diana H.

    1988-01-01

    The North American Waterfowl Management Plan, the Service's most recent mandate for management of migratory waterfowl, and recent legislation such as the Farm Bill all underscore the need for a single source of information about the management of waterfowl and their habitat. Much of this information exists in scientific papers, unpublished reports, or has never been recorded, and thus is not readily accessible by waterfowl managers. A prototype handbook was developed in 1987 and critiqued by 38 reviewers who provided suggestions on style and substance as well as topics for inclusion. The assistance of these reviewers, who included Federal and State wildlife managers, Federal and State biologists, and scientists in the United States and Canada, is most gratefully acknowledged. This product differs from most Fish and Wildlife Leaflets. It will be issued as a series of chapters over the next several years, each with a unique number, designed to be inserted in an accompanying looseleaf binder.

  6. “Prophet” or Professor? The Life and Work of Lewis Fry Richardson

    NASA Astrophysics Data System (ADS)

    Smagorinsky, Joseph

    This book focuses on a man who, in his lifetime, was scarcely known to the general public. Yet within certain circles, Richardson has had enormous impact within recent years. Although there are many scientists and humanists who exercise influence in their own respective fields, rarely do they bridge disciplines. It is this combination that has made Lewis Fry Richardson a figure worthy of a full-length biography, not just to record his contributions to each field but to provide an analysis and understanding of what motivated his diversity. In another age, Richardson would have been counted as a Renaissance man. He has variously been referred to as a chemist, physicist, mathematician, psychologist, meteorologist, economist, and biologist. In retrospect, he clearly was well ahead of his time, whether the subject in question was his work in numerical weather prediction or in war studies.

  7. Physical Principles of Skeletal Minerals Revealed with Spectromicroscopy

    ScienceCinema

    Gilbert, Pupa [U of Wisconsin-Madison, Wisconsin, United States]

    2017-12-09

    Skeletal elements of marine and terrestrial organisms have the most fascinating nano-to-macro structures, attracting the attention of physicists, biologists, chemists, and materials scientists. Using X-PEEM spectromicroscopy we revealed some of the fundamental mechanisms leading to the formation of these biominerals. Specifically, we addressed the following questions and provided the answers: (1Q) How do teeth, bones, and echinoderm and mollusk shells acquire their unusual, curved and complex morphology if they are composed of single crystals? (1A) Via amorphous precursor phases. (2Q) How does crystallinity propagate through the amorphous precursor phases in sea urchin spicules and teeth? (2A) By secondary nucleation, following random-walk patterns. (3Q) How does iridescent mother-of-pearl become ordered? (3A) Gradually, through a kinetic mechanism in which the fastest-growing single crystals win the competition for space and thus end up approximately co-oriented.

  8. Biological versus electronic adaptive coloration: how can one inform the other?

    PubMed Central

    Kreit, Eric; Mäthger, Lydia M.; Hanlon, Roger T.; Dennis, Patrick B.; Naik, Rajesh R.; Forsythe, Eric; Heikenfeld, Jason

    2013-01-01

    Adaptive reflective surfaces have been a challenge for both electronic paper (e-paper) and biological organisms. Multiple colours, contrast, polarization, reflectance, diffusivity and texture must all be controlled simultaneously without optical losses in order to fully replicate the appearance of natural surfaces and vividly communicate information. This review merges the frontiers of knowledge for both biological adaptive coloration, with a focus on cephalopods, and synthetic reflective e-paper within a consistent framework of scientific metrics. Currently, the highest performance approach for both nature and technology uses colourant transposition. Three outcomes are envisioned from this review: reflective display engineers may gain new insights from millions of years of natural selection and evolution; biologists will benefit from understanding the types of mechanisms, characterization and metrics used in synthetic reflective e-paper; all scientists will gain a clearer picture of the long-term prospects for capabilities such as adaptive concealment and signalling. PMID:23015522

  9. Of the Helmholtz Club, South-Californian seedbed for visual and cognitive neuroscience, and its patron Francis Crick.

    PubMed

    Aicardi, Christine

    2014-03-01

    Taking up the view that semi-institutional gatherings such as clubs, societies, and research schools have been instrumental in creating sheltered spaces from which many a 20th-century project-driven interdisciplinary research programme could develop and become established within the institutions of science, the paper explores the history of one such gathering from its inception in the early 1980s into the 2000s: the Helmholtz Club, which brought together scientists from research fields as various as neuroanatomy, neurophysiology, psychophysics, computer science and engineering, all of whom shared an interest in the study of the visual system and of higher cognitive functions relying on visual perception, such as visual consciousness. It argues that British molecular biologist turned South Californian neuroscientist Francis Crick had an early and lasting influence over the Helmholtz Club, of which he was a founding pillar, and that from its inception the club served as a constitutive element in his long-term plans for a neuroscience of vision and of cognition. Further, it argues that in this role the Helmholtz Club served many purposes, the foremost of which was to be a social forum for interdisciplinary discussion, where 'discussion' was not mere talk but was imbued with an epistemic value and, as such, carefully cultivated. Finally, it questions what counts as 'doing science' and, in turn, definitions of success and failure, and provides some material evidence towards re-appraising the successfulness of Crick's contribution to the neurosciences. Copyright © 2013 The Author. Published by Elsevier Ltd. All rights reserved.

  10. Next-generation phenotyping: requirements and strategies for enhancing our understanding of genotype-phenotype relationships and its relevance to crop improvement.

    PubMed

    Cobb, Joshua N; Declerck, Genevieve; Greenberg, Anthony; Clark, Randy; McCouch, Susan

    2013-04-01

    More accurate and precise phenotyping strategies are necessary to empower high-resolution linkage mapping and genome-wide association studies and for training genomic selection models in plant improvement. Within this framework, the objective of modern phenotyping is to increase the accuracy, precision and throughput of phenotypic estimation at all levels of biological organization while reducing costs and minimizing labor through automation, remote sensing, improved data integration and experimental design. Much like the efforts to optimize genotyping during the 1980s and 1990s, designing effective phenotyping initiatives today requires multi-faceted collaborations between biologists, computer scientists, statisticians and engineers. Robust phenotyping systems are needed to characterize the full suite of genetic factors that contribute to quantitative phenotypic variation across cells, organs and tissues, developmental stages, years, environments, species and research programs. Next-generation phenotyping generates significantly more data than previous approaches and requires novel data management, access and storage systems, increased use of ontologies to facilitate data integration, and new statistical tools for enhancing experimental design and extracting biologically meaningful signal from environmental and experimental noise. To ensure relevance, the implementation of efficient and informative phenotyping experiments also requires familiarity with diverse germplasm resources, population structures, and target populations of environments. Today, phenotyping is quickly emerging as the major operational bottleneck limiting the power of genetic analysis and genomic prediction. The challenge for the next generation of quantitative geneticists and plant breeders is not only to understand the genetic basis of complex trait variation, but also to use that knowledge to efficiently synthesize twenty-first century crop varieties.

  11. The Screening Compound Collection: A Key Asset for Drug Discovery.

    PubMed

    Boss, Christoph; Hazemann, Julien; Kimmerlin, Thierry; von Korff, Modest; Lüthi, Urs; Peter, Oliver; Sander, Thomas; Siegrist, Romain

    2017-10-25

    In this case study on an essential instrument of modern drug discovery, we summarize our successful efforts in the last four years toward enhancing the Actelion screening compound collection. A key organizational step was the establishment of the Compound Library Committee (CLC) in September 2013. This cross-functional team, consisting of computational scientists, medicinal chemists and a biologist, was endowed with a significant annual budget for regular new compound purchases. Based on an initial library analysis performed in 2013, the CLC developed a New Library Strategy. The established continuous library turn-over mode and the screening library size of 300'000 compounds were maintained, while the structural library quality was increased. This was achieved by shifting the selection criteria from 'druglike' to 'leadlike' structures, enriching for non-flat structures, aiming for compound novelty, and increasing the ratio of higher cost 'Premium Compounds'. Novel chemical space was gained by adding natural compounds, macrocycles, designed and focused libraries to the collection, and through mutual exchanges of proprietary compounds with agrochemical companies. A comparative analysis in 2016 provided evidence for the positive impact of these measures. Screening the improved library has provided several highly promising hits, including a macrocyclic compound, that are currently being followed up in different Hit-to-Lead and Lead Optimization programs. It is important to state that the goal of the CLC was not to achieve higher HTS hit rates, but to increase the chances of identified hits to serve as the basis of successful early drug discovery programs. The experience gathered so far legitimates the New Library Strategy.

  12. Argonne Out Loud: Computation, Big Data, and the Future of Cities

    ScienceCinema

    Catlett, Charlie

    2018-01-16

    Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.

  13. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  14. Corra: Computational framework and tools for LC-MS discovery and targeted mass spectrometry-based proteomics

    PubMed Central

    Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D

    2008-01-01

    Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, information input/output, and do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, and statistical algorithms, originally developed for microarray data analyses, appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. 
Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345
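
    The "controlled false-discovery rates" mentioned above are not detailed in the abstract; as an illustration only (not Corra's actual code), the standard Benjamini-Hochberg step-up procedure widely used in microarray-derived statistics of this kind can be sketched as follows:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure for FDR control.

    Finds the largest k such that p_(k) <= (k/m) * q, then rejects the
    k hypotheses with the smallest p-values.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])  # indices of rejected (significant) hypotheses
```

    For example, `benjamini_hochberg([0.01, 0.02, 0.03, 0.5], q=0.05)` rejects the first three hypotheses while controlling the expected proportion of false discoveries at 5%.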

  15. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
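
    The billion-way-concurrency figure quoted above follows from simple arithmetic; a back-of-envelope sketch (the 2 GHz clock is an assumed, illustrative value):

```python
# Back-of-envelope estimate of the concurrency an exascale machine needs.
exa_ops_per_s = 1e18  # exascale: 10^18 operations per second
clock_hz = 2e9        # power dissipation limits clock rates to a few GHz
concurrency = exa_ops_per_s / clock_hz  # 5e8: order of a billion concurrent ops
```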

  16. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  17. Computational Thinking: A Digital Age Skill for Everyone

    ERIC Educational Resources Information Center

    Barr, David; Harrison, John; Conery, Leslie

    2011-01-01

    In a seminal article published in 2006, Jeanette Wing described computational thinking (CT) as a way of "solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science." Wing's article gave rise to an often controversial discussion and debate among computer scientists,…

  18. Cloud Computing for Protein-Ligand Binding Site Comparison

    PubMed Central

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery. PMID:23762824

  19. Cloud computing for protein-ligand binding site comparison.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.
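
    The abstracts above note that Cloud-PLBS exploits the MapReduce paradigm to parallelize pairwise binding-site comparison; the pattern can be sketched in miniature (all names here are illustrative, and `score` stands in for an SMAP-style comparison function, not the real system's API):

```python
from itertools import combinations

def map_phase(sites, score):
    """Mapper: emit one (pair, similarity) record per binding-site pair;
    these independent units of work are what Hadoop spreads across nodes."""
    for a, b in combinations(sites, 2):
        yield (a, b), score(a, b)

def reduce_phase(records, threshold):
    """Reducer: collect the pairs whose similarity clears a cutoff."""
    return sorted(pair for pair, s in records if s >= threshold)
```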

  20. Application of advanced computing techniques to the analysis and display of space science measurements

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M.; Lapolla, M. V.; Horblit, B.

    1995-01-01

    A prototype system has been developed to aid the experimental space scientist in the display and analysis of spaceborne data acquired from direct measurement sensors in orbit. We explored the implementation of a rule-based environment for semi-automatic generation of visualizations that assist the domain scientist in exploring their data. The goal has been to enable rapid generation of visualizations which enhance the scientist's ability to thoroughly mine their data. Transferring the task of visualization generation from the human programmer to the computer produced a rapid prototyping environment for visualizations. The visualization and analysis environment has been tested against a set of data obtained from the Hot Plasma Composition Experiment on the AMPTE/CCE satellite, creating new visualizations which provided new insight into the data.

  1. An Absolute Index (Ab-index) to Measure a Researcher’s Useful Contributions and Productivity

    PubMed Central

    Biswal, Akshaya Kumar

    2013-01-01

    Bibliographic analysis has been a very powerful tool in evaluating the effective contributions of a researcher and determining his/her future research potential. The lack of an absolute quantification of the author’s scientific contributions by the existing measurement system hampers the decision-making process. In this paper, a new metric system, the Absolute index (Ab-index), is proposed that allows a more objective comparison of the contributions of a researcher. The Ab-index takes into account the impact of research findings while keeping in mind the physical and intellectual contributions of the author(s) in accomplishing the task. The Ab-index and h-index were calculated for 10 highly cited geneticists and molecular biologists and 10 young researchers in the biological sciences and compared for their relationship to the researcher’s input as a primary author. This is the first report of a measuring method clarifying the contributions of the first author, corresponding author, and other co-authors and sharing credit among them in a logical ratio. A Java application has been developed for easy calculation of the Ab-index. It can be used as a yardstick for comparing the credibility of different scientists competing for the same resources, while the Productivity index (Pr-index), which is the rate of change in the Ab-index per year, can be used for comparing scientists of different age groups. The Ab-index has a clear advantage over other popular metric systems in comparing the scientific credibility of young scientists. The sum of the Ab-indices earned by individual researchers of an institute per year can be referred to as the Pr-index of the institute. PMID:24391941
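
    The abstract defines the Pr-index as the rate of change of the Ab-index per year; a hypothetical sketch of the two derived quantities (the paper's actual Ab-index formula is not reproduced here, so the rate is taken as a simple mean annual increment):

```python
def pr_index(ab_by_year):
    """Pr-index of a researcher: rate of change of the Ab-index per year,
    computed here as the mean annual increment (illustrative assumption)."""
    years = sorted(ab_by_year)
    span = years[-1] - years[0]
    return (ab_by_year[years[-1]] - ab_by_year[years[0]]) / span

def institute_pr_index(ab_indices_this_year):
    """Institute Pr-index: sum of the Ab-indices earned by the institute's
    researchers in a given year, as described in the abstract."""
    return sum(ab_indices_this_year)
```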

  2. Calorie restriction, post-reproductive life span, and programmed aging: a plea for rigor.

    PubMed

    De Grey, Aubrey D N J

    2007-11-01

    All scientists are acutely aware of the profound challenge that they face when communicating scientific findings to nonscientists, especially when great uncertainty is involved and when the topic is of personal interest to the general public. Simplification of the issues, sometimes extending to a degree of oversimplification, is a sad but generally recognized necessity. It is not, however, a necessity when scientists communicate with each other, and when that happens, the explanation may lie elsewhere: either in the speaker's vested interests or in overconfidence on the speaker's part in the extent to which he or she has grasped the topic under discussion. Both these explanations are serious allegations and must not be made without good reason, not least because an alternative explanation is often the entirely legitimate preference for scientific "shorthand." However, when a general tendency toward oversimplification emerges within an expert community, not only in informal interactions but in learned publications, the field in question can suffer a loss of reputation for rigor, which may especially infect younger scientists joining that field (or contemplating joining it). I feel that this has occurred to a dangerous degree within biogerontology in respect of the way in which the effect of the environment on the rate of aging (whether that of an individual organism or of a lineage) is described. There are still important controversies in that area, but I refer here strictly to issues concerning which a thorough consensus exists. In this essay I highlight some fundamental tenets of biogerontology that are frequently, and to my mind problematically, mis-stated by many in this field in their printed pronouncements. Greater precision on these points will, I believe, benefit biogerontology at many levels, avoiding confusion among biogerontologists, among other biologists, and among the general public.

  3. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    PubMed

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred relative to PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used not only to accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
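
    The UPGMA algorithm accelerated in this work is simple to state: repeatedly merge the closest pair of clusters, averaging distances weighted by cluster size. A minimal software sketch of that reference behaviour (not the authors' programmable-logic implementation):

```python
def upgma(labels, matrix):
    """Minimal UPGMA: returns the tree topology as nested tuples."""
    clusters = [(lab, 1) for lab in labels]      # (subtree, leaf count)
    D = [row[:] for row in matrix]               # working distance matrix
    while len(clusters) > 1:
        n = len(D)
        # find the closest pair of clusters
        i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda p: D[p[0]][p[1]])
        (ta, sa), (tb, sb) = clusters[i], clusters[j]
        keep = [k for k in range(n) if k not in (i, j)]
        # size-weighted average distance from the merged cluster to the rest
        new_row = [(sa * D[i][k] + sb * D[j][k]) / (sa + sb) for k in keep]
        D = [[D[a][b] for b in keep] for a in keep]
        for row, dist in zip(D, new_row):
            row.append(dist)
        D.append(new_row + [0.0])
        clusters = [clusters[k] for k in keep] + [((ta, tb), sa + sb)]
    return clusters[0][0]
```

    For instance, `upgma(['A', 'B', 'C'], [[0, 2, 4], [2, 0, 4], [4, 4, 0]])` joins A and B first, yielding the topology `('C', ('A', 'B'))`.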

  4. Apollo: a sequence annotation editor

    PubMed Central

    Lewis, SE; Searle, SMJ; Harris, N; Gibson, M; Iyer, V; Richter, J; Wiel, C; Bayraktaroglu, L; Birney, E; Crosby, MA; Kaminker, JS; Matthews, BB; Prochnik, SE; Smith, CD; Tupy, JL; Rubin, GM; Misra, S; Mungall, CJ; Clamp, ME

    2002-01-01

    The well-established inaccuracy of purely computational methods for annotating genome sequences necessitates an interactive tool to allow biological experts to refine these approximations by viewing and independently evaluating the data supporting each annotation. Apollo was developed to meet this need, enabling curators to inspect genome annotations closely and edit them. FlyBase biologists successfully used Apollo to annotate the Drosophila melanogaster genome and it is increasingly being used as a starting point for the development of customized annotation editing tools for other genome projects. PMID:12537571

  5. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both, although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  6. Scientists at Work. Final Report.

    ERIC Educational Resources Information Center

    Education Turnkey Systems, Inc., Falls Church, VA.

    This report summarizes activities related to the development, field testing, evaluation, and marketing of the "Scientists at Work" program which combines computer assisted instruction with database tools to aid cognitively impaired middle and early high school children in learning and applying thinking skills to science. The brief report reviews…

  7. PatchSurfers: Two methods for local molecular property-based binding ligand prediction.

    PubMed

    Shin, Woong-Hee; Bures, Mark Gregory; Kihara, Daisuke

    2016-01-15

    Protein function prediction is an active area of research in computational biology. Function prediction can help biologists make hypotheses for characterization of genes and help interpret biological assays, and thus is a productive area for collaboration between experimental and computational biologists. Among various function prediction methods, predicting binding ligand molecules for a target protein is an important class because ligand binding events for a protein are usually closely intertwined with the protein's biological function, and also because predicted binding ligands can often be directly tested by biochemical assays. Binding ligand prediction methods can be classified into two types: those which are based on protein-protein (or pocket-pocket) comparison, and those that compare a target pocket directly to ligands. Recently, our group proposed two computational binding ligand prediction methods, Patch-Surfer, which is a pocket-pocket comparison method, and PL-PatchSurfer, which compares a pocket to ligand molecules. The two programs apply surface patch-based descriptions to calculate similarity or complementarity between molecules. A surface patch is characterized by physicochemical properties such as shape, hydrophobicity, and electrostatic potentials. These properties on the surface are represented using three-dimensional Zernike descriptors (3DZD), which are based on a series expansion of a three-dimensional function. Utilizing 3DZD for describing the physicochemical properties has two main advantages: (1) rotational invariance and (2) fast comparison. Here, we introduce Patch-Surfer and PL-PatchSurfer with an emphasis on PL-PatchSurfer, which was developed more recently. Illustrative examples of PL-PatchSurfer performance on binding ligand prediction as well as virtual drug screening are also provided. Copyright © 2015 Elsevier Inc. All rights reserved.
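
    The 3DZD machinery itself is beyond a short sketch, but its key property, rotational invariance, can be illustrated with a much simpler invariant descriptor, a histogram of pairwise point distances (a stand-in for exposition only, not the 3DZD):

```python
import numpy as np

def invariant_descriptor(points, bins=8):
    """Toy rotation-invariant shape descriptor: a normalized histogram of
    all pairwise point distances, which rotation cannot change."""
    pts = np.asarray(points, dtype=float)
    diffs = pts[:, None, :] - pts[None, :, :]        # all displacement vectors
    dists = np.sqrt((diffs ** 2).sum(axis=-1))       # full distance matrix
    iu = np.triu_indices(len(pts), k=1)              # upper triangle: each pair once
    h, _ = np.histogram(dists[iu], bins=bins, range=(0.0, dists.max()))
    return h / h.sum()
```

    Because the descriptor is a fixed-length vector, two shapes can be compared with a single fast vector distance, mirroring the two advantages cited for 3DZD.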

  8. Sig2BioPAX: Java tool for converting flat files to BioPAX Level 3 format.

    PubMed

    Webb, Ryan L; Ma'ayan, Avi

    2011-03-21

    The World Wide Web plays a critical role in enabling molecular, cell, systems and computational biologists to exchange, search, visualize, integrate, and analyze experimental data. Such efforts can be further enhanced through the development of semantic web concepts. The semantic web idea is to enable machines to understand data through the development of protocol-free data exchange formats such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). These standards provide formal descriptors of objects, object properties and their relationships within a specific knowledge domain. However, the overhead of converting datasets typically stored in data tables such as Excel, text or PDF into RDF or OWL formats is not trivial for non-specialists and as such produces a barrier to seamless data exchange between researchers, databases and analysis tools. This problem is of particular importance in the field of network systems biology, where biochemical interactions between genes and their protein products are abstracted to networks. For the purpose of converting biochemical interactions into the BioPAX format, which is the leading standard developed by the computational systems biology community, we developed an open-source command line tool that takes as input tabular data describing different types of molecular biochemical interactions. The tool converts such interactions into the BioPAX Level 3 OWL format. We used the tool to convert several existing and new mammalian networks of protein interactions, signalling pathways, and transcriptional regulatory networks into BioPAX. Some of these networks were deposited into PathwayCommons, a repository for consolidating and organizing biochemical networks. The software tool Sig2BioPAX is a resource that enables experimental and computational systems biologists to contribute their identified networks and pathways of molecular interactions for integration and reuse with the rest of the research community.
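
    The flat-file-to-semantic-web idea can be illustrated with a deliberately simplified converter that emits N-Triples-style statements (the `example.org` namespace is hypothetical; the real tool emits BioPAX Level 3 OWL classes, not these plain triples):

```python
def table_to_triples(rows):
    """Toy conversion of (source, relation, target) rows from a tabular
    interaction file into N-Triples-style RDF statements."""
    base = "http://example.org/net#"
    for src, rel, tgt in rows:
        yield f"<{base}{src}> <{base}{rel}> <{base}{tgt}> ."
```

    e.g. a row `("TP53", "activates", "MDM2")` becomes one subject-predicate-object statement.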

  9. Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems

    ERIC Educational Resources Information Center

    Bostandjiev, Svetlin Alex I.

    2012-01-01

    The evolution of the Web brought new and interesting problems to computer scientists that we loosely classify in the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large number of people in a collective-intelligence fashion (i.e., wikis), and performing computations on social…

  10. Networking Biology: The Origins of Sequence-Sharing Practices in Genomics.

    PubMed

    Stevens, Hallam

    2015-10-01

    The wide sharing of biological data, especially nucleotide sequences, is now considered to be a key feature of genomics. Historians and sociologists have attempted to account for the rise of this sharing by pointing to precedents in model organism communities and in natural history. This article supplements these approaches by examining the role that electronic networking technologies played in generating the specific forms of sharing that emerged in genomics. The links between early computer users at the Stanford Artificial Intelligence Laboratory in the 1960s, biologists using local computer networks in the 1970s, and GenBank in the 1980s, show how networking technologies carried particular practices of communication, circulation, and data distribution from computing into biology. In particular, networking practices helped to transform sequences themselves into objects that had value as a community resource.

  11. Empirical investigation of the ethical reasoning of physicians and molecular biologists - the importance of the four principles of biomedical ethics.

    PubMed

    Ebbesen, Mette; Pedersen, Birthe D

    2007-10-25

    This study presents an empirical investigation of the ethical reasoning and ethical issues at stake in the daily work of physicians and molecular biologists in Denmark. The aim of this study was to test empirically whether there is a difference in ethical considerations and principles between Danish physicians and Danish molecular biologists, and whether the bioethical principles of the American bioethicists Tom L. Beauchamp and James F. Childress are applicable to these groups. This study is based on 12 semi-structured interviews with three groups of respondents: a group of oncology physicians working in a clinic at a public hospital and two groups of molecular biologists conducting basic research, one group employed at a public university and the other in a private biopharmaceutical company. In this sample, the authors found that oncology physicians and molecular biologists employed in a private biopharmaceutical company have the specific principle of beneficence in mind in their daily work. Both groups are motivated to help sick patients. According to the study, molecular biologists explicitly consider nonmaleficence in relation to the environment, the researchers' own health, and animal models; and only implicitly in relation to patients or human subjects. In contrast, considerations of nonmaleficence by oncology physicians relate to patients or human subjects. Physicians and molecular biologists both consider the principle of respect for autonomy as a negative obligation in the sense that informed consent of patients should be respected. However, in contrast to molecular biologists, physicians experience the principle of respect for autonomy as a positive obligation, as the physician, in dialogue with the patient, offers a medical prognosis based upon the patient's wishes and ideas, mutual understanding, and respect. Finally, this study discloses utilitarian characteristics in the overall conception of justice as conceived by oncology physicians and molecular biologists from the private biopharmaceutical company. Molecular biologists employed at a public university are, in this study, concerned with allocation; however, they do not propose a specific theory of justice. This study demonstrates that each of the four bioethical principles of the American bioethicists Tom L. Beauchamp & James F. Childress - respect for autonomy, beneficence, nonmaleficence and justice - is reflected in the daily work of physicians and molecular biologists in Denmark. Consequently, these principles are applicable in the Danish biomedical setting.

  12. Empirical investigation of the ethical reasoning of physicians and molecular biologists – the importance of the four principles of biomedical ethics

    PubMed Central

    Ebbesen, Mette; Pedersen, Birthe D

    2007-01-01

    Background This study presents an empirical investigation of the ethical reasoning and ethical issues at stake in the daily work of physicians and molecular biologists in Denmark. The aim of this study was to test empirically whether there is a difference in ethical considerations and principles between Danish physicians and Danish molecular biologists, and whether the bioethical principles of the American bioethicists Tom L. Beauchamp and James F. Childress are applicable to these groups. Method This study is based on 12 semi-structured interviews with three groups of respondents: a group of oncology physicians working in a clinic at a public hospital and two groups of molecular biologists conducting basic research, one group employed at a public university and the other in a private biopharmaceutical company. Results In this sample, the authors found that oncology physicians and molecular biologists employed in a private biopharmaceutical company have the specific principle of beneficence in mind in their daily work. Both groups are motivated to help sick patients. According to the study, molecular biologists explicitly consider nonmaleficence in relation to the environment, the researchers' own health, and animal models; and only implicitly in relation to patients or human subjects. In contrast, considerations of nonmaleficence by oncology physicians relate to patients or human subjects. Physicians and molecular biologists both consider the principle of respect for autonomy as a negative obligation in the sense that informed consent of patients should be respected. However, in contrast to molecular biologists, physicians experience the principle of respect for autonomy as a positive obligation, as the physician, in dialogue with the patient, offers a medical prognosis based upon the patient's wishes and ideas, mutual understanding, and respect. Finally, this study discloses utilitarian characteristics in the overall conception of justice as conceived by oncology physicians and molecular biologists from the private biopharmaceutical company. Molecular biologists employed at a public university are, in this study, concerned with allocation; however, they do not propose a specific theory of justice. Conclusion This study demonstrates that each of the four bioethical principles of the American bioethicists Tom L. Beauchamp & James F. Childress – respect for autonomy, beneficence, nonmaleficence and justice – is reflected in the daily work of physicians and molecular biologists in Denmark. Consequently, these principles are applicable in the Danish biomedical setting. PMID:17961251

  13. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base, and it is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated, open-source code base, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production-code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis: its computational kernels are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof of concept for actual production work.

  14. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE Labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
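
    The kind of combinatorial kernel a coloring package like ColPack provides can be illustrated with a greedy distance-1 coloring, in which no vertex shares a color with any of its neighbors (the property used to group structurally independent columns when compressing Jacobian or Hessian computations). The sketch below shows the general technique, not ColPack's actual implementation.

```python
# Greedy distance-1 graph coloring: each vertex gets the smallest color
# not already used by one of its neighbors.

def greedy_coloring(adj):
    """adj maps vertex -> set of neighbors. Returns a vertex -> color dict."""
    colors = {}
    for v in sorted(adj):                       # fixed order for determinism
        used = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in used:                        # smallest color absent among neighbors
            c += 1
        colors[v] = c
    return colors

# A 4-cycle, which greedy coloring handles with 2 colors:
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
coloring = greedy_coloring(adj)
```

    Greedy coloring is not optimal in general, but for Jacobian compression even a suboptimal proper coloring is valid; fewer colors simply mean fewer function evaluations.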

  15. Integrated Circuits/Segregated Labor: Women in Three Computer-Related Occupations. Project Report No. 84-A27.

    ERIC Educational Resources Information Center

    Strober, Myra H.; Arnold, Carolyn L.

    This discussion of the impact of new computer occupations on women's employment patterns is divided into four major sections. The first section describes the six computer-related occupations to be analyzed: (1) engineers; (2) computer scientists and systems analysts; (3) programmers; (4) electronic technicians; (5) computer operators; and (6) data…

  16. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  17. Collective Computation of Neural Network

    DTIC Science & Technology

    1990-03-15

    Sciences, Beijing. ABSTRACT: Computational neuroscience is a new branch of neuroscience originating from current research on the theory of computer… scientists working in artificial intelligence engineering and neuroscience. The paper introduces the collective computational properties of model neural… vision research. On this basis, the authors analyzed the significance of the Hopfield model. Key phrases: Computational Neuroscience, Neural Network, Model

  18. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  19. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  20. Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    2017-09-01

    Scientists, engineers and programmers at Fermilab are tackling today’s most challenging computational problems. Their solutions, motivated by the needs of worldwide research in particle physics and accelerators, help America stay at the forefront of innovation.

  1. Charles William Lacaillade. Biologist, Parasitologist, Educator, and Mentor.

    PubMed

    Imperato, Pascal James

    2017-02-01

    Charles William Lacaillade (1904-1978) was an eminent biologist in the middle decades of the twentieth century. He was born in Lawrence, Massachusetts of parents whose ancestors were French Canadians. His father, also named Charles William Lacaillade, was a dentist who graduated from Tufts University School of Dentistry in 1898. His mother, Elodia Eno, came from a family of very successful businessmen. Lacaillade was the third of six children. His two older brothers, Harold Carleton and Hector Eno, both graduated from the University of Louisville, School of Dentistry, while his younger brother, Lawrence, became a businessman. His sister, Luemma, married Dr. Henry Steadman, a veterinarian, while his youngest sister, Gloria, married a U.S. Army officer, Lieutenant Colonel Victor Anido. Lacaillade received his MS and PhD degrees in biology and zoology from Harvard University. He then became a fellow at The Rockefeller Institute for Medical Research. At both institutions, he studied under some of the most eminent biological scientists of the time. These included Rudolf W. Glaser, George Howard Parker, Theobald Smith, Carl TenBroeck, and William Morton Wheeler. At the Rockefeller Institute, he co-discovered the vector and mode of transmission of Eastern Equine Encephalomyelitis. This discovery, and the research he conducted with Rudolf W. Glaser, quickly established him as an outstanding biological researcher. However, a change in leadership at the Rockefeller Institute resulted in research priorities being given to the disciplines of general physiology, physical chemistry, and nutrition. This shift in the research agenda away from the biological sciences precluded career advancement at the Rockefeller Institute for post-doctoral fellows like Lacaillade. It was the height of the Great Depression, and even biologists with terminal doctoral degrees found it difficult to find positions. In 1935, Lacaillade accepted a position as an assistant in biology at St. John's College in Brooklyn, New York. Although a small single-gender college for men, the Department of Biology there under Dr. Andrew I. Dawson had an impressive record of research achievements. Lacaillade remained at this institution for the remainder of his career until his retirement in 1970. He eventually became Distinguished Professor of Biology, Chair of the Department of Biology, and the recipient of numerous awards and recognitions. Lacaillade quickly developed a reputation as an outstanding teacher, mentor, and scientist. He taught introductory courses in biology as well as advanced ones in parasitology and entomology. He preceptored graduate students and guided their dissertation research. Above all else, he was a superb mentor who provided sage advice to pre-professional students planning careers in medicine and dentistry. Lacaillade effortlessly adapted to the transformation of St. John's College, with an annual enrollment of some 600, to St. John's University, with an average annual student census of 20,000. He also oversaw the geographic relocation of his department from Brooklyn to the then new campus in Jamaica, New York in 1955. He proved to be a stabilizing presence during the faculty strike of 1966 and its aftermath, which included a reorganization of the university. Throughout his life, Lacaillade was admired as a man of letters. His interests spanned art, literature, opera, and the theater. He had a passionate interest in English literature, about which he wrote, and was proud of his collection of first editions of English writers. Charles William Lacaillade was an eminent success as a research biologist early in his career. However, his greater successes came later as an outstanding educator and mentor. As such, he had a positive and lasting influence on the lives and careers of many students and colleagues. He passed away on 17 September 1978 in Danvers, Massachusetts.

  2. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been used extensively by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that experience. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  3. OptFuels: Fuel treatment optimization

    Treesearch

    Greg Jones

    2011-01-01

    Scientists at the USDA Forest Service, Rocky Mountain Research Station, in Missoula, MT, in collaboration with scientists at the University of Montana, are developing a tool to help forest managers prioritize forest fuel reduction treatments. Although several computer models analyze fuels and fire behavior, stand-level effects of fuel treatments, and priority planning...

  4. Four Argonne National Laboratory scientists receive Early Career Research

    Science.gov Websites

    Four Argonne National Laboratory scientists receive Early Career Research Program … economic impact of cascading shortages. He will also seek to enable scaling on high-performance computing

  5. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and… High-Frequency Acoustic System Payoff: Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control

  6. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
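
    The three-stage shape of the workflow described above (build a structural model, enrich it with parts data, convert it to SBML) can be sketched as plain function composition. The functions and the XML skeleton below are illustrative stand-ins, not the actual SBOLDesigner, Virtual Parts Repository, or iBioSim APIs, and the output is only SBML-shaped, not a complete valid SBML model.

```python
# Hedged sketch of a design-to-model pipeline: each stage is a plain function,
# and the final stage emits a minimal SBML-like XML skeleton.
import xml.etree.ElementTree as ET

def structural_design(parts):
    """Stage 1: a structural model here is just an ordered list of parts."""
    return {"parts": list(parts)}

def enrich(design):
    """Stage 2: attach a placeholder production reaction per part."""
    design["reactions"] = [f"{p}_production" for p in design["parts"]]
    return design

def to_sbml(design):
    """Stage 3: serialize the enriched design as SBML-shaped XML."""
    sbml = ET.Element("sbml",
                      xmlns="http://www.sbml.org/sbml/level3/version1/core",
                      level="3", version="1")
    model = ET.SubElement(sbml, "model", id="genetic_design")
    species = ET.SubElement(model, "listOfSpecies")
    for p in design["parts"]:
        ET.SubElement(species, "species", id=p)
    reactions = ET.SubElement(model, "listOfReactions")
    for r in design["reactions"]:
        ET.SubElement(reactions, "reaction", id=r)
    return ET.tostring(sbml, encoding="unicode")

xml_out = to_sbml(enrich(structural_design(["promoter", "rbs", "cds"])))
```

    The point of the staging is the one the abstract makes: the biologist supplies only the structural design, and the modeling detail is filled in mechanically downstream.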

  7. U.S. Geological Survey programs in Florida, 1999

    USGS Publications Warehouse

    ,

    1999-01-01

    The safety, health, and economic well-being of Florida's citizens are important to the U.S. Geological Survey (USGS), which is involved in water-related, geologic, biological, land use, and mapping issues in many parts of the State. The USGS office in Tallahassee acts as the liaison for all studies conducted by USGS scientists in Florida. Water resources activities are conducted not only from the office in Tallahassee, but also from offices in Miami, Tampa, and Altamonte Springs (Orlando). Scientists in these offices investigate surface water, ground water and water quality in Florida, working in cooperation with other Federal, State and local agencies and organizations. The USGS Center for Coastal Geology and Regional Marine Studies was established in St. Petersburg in 1988, in cooperation with the University of South Florida. The Center conducts a wide variety of research on mineral resources and on coastal and regional marine problems, including coastal erosion, climate change, wetlands deterioration, and coastal pollution. A USGS mapping office is located in St. Petersburg. Also, the Earth Science Information Center (ESIC) in Tallahassee provides USGS information to customers and directs inquiries to the appropriate USGS office or State agency on earth science topics, particularly those related to cartography, geography, aerial photography, and digital data. Biologists at the USGS Florida Caribbean Science Center, located in Gainesville, conduct biological and ecosystem studies in Florida, Puerto Rico, and the Virgin Islands.

  8. Founding editorial--forensics and TheScientificWorld.

    PubMed

    Rowe, W

    2001-10-30

    At the beginning of a new millennium it seems a good idea to stop for a moment and take stock of the current state of forensic science. As a field of scientific research and scientific application, forensic science is a little more than a century old. Forensic science may be said to have begun in 1887 with the simultaneous publication of A. Conan Doyle's A Study in Scarlet and Hans Gross's Handbuch für Untersuchungsrichter. Conan Doyle's novel introduced to the world the character of Sherlock Holmes, whose literary career would popularize the use of physical evidence in criminal investigations. Gross's manual for examining magistrates suggests ways in which the expertise of chemists, biologists, geologists, and other natural scientists could contribute to investigations. Gross's book was translated into a number of languages and went through various updated editions during the course of the century. The intervening century saw the development and application of fingerprinting, firearm and tool mark identification, forensic chemistry, forensic biology, forensic toxicology, forensic odontology, forensic pathology, and forensic engineering. Increasingly, the judicial systems of the industrial nations of the world have come to rely upon the expertise of scientists in a variety of disciplines. In most advanced countries, virtually all criminal prosecutions now involve the presentation of scientific testimony. This has had the beneficial effect of diminishing the reliance of courts on eyewitness testimony and defendant confessions.

  9. Gee Fu: a sequence version and web-services database tool for genomic assembly, genome feature and NGS data.

    PubMed

    Ramirez-Gonzalez, Ricardo; Caccamo, Mario; MacLean, Daniel

    2011-10-01

    Scientists now use high-throughput sequencing technologies and short-read assembly methods to create draft genome assemblies in just days. Assembly tools and workflow-management environments make it easy for a non-specialist to implement complicated pipelines that produce genome assemblies and annotations very quickly. Such accessibility results in a proliferation of assemblies and associated files, often for many organisms. These assemblies are used as a working reference by many different workers, from bioinformaticians doing gene prediction to bench scientists designing primers for PCR. Here we describe Gee Fu, a database tool for genomic assembly and feature data, including next-generation sequence alignments. Gee Fu is an instance of a Ruby on Rails web application on a feature database that provides web and console interfaces for input, visualization of feature data via AnnoJ, access to data through a web-service interface, an API for direct data access by Ruby scripts, and access to feature data stored in BAM files. Gee Fu provides a platform for storing and sharing different versions of an assembly and associated features that can be accessed and updated by bench biologists and bioinformaticians in ways that are easy and useful for each. http://tinyurl.com/geefu dan.maclean@tsl.ac.uk.

  10. KSC-2012-5686

    NASA Image and Video Library

    2012-10-06

    CAPE CANAVERAL, Fla. -- News and social media representatives participate in a space station and mission science briefing in NASA Kennedy Space Center's Press Site auditorium in Florida. On the dais from left are Michael Curie, NASA Public Affairs, Julie Robinson, program scientist for International Space Station at NASA's Johnson Space Center, Timothy Yeatman, interim chief scientist at the Center for the Advancement of Science in Space, Sheila Nielsen-Preiss, cell biologist at Montana State University, and Scott Smith, NASA nutritionist at NASA's Johnson Space Center. The briefing provided media with an overview of the experiments and payloads scheduled for launch on NASA's first Commercial Resupply Services, or CRS-1, mission to the International Space Station. Space Exploration Technologies Corp., or SpaceX, built both the mission's Falcon 9 rocket and Dragon capsule. Launch is scheduled for 8:35 p.m. EDT on Oct. 7 from Space Launch Complex 40 on Cape Canaveral Air Force Station. SpaceX CRS-1 is an important step toward making America’s microgravity research program self-sufficient by providing a way to deliver and return significant amounts of cargo, including science experiments, to and from the orbiting laboratory. NASA has contracted for 12 commercial resupply flights from SpaceX and eight from the Orbital Sciences Corp. For more information, visit http://www.nasa.gov/mission_pages/station/living/launch/index.html. Photo credit: NASA/Kim Shiflett

  11. Africanizing Science in Post-colonial Kenya: Long-Term Field Research in the Amboseli Ecosystem, 1963-1989.

    PubMed

    Lewis, Amanda E

    2017-11-08

    Following Kenya's independence in 1963, scientists converged on an ecologically sensitive area in southern Kenya on the northern slope of Mt. Kilimanjaro called Amboseli. This region is the homeland of the Ilkisongo Maasai who grazed this ecosystem along with the wildlife of interest to the scientists. Biologists saw opportunities to study this complex community, an environment rich in biological diversity. The Amboseli landscape proved to be fertile ground for testing new methods and lines of inquiry in the biological sciences that were generalizable and important for shaping natural resource management policies in Kenya. However, the local community was in the midst of its own transformation from a primarily transhumant lifestyle to a largely sedentary one, a complex political situation between local and national authorities, and the introduction of a newly educated generation. This article examines the intersection of African history and field science through the post-colonial Africanization of Kenyan politics, the broadening of scientific practices in Amboseli in previously Western-occupied spaces to include Kenyan participants, and an increasing awareness of the role of local African contexts in the results, methods, and implications of biological research. "Africanization" as an idea in the history of science is multifaceted, encompassing not just the participation of Africans in the scientific process but also an examination of the larger political and social context at both the local and national levels.

  12. A toolbox and record for scientific models

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.
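    The test-and-revise loop described in this abstract can be made concrete. Below is a minimal Python sketch of testing candidate models against data while keeping a development record; the function names, the tolerance criterion, and the toy models are illustrative assumptions, not details from the paper.

```python
def develop(models, data, tolerance=1e-3):
    """Test candidate models against data, keeping a development record.

    Each model maps x -> prediction.  The per-model record of failing
    points is the kind of artifact the paper argues a design tool
    should retain; the names and the tolerance test are illustrative.
    """
    record = []
    for name, model in models:
        failures = [(x, y) for x, y in data if abs(model(x) - y) > tolerance]
        record.append({"model": name, "failures": failures, "ok": not failures})
    accepted = [r["model"] for r in record if r["ok"]]
    return accepted, record

# A scientist examines alternative models against the same observations;
# the record shows which model failed on which data points.
candidates = [("linear", lambda x: 2 * x), ("quadratic", lambda x: x * x)]
observations = [(1, 1), (2, 4), (3, 9)]
accepted, record = develop(candidates, observations)
```

    A fuller tool would also suggest which change could cure each failure; here the record only localizes the negative test results, which is the first step the paper calls for.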

  13. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    ERIC Educational Resources Information Center

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  14. Big data computing: Building a vision for ARS information management

    USDA-ARS?s Scientific Manuscript database

    Improvements are needed within the ARS to increase scientific capacity and keep pace with new developments in computer technologies that support data acquisition and analysis. Enhancements in computing power and IT infrastructure are needed to provide scientists better access to high performance com...

  15. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  16. Molluscan Evolutionary Genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  17. biochem4j: Integrated and extensible biochemical knowledge through graph databases.

    PubMed

    Swainston, Neil; Batista-Navarro, Riza; Carbonell, Pablo; Dobson, Paul D; Dunstan, Mark; Jervis, Adrian J; Vinaixa, Maria; Williams, Alan R; Ananiadou, Sophia; Faulon, Jean-Loup; Mendes, Pedro; Kell, Douglas B; Scrutton, Nigel S; Breitling, Rainer

    2017-01-01

    Biologists and biochemists have at their disposal a number of excellent, publicly available data resources such as UniProt, KEGG, and NCBI Taxonomy, which catalogue biological entities. Despite the usefulness of these resources, they remain fundamentally unconnected. While links may appear between entries across these databases, users are typically only able to follow such links by manual browsing or through specialised workflows. Although many of the resources provide web-service interfaces for computational access, performing federated queries across databases remains a non-trivial but essential activity in interdisciplinary systems and synthetic biology programmes. What is needed are integrated repositories to catalogue both biological entities and, crucially, the relationships between them. Such a resource should be extensible, such that newly discovered relationships, for example those between novel, synthetic enzymes and non-natural products, can be added over time. With the introduction of graph databases, the barrier to the rapid generation, extension and querying of such a resource has been lowered considerably. With a particular focus on metabolic engineering as an illustrative application domain, biochem4j, freely available at http://biochem4j.org, is introduced to provide an integrated, queryable database that warehouses chemical, reaction, enzyme and taxonomic data from a range of reliable resources. The biochem4j framework establishes a starting point for the flexible integration and exploitation of an ever-wider range of biological data sources, from public databases to laboratory-specific experimental datasets, for the benefit of systems biologists, biosystems engineers and the wider community of molecular biologists and biological chemists.
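    The kind of typed-relationship query such a graph database enables can be illustrated with a tiny in-memory stand-in (real biochem4j queries would run in Cypher against Neo4j; the node names and relationship types below are hypothetical examples, not the actual biochem4j schema).

```python
# Tiny in-memory stand-in for the typed graph a resource like biochem4j
# stores in Neo4j.  Node names and relationship types are hypothetical.
edges = [
    ("EC 2.7.1.1", "CATALYSES", "R00299"),       # enzyme -> reaction
    ("R00299", "HAS_SUBSTRATE", "D-glucose"),    # reaction -> chemical
    ("R00299", "HAS_PRODUCT", "G6P"),
    ("EC 2.7.1.1", "FOUND_IN", "S. cerevisiae"), # enzyme -> taxon
]

def neighbours(node, rel):
    """Follow one relationship type out of a node."""
    return [dst for src, r, dst in edges if src == node and r == rel]

def enzymes_for_chemical(chem):
    """Two-hop traversal: chemical <- reaction <- enzyme."""
    return [
        enz
        for enz, r, rxn in edges
        if r == "CATALYSES"
        and chem in neighbours(rxn, "HAS_SUBSTRATE") + neighbours(rxn, "HAS_PRODUCT")
    ]
```

    In Cypher the same traversal is a one-line pattern match; the point is that an explicit, typed relationship, rather than manual browsing across unconnected databases, carries the join.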

  18. biochem4j: Integrated and extensible biochemical knowledge through graph databases

    PubMed Central

    Batista-Navarro, Riza; Dunstan, Mark; Jervis, Adrian J.; Vinaixa, Maria; Ananiadou, Sophia; Faulon, Jean-Loup; Kell, Douglas B.

    2017-01-01

    Biologists and biochemists have at their disposal a number of excellent, publicly available data resources such as UniProt, KEGG, and NCBI Taxonomy, which catalogue biological entities. Despite the usefulness of these resources, they remain fundamentally unconnected. While links may appear between entries across these databases, users are typically only able to follow such links by manual browsing or through specialised workflows. Although many of the resources provide web-service interfaces for computational access, performing federated queries across databases remains a non-trivial but essential activity in interdisciplinary systems and synthetic biology programmes. What is needed are integrated repositories to catalogue both biological entities and–crucially–the relationships between them. Such a resource should be extensible, such that newly discovered relationships–for example, those between novel, synthetic enzymes and non-natural products–can be added over time. With the introduction of graph databases, the barrier to the rapid generation, extension and querying of such a resource has been lowered considerably. With a particular focus on metabolic engineering as an illustrative application domain, biochem4j, freely available at http://biochem4j.org, is introduced to provide an integrated, queryable database that warehouses chemical, reaction, enzyme and taxonomic data from a range of reliable resources. The biochem4j framework establishes a starting point for the flexible integration and exploitation of an ever-wider range of biological data sources, from public databases to laboratory-specific experimental datasets, for the benefit of systems biologists, biosystems engineers and the wider community of molecular biologists and biological chemists. PMID:28708831

  19. High-Performance Computing Unlocks Innovation at NREL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Need to fly around a wind farm? Or step inside a molecule? NREL scientists use a super powerful (and highly energy-efficient) computer to visualize and solve big problems in renewable energy research.

  20. Mathematical computer programs: A compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer programs, routines, and subroutines for aiding engineers, scientists, and mathematicians in direct problem solving are presented. Also included is a group of items that affords the same users greater flexibility in the use of software.

  1. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    NASA Astrophysics Data System (ADS)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations can be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
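    The run-time autotuning component can be sketched as a simple search over candidate configurations, timing each and keeping the fastest. This is a minimal illustration of the idea in a Python harness; the `run`/`configs` interface is a hypothetical stand-in for CUSH's actual autotuner, which is more general.

```python
import time

def autotune(run, configs, repeats=3):
    """Return the configuration with the lowest median wall-clock time.

    `run(cfg)` executes the kernel once for configuration `cfg`;
    `configs` lists the candidates (e.g. GPU thread-block sizes).
    Both names are illustrative stand-ins, not the CUSH API.
    """
    timings = {}
    for cfg in configs:
        samples = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            run(cfg)
            samples.append(time.perf_counter() - t0)
        samples.sort()
        timings[cfg] = samples[len(samples) // 2]  # median damps timing noise
    best = min(timings, key=timings.get)
    return best, timings
```

    A caller would hand `autotune` a closure that launches its kernel with a given configuration, then use the returned `best` configuration for the production run.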

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  3. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users. The first is the scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  4. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    ERIC Educational Resources Information Center

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  5. Journal news

    USGS Publications Warehouse

    Conroy, M.J.; Samuel, M.D.; White, Joanne C.

    1995-01-01

    Statistical power (and conversely, Type II error) is often ignored by biologists. Power is important to consider in the design of studies, to ensure that sufficient resources are allocated to address a hypothesis under examination. Determining appropriate sample size when designing experiments or calculating power for a statistical test requires an investigator to consider the importance of making incorrect conclusions about the experimental hypothesis and the biological importance of the alternative hypothesis (or the biological effect size researchers are attempting to measure). Poorly designed studies frequently provide results that are at best equivocal, and do little to advance science or assist in decision making. Completed studies that fail to reject Ho should consider power and the related probability of a Type II error in the interpretation of results, particularly when implicit or explicit acceptance of Ho is used to support a biological hypothesis or management decision. Investigators must consider the biological question they wish to answer (Tacha et al. 1982) and assess power on the basis of biologically significant differences (Taylor and Gerrodette 1993). Power calculations are somewhat subjective, because the author must specify either f or the minimum difference that is biologically important. Biologists may have different ideas about what values are appropriate. While determining biological significance is of central importance in power analysis, it is also an issue of importance in wildlife science. Procedures, references, and computer software to compute power are accessible; therefore, authors should consider power. We welcome comments or suggestions on this subject.
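    As a concrete illustration of the power calculations the authors advocate, the following sketch computes the approximate power of a two-sided one-sample z-test and searches for the sample size that reaches a target power. It uses the textbook normal approximation; the function names are our own, and real study designs may call for exact or simulation-based methods.

```python
from math import sqrt
from statistics import NormalDist

def power_one_sample_z(effect_size, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z-test.

    `effect_size` is the standardized difference (true mean shift / sigma).
    This is the usual normal approximation, not an exact calculation.
    """
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    shift = effect_size * sqrt(n)
    # probability the test statistic falls beyond either critical value
    return (1 - z.cdf(z_crit - shift)) + z.cdf(-z_crit - shift)

def sample_size_for_power(effect_size, target=0.8, alpha=0.05):
    """Smallest n whose power reaches the target (brute-force search)."""
    n = 2
    while power_one_sample_z(effect_size, n, alpha) < target:
        n += 1
    return n
```

    For a medium effect of 0.5 sigma at alpha = 0.05, the search lands near the textbook answer of roughly 32 observations for 80% power, illustrating how sample size, effect size, and power trade off at design time.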

  6. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year co-operative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates and visiting university faculty. RIACS is chartered to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  7. Computational chemistry at Janssen

    NASA Astrophysics Data System (ADS)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2017-03-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  8. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  9. Computed microtomography and X-ray fluorescence analysis for comprehensive analysis of structural changes in bone.

    PubMed

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor

    2013-01-01

    This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computer microtomography and X-ray fluorescence analysis. We present algorithms used for the reconstruction of tomographic images which allow working with the high-noise projections that are typical given the nature of the samples. Reptiles, owing to their ruggedness, small size, membership in the amniotes and a number of other valuable features, are an attractive model organism for long-term orbital experiments on unmanned spacecraft. Issues of possible changes in their bone tissue under the influence of spaceflight are the subject of discussions between biologists from different laboratories around the world.

  10. Local Alignment Tool Based on Hadoop Framework and GPU Architecture

    PubMed Central

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next generation sequencing technologies, such as Solexa, more and more data have been discovered and published. To analyze such huge data, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences, but for big biological data it is hard to rely on a single GPU. We therefore implement a distributed BLASTP by combining Hadoop and multiple GPUs. The experimental results show that the proposed method improves on the performance of BLASTP on a single GPU, and that it also achieves high availability and fault tolerance. PMID:24955362

  11. Local alignment tool based on Hadoop framework and GPU architecture.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next generation sequencing technologies, such as Solexa, more and more data have been discovered and published. To analyze such huge data, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences, but for big biological data it is hard to rely on a single GPU. We therefore implement a distributed BLASTP by combining Hadoop and multiple GPUs. The experimental results show that the proposed method improves on the performance of BLASTP on a single GPU, and that it also achieves high availability and fault tolerance.

  12. Exploiting graphics processing units for computational biology and bioinformatics.

    PubMed

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
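    The all-pairs distance computation used as the article's running example is straightforward on a CPU, which also shows why it maps so well to a GPU: every (i, j) pair is independent. A plain Python reference version (a sketch of the general procedure, not the authors' CUDA code) looks like:

```python
import math

def all_pairs_distances(X):
    """Euclidean distance between every pair of rows of X (list of lists).

    Plain CPU reference for the all-pairs procedure.  A CUDA kernel
    would compute one (i, j) entry per thread; here we exploit the
    symmetry of the matrix and fill both halves in one pass.
    """
    n = len(X)
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = math.sqrt(sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
            D[i][j] = D[j][i] = d
    return D
```

    The nested loops make the O(n^2) cost explicit; it is precisely this independence of the pairwise work that coalesced reads and the GPU memory hierarchy exploit.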

  13. The Study Team for Early Life Asthma Research (STELAR) consortium ‘Asthma e-lab’: team science bringing data, methods and investigators together

    PubMed Central

    Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela

    2015-01-01

    We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. The e-Lab facilitates the iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing collective thought behind the interpretations of findings. PMID:25805205

  14. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    ERIC Educational Resources Information Center

    Scogin, Stephen C.

    2016-01-01

    "PlantingScience" is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific…

  15. Tessera: Open source software for accelerated data science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.

  16. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste

    2016-08-03

    Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy,” (CDD Project) led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with the International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  17. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  18. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-04

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist.

  19. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    ERIC Educational Resources Information Center

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society, human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporating HCI into computer science (CS) education, as well as related issues such as the perceived gap between the interests of the HCI community and the needs of CS…

  20. Eckert, Wallace John (1902-71)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Computer scientist and astronomer. Born in Pittsburgh, PA, Eckert was a pioneer of the use of IBM punched card equipment for astronomical calculations. As director of the US Nautical Almanac Office he introduced computer methods to calculate and print tables instead of relying on human 'computers'. When, later, he became director of the Watson Scientific Computing Laboratory at Columbia Universit...
