Science.gov

Sample records for computational biology research

  1. The applications of computers in biological research

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer

    1988-01-01

    Research in many fields could not be done without computers. There is often a great deal of technical data, even in the biological fields, that needs to be analyzed. Analyzing these data previously absorbed much of every researcher's time. Now, owing to steady advances in computer technology, biological researchers are able to make remarkable progress in their work without being burdened by tedious and difficult tasks such as the many mathematical calculations involved in today's research and health care.

  2. Structural biology computing: Lessons for the biomedical research sciences.

    PubMed

    Morin, Andrew; Sliz, Piotr

    2013-11-01

    The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. PMID:23828134

  3. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    ERIC Educational Resources Information Center

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  4. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus

    PubMed Central

    Karp, Peter D.; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida). PMID:26097686
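
    As a minimal illustration of one of the approaches listed above, computational modeling of the spread of the virus, the sketch below integrates a basic SIR (susceptible-infected-recovered) epidemic model with forward Euler steps. The parameter values and population fractions are placeholders chosen only for illustration and are not fitted to Ebola data.

    ```python
    import numpy as np

    def sir_epidemic(beta, gamma, s0, i0, days, dt=0.1):
        """Integrate a basic SIR model with forward Euler steps."""
        steps = int(days / dt)
        s, i, r = s0, i0, 0.0
        out = np.empty((steps, 3))
        for t in range(steps):
            new_inf = beta * s * i * dt   # new infections this step
            new_rec = gamma * i * dt      # new recoveries this step
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            out[t] = (s, i, r)
        return out

    # illustrative parameters only; not calibrated to any real outbreak
    traj = sir_epidemic(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=180)
    print("peak infected fraction: %.3f" % traj[:, 1].max())
    ```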

  5. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus

    PubMed Central

    Karp, Peter D.; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology [ISMB] 2016, Orlando, Florida).

  6. Computational Systems Chemical Biology

    PubMed Central

    Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander

    2013-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole-body physiologically based pharmacokinetics (PBPK) continue to evolve. We have termed this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB; Oprea et al., 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules and, where applicable, mutants and variants of those proteins. There remains an unmet need to develop an integrated in silico pharmacology / systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology. PMID:20838980
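
    As a toy illustration of the kind of pharmacokinetic simulation an SCB continuum would embed, the sketch below integrates a one-compartment model with first-order absorption and elimination. All parameter values are hypothetical and chosen only to show the mechanics; real PBPK models are far more detailed.

    ```python
    import numpy as np

    def one_compartment_pk(dose_mg, ka, ke, vd_l, hours, dt=0.05):
        """First-order absorption/elimination; returns plasma concentration over time."""
        steps = int(hours / dt)
        gut, central = dose_mg, 0.0
        conc = np.empty(steps)
        for t in range(steps):
            absorbed = ka * gut * dt          # drug moving from gut to plasma
            eliminated = ke * central * dt    # drug cleared from plasma
            gut -= absorbed
            central += absorbed - eliminated
            conc[t] = central / vd_l          # concentration = amount / volume
        return conc

    # hypothetical small molecule; parameters are illustrative only
    c = one_compartment_pk(dose_mg=100, ka=1.0, ke=0.2, vd_l=40, hours=24)
    print("Cmax ~ %.2f mg/L at t ~ %.1f h" % (c.max(), c.argmax() * 0.05))
    ```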

  7. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    SciTech Connect

    DOE Office of Science, Biological and Environmental Research Program Office

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  8. Computational Systems Biology

    SciTech Connect

    McDermott, Jason E.; Samudrala, Ram; Bumgarner, Roger E.; Montogomery, Kristina; Ireton, Renee

    2009-05-01

    Computational systems biology is the term that we use to describe computational methods to identify, infer, model, and store relationships between the molecules, pathways, and cells (“systems”) involved in a living organism. Based on this definition, the field of computational systems biology has been in existence for some time. However, the recent confluence of high-throughput methodology for biological data gathering, genome-scale sequencing and computational processing power has driven a reinvention and expansion of this field. The expansions include not only modeling of small metabolic (Ishii 2004; Ekins 2006; Lafaye 2005) and signaling systems (Stevenson-Paulik 2006; Lafaye 2005) but also modeling of the relationships between biological components in very large systems, including whole cells and organisms (Ideker 2001; Pe'er 2001; Pilpel 2001; Ideker 2002; Kelley 2003; Shannon 2003; Ideker 2004; Schadt 2003; Schadt 2006; McDermott 2002; McDermott 2005). Generally, these models provide an overview of one or more aspects of these systems and leave the determination of details to experimentalists focused on smaller subsystems. The promise of such approaches is that they will elucidate patterns, relationships and general features that are not evident from examining specific components or subsystems. These predictions are either interesting in and of themselves (for example, the identification of an evolutionary pattern), or are interesting and valuable to researchers working on a particular problem (for example, highlighting a previously unknown functional pathway). Two developments have brought the field of computational systems biology to the forefront. One is the advent of high-throughput methods that have generated large amounts of information about particular systems in the form of genetic studies and gene expression analyses (both protein and…
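
    A minimal sketch of the network-level view described above, assuming Python with the networkx package: relationships between components are stored as a graph, and simple graph statistics (degree, shortest paths) stand in for the far richer analyses used in practice. The gene names and interactions are placeholders, not real data.

    ```python
    import networkx as nx

    # hypothetical interactions; gene names are placeholders
    interactions = [
        ("geneA", "geneB"), ("geneB", "geneC"),
        ("geneC", "geneD"), ("geneB", "geneD"),
        ("geneD", "geneE"),
    ]

    g = nx.Graph()
    g.add_edges_from(interactions)

    # rank nodes by degree as a crude proxy for network importance
    hubs = sorted(g.degree, key=lambda kv: kv[1], reverse=True)
    print("candidate hub nodes:", hubs[:3])

    # shortest paths suggest possible indirect relationships between components
    print(nx.shortest_path(g, "geneA", "geneE"))
    ```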

  9. Computer literacy for life sciences: helping the digital-era biology undergraduates face today's research.

    PubMed

    Smolinski, Tomasz G

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009. PMID:20810969

  10. Calibrated Peer Review for Computer-Assisted Learning of Biological Research Competencies

    ERIC Educational Resources Information Center

    Clase, Kari L.; Gundlach, Ellen; Pelaez, Nancy J.

    2010-01-01

    Recently, both science and technology faculty have been recognizing biological research competencies that are valued but rarely assessed. Some of these valued learning outcomes include scientific methods and thinking, critical assessment of primary papers, quantitative reasoning, communication, and putting biological research into a historical and…

  11. Systems biology in psychiatric research: from complex data sets over wiring diagrams to computer simulations.

    PubMed

    Tretter, Felix; Gebicke-Haerter, Peter J

    2012-01-01

    The classification of psychiatric disorders has always been a problem in clinical settings. The present debate about the major systems in clinical practice, DSM-IV and ICD-10, has resulted in attempts to improve and replace those schemes by some that include more endophenotypic and molecular features. However, these disorders not only require more precise diagnostic tools, but also have to be viewed more extensively in their dynamic behaviors, which require more precise data sets related to their origins and developments. This enormous challenge in brain research has to be approached on different levels of the biological system by new methods, including improvements in electroencephalography, brain imaging, and molecular biology. All these methods entail accumulations of large data sets that become more and more difficult to interpret. In particular, on the molecular level, there is an apparent need to use highly sophisticated computer programs to tackle these problems. Evidently, only interdisciplinary work among mathematicians, physicists, biologists, and clinicians can further improve our understanding of complex diseases of the brain. PMID:22231839

  12. RESEARCH STRATEGIES FOR THE APPLICATION OF THE TECHNIQUES OF COMPUTATIONAL BIOLOGICAL CHEMISTRY TO ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    On October 25 and 26, 1984, the U.S. EPA sponsored a workshop to consider the potential applications of the techniques of computational biological chemistry to problems in environmental health. Eleven extramural scientists from the various related disciplines and a similar number...

  13. Management and Analysis of Biological and Clinical Data: How Computer Science May Support Biomedical and Clinical Research

    NASA Astrophysics Data System (ADS)

    Veltri, Pierangelo

    The use of computer-based solutions for data management in biology and clinical science has helped to improve quality of life and to obtain research results in a shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from patients). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications draw on experience with real cases at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.

  14. InteractoMIX: a suite of computational tools to exploit interactomes in biological and clinical research.

    PubMed

    Poglayen, Daniel; Marín-López, Manuel Alejandro; Bonet, Jaume; Fornes, Oriol; Garcia-Garcia, Javier; Planas-Iglesias, Joan; Segura, Joan; Oliva, Baldo; Fernandez-Fuentes, Narcis

    2016-06-15

    Virtually all the biological processes that occur inside or outside cells are mediated by protein-protein interactions (PPIs). Hence, the charting and description of the PPI network, initially for whole organisms (the interactome) but more recently for specific tissues, is essential to fully understand cellular processes both in health and disease. The study of PPIs is also at the heart of renewed efforts in the medical and biotechnological arena in the quest for new therapeutic targets and drugs. Here, we present a mini review of 11 computational tools and resources developed by us to address different aspects of PPIs: from the interactome level to their atomic 3D structural details. We provide details on each specific resource, its aims and purpose, and compare it with equivalent tools in the literature. All the tools are presented in a centralized, one-stop web site: InteractoMIX (http://interactomix.com). PMID:27284060

  15. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  16. Synthetic biology: insights into biological computation.

    PubMed

    Manzoni, Romilde; Urrios, Arturo; Velazquez-Garcia, Silvia; de Nadal, Eulàlia; Posas, Francesc

    2016-04-18

    Organisms have evolved a broad array of complex signaling mechanisms that allow them to survive in a wide range of environmental conditions. They are able to sense external inputs and produce an output response by computing the information. Synthetic biology attempts to rationally engineer biological systems in order to perform desired functions. Our increasing understanding of biological systems guides this rational design, while the huge background in electronics for building circuits defines the methodology. In this context, biocomputation is the branch of synthetic biology aimed at implementing artificial computational devices using engineered biological motifs as building blocks. Biocomputational devices are defined as biological systems that are able to integrate inputs and return outputs following pre-determined rules. Over the last decade the number of available synthetic engineered devices has increased exponentially; simple and complex circuits have been built in bacteria, yeast and mammalian cells. These devices can manage and store information, take decisions based on past and present inputs, and even convert a transient signal into a sustained response. The field is growing rapidly, and it is becoming ever easier to implement more complex biological functions. This is mainly due to advances in in vitro DNA synthesis, new genome editing tools, novel molecular cloning techniques, continuously growing part libraries as well as other technological advances. As a result, digital computation can now be engineered and implemented in biological systems. Simple logic gates can be implemented and connected to perform novel desired functions or to better understand and redesign biological processes. Synthetic biological digital circuits could lead to new therapeutic approaches, as well as new and efficient ways to produce complex molecules such as antibiotics, bioplastics or biofuels. Biological computation not only provides possible biomedical and…
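
    As a toy example of a biocomputational device that integrates inputs and returns outputs following predetermined rules, the sketch below models a transcriptional AND gate with Hill-type activation: the reporter is produced only when both inducers are present. The parameters are illustrative and not taken from any specific engineered circuit.

    ```python
    import numpy as np

    def hill(x, k=1.0, n=2):
        """Hill activation: fraction of promoter activity for inducer level x."""
        return x**n / (k**n + x**n)

    def and_gate_output(inducer_a, inducer_b):
        """Toy transcriptional AND gate: reporter needs both inputs to be high."""
        return hill(inducer_a) * hill(inducer_b)

    # truth-table-like scan over low/high levels of each inducer
    for a in (0.0, 5.0):
        for b in (0.0, 5.0):
            print(f"A={a:>3} B={b:>3} -> reporter {and_gate_output(a, b):.2f}")
    ```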

  17. Computational Biology: A Strategic Initiative LDRD

    SciTech Connect

    Barksy, D; Colvin, M

    2002-02-07

    The goal of this Strategic Initiative LDRD project was to establish at LLNL a new core capability in computational biology, combining laboratory strengths in high performance computing, molecular biology, and computational chemistry and physics. As described in this report, this project has been very successful in achieving this goal. This success is demonstrated by the large number of refereed publications, invited talks, and follow-on research grants that have resulted from this project. Additionally, this project has helped build connections to internal and external collaborators and funding agencies that will be critical to the long-term vitality of LLNL programs in computational biology. Most importantly, this project has helped establish on-going research groups in the Biology and Biotechnology Research Program, the Physics and Applied Technology Directorate, and the Computation Directorate. These groups include three laboratory staff members originally hired as post-doctoral researchers for this strategic initiative.

  18. Space biology research development

    NASA Technical Reports Server (NTRS)

    Bonting, Sjoerd L.

    1993-01-01

    The purpose of the Search for Extraterrestrial Intelligence (SETI) Institute is to conduct and promote research-related activities regarding the search for extraterrestrial life, particularly intelligent life. Such research encompasses the broad discipline of 'Life in the Universe', including all scientific and technological aspects of astronomy and the planetary sciences, chemical evolution, the origin of life, biological evolution, and cultural evolution. The primary purpose was to provide funding for the Principal Investigator to collaborate with the personnel of the SETI Institute and the NASA Ames Research Center in order to plan and develop space biology research on and in connection with Space Station Freedom; to promote cooperation with the international partners in the space station; to conduct a study on the use of biosensors in space biology research and life support system operation; and to promote space biology research through the initiation of an annual publication, 'Advances in Space Biology and Medicine'.

  19. Biology and medical research at the exascale.

    SciTech Connect

    Wolf, L.; Pieper, G. W.

    2010-01-01

    Advances in computational hardware and algorithms that have transformed areas of physics and engineering have recently brought similar benefits to biology and biomedical research. Biological sciences are undergoing a revolution. High-performance computing has accelerated the transition from hypothesis-driven to design-driven research at all scales, and computational simulation of biological systems is now driving the direction of biological experimentation and the generation of insights.

  20. MODELING HOST-PATHOGEN INTERACTIONS: COMPUTATIONAL BIOLOGY AND BIOINFORMATICS FOR INFECTIOUS DISEASE RESEARCH (Session introduction)

    SciTech Connect

    McDermott, Jason E.; Braun, Pascal; Bonneau, Richard A.; Hyduke, Daniel R.

    2011-12-01

    Pathogenic infections are a major cause of both human disease and loss of crop yields and animal stocks and thus cause immense damage to the worldwide economy. The significance of infectious diseases is expected to increase in an ever more connected, warming world, in which new viral, bacterial and fungal pathogens can find novel hosts and ecological niches. At the same time, the complex and sophisticated mechanisms by which diverse pathogenic agents evade defense mechanisms and subvert their hosts' networks to suit their lifestyle needs are still very incompletely understood, especially from a systems perspective [1]. Thus, understanding host-pathogen interactions is both an important and a scientifically fascinating topic. Recently, technology has offered the opportunity to investigate host-pathogen interactions on a level of detail and scope that offers immense computational and analytical possibilities. Genome sequencing was pioneered on some of these pathogens, and the number of strains and variants of pathogens sequenced to date vastly outnumbers the number of host genomes available. At the same time, for both plant and human hosts, more and more data on population-level genomic variation are becoming available and offer a rich field for analysis of the genetic interactions between host and pathogen.

  1. Encouraging Student Biological Research.

    ERIC Educational Resources Information Center

    Frame, Kathy, Ed.; Hays, Rachel, Ed.; Mack, Alison, Ed.

    This publication encourages student involvement in biological research through student research with the cooperation of teachers and scientists. The contents of the book are divided into two sections. The first section introduces students to research investigations and includes: (1) "How the Investigations Are Set Up and the Rationale Behind Their…

  2. Computational Biology and High Performance Computing 2000

    SciTech Connect

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  3. Computational Skills for Biology Students

    ERIC Educational Resources Information Center

    Gross, Louis J.

    2008-01-01

    This interview with Distinguished Science Award recipient Louis J. Gross highlights essential computational skills for modern biology, including: (1) teaching concepts listed in the Math & Bio 2010 report; (2) illustrating to students that jobs today require quantitative skills; and (3) resources and materials that focus on computational skills.

  4. Limits of computational biology

    PubMed Central

    Bray, Dennis

    2015-01-01

    Are we close to a complete inventory of living processes so that we might expect in the near future to reproduce every essential aspect necessary for life? Or are there mechanisms and processes in cells and organisms that are presently inaccessible to us? Here I argue that a close examination of a particularly well-understood system—that of Escherichia coli chemotaxis—shows we are still a long way from a complete description. There is a level of molecular uncertainty, particularly that responsible for fine-tuning and adaptation to myriad external conditions, which we presently cannot resolve or reproduce on a computer. Moreover, the same uncertainty exists for any process in any organism and is especially pronounced and important in higher animals such as humans. Embryonic development, tissue homeostasis, immune recognition, memory formation, and survival in the real world, all depend on vast numbers of subtle variations in cell chemistry most of which are presently unknown or only poorly characterized. Overcoming these limitations will require us to not only accumulate large quantities of highly detailed data but also develop new computational methods able to recapitulate the massively parallel processing of living cells. PMID:25318467

  5. Computational representation of biological systems

    SciTech Connect

    Frazier, Zach; McDermott, Jason E.; Guerquin, Michal; Samudrala, Ram

    2009-04-20

    Integration of large and diverse biological data sets is a daunting problem facing systems biology researchers. Exploring the complex issues of data validation, integration, and representation, we present a systematic approach for the management and analysis of large biological data sets based on data warehouses. Our system has been implemented in the Bioverse, a framework combining diverse protein information from a variety of knowledge areas such as molecular interactions, pathway localization, protein structure, and protein function.

  6. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  7. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the rapid expansion of internet and information technology services. The bioinformatics sector of India has been transforming rapidly by creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospects and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, conferences and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges. PMID:27465042

  8. Deep learning for computational biology.

    PubMed

    Angermueller, Christof; Pärnamaa, Tanel; Parts, Leopold; Stegle, Oliver

    2016-01-01

    Technological advances in genomics and imaging have led to an explosion of molecular and cellular profiling data from large numbers of samples. This rapid increase in biological data dimension and acquisition rate is challenging conventional analysis strategies. Modern machine learning methods, such as deep learning, promise to leverage very large data sets for finding hidden structure within them, and for making accurate predictions. In this review, we discuss applications of this new breed of analysis approaches in regulatory genomics and cellular imaging. We provide background on what deep learning is, and the settings in which it can be successfully applied to derive biological insights. In addition to presenting specific applications and providing tips for practical use, we also highlight possible pitfalls and limitations to guide computational biologists on when and how to make the most of this new technology. PMID:27474269
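
    A minimal sketch of the kind of regulatory-genomics application discussed in the review, assuming PyTorch is available: a small 1D convolutional network scans one-hot-encoded DNA sequences for motif-like patterns and predicts a binary label. The data here are random placeholders; a real application would train on experimentally labeled sequences.

    ```python
    import torch
    import torch.nn as nn

    class MotifCNN(nn.Module):
        def __init__(self, n_filters=16, motif_len=8):
            super().__init__()
            self.conv = nn.Conv1d(4, n_filters, kernel_size=motif_len)
            self.pool = nn.AdaptiveMaxPool1d(1)
            self.fc = nn.Linear(n_filters, 1)

        def forward(self, x):                    # x: (batch, 4, seq_len) one-hot DNA
            h = torch.relu(self.conv(x))         # motif-like filters scan the sequence
            h = self.pool(h).squeeze(-1)         # keep the best match per filter
            return torch.sigmoid(self.fc(h)).squeeze(-1)

    # toy usage: random one-hot sequences and random binary labels
    x = torch.zeros(32, 4, 100)
    idx = torch.randint(0, 4, (32, 100))
    x.scatter_(1, idx.unsqueeze(1), 1.0)
    y = torch.randint(0, 2, (32,)).float()

    model = MotifCNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCELoss()
    for _ in range(5):                           # a few illustrative training steps
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print("final toy loss:", float(loss))
    ```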

  9. Computers and Qualitative Research.

    ERIC Educational Resources Information Center

    Willis, Jerry; Jost, Muktha

    1999-01-01

    Discusses the use of computers in qualitative research, including sources of information; collaboration; electronic discussion groups; Web sites; Internet search engines; electronic sources of data; data collection; communicating research results; desktop publishing; hypermedia and multimedia documents; electronic publishing; holistic and…

  10. Computing in Research.

    ERIC Educational Resources Information Center

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  11. Data integration in biological research: an overview.

    PubMed

    Lapatas, Vasileios; Stefanidakis, Michalis; Jimenez, Rafael C; Via, Allegra; Schneider, Maria Victoria

    2015-12-01

    Data sharing, integration and annotation are essential to ensure the reproducibility of the analysis and interpretation of experimental findings. Often these activities are perceived as a role that bioinformaticians and computer scientists have to take with no or little input from the experimental biologist. On the contrary, biological researchers, being the producers and often the end users of such data, have a major role in enabling biological data integration. The quality and usefulness of data integration depend on the existence and adoption of standards, shared formats, and mechanisms that are suitable for biological researchers to submit and annotate the data, so that the data are easily searchable, conveniently linked and consequently usable for further biological analysis and discovery. Here, we provide background on what data integration is from a computational science point of view, how it has been applied to biological research, which key aspects contributed to its success, and future directions. PMID:26336651

  12. Data-intensive computing laying foundation for biological breakthroughs

    SciTech Connect

    Hachigian, David J.

    2007-06-18

    Finding a different way is the goal of the Data-Intensive Computing for Complex Biological Systems (Biopilot) project—a joint research effort between the Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory funded by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research. The two national laboratories, both of which are world leaders in computing and computational sciences, are teaming to support areas of biological research in urgent need of data-intensive computing capabilities.

  13. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  14. A First Attempt to Bring Computational Biology into Advanced High School Biology Classrooms

    PubMed Central

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S.

    2011-01-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology unit on genetic evolution into advanced biology classes at two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors. PMID:22046118

  15. Computer Analogies: Teaching Molecular Biology and Ecology.

    ERIC Educational Resources Information Center

    Rice, Stanley; McArthur, John

    2002-01-01

    Suggests that computer science analogies can aid the understanding of gene expression, including the storage of genetic information on chromosomes. Presents a matrix of biology and computer science concepts. (DDR)

  16. Computational fluid dynamics research

    NASA Technical Reports Server (NTRS)

    Chandra, Suresh; Jones, Kenneth; Hassan, Hassan; Mcrae, David Scott

    1992-01-01

    The focus of research in the computational fluid dynamics (CFD) area is two fold: (1) to develop new approaches for turbulence modeling so that high speed compressible flows can be studied for applications to entry and re-entry flows; and (2) to perform research to improve CFD algorithm accuracy and efficiency for high speed flows. Research activities, faculty and student participation, publications, and financial information are outlined.

  17. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1985-01-01

    Synopses are given for NASA supported work in computer science at the University of Virginia. Some areas of research include: error seeding as a testing method; knowledge representation for engineering design; analysis of faults in a multi-version software experiment; implementation of a parallel programming environment; two computer graphics systems for visualization of pressure distribution and convective density particles; task decomposition for multiple robot arms; vectorized incomplete conjugate gradient; and iterative methods for solving linear equations on the Flex/32.

  18. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  19. Excellence in Computational Biology and Informatics — EDRN Public Portal

    Cancer.gov

    9th Early Detection Research Network (EDRN) Scientific Workshop. Excellence in Computational Biology and Informatics: Sponsored by the EDRN Data Sharing Subcommittee Moderator: Daniel Crichton, M.S., NASA Jet Propulsion Laboratory

  20. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  1. UC Merced Center for Computational Biology Final Report

    SciTech Connect

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternative strategy: to create a new Biological Sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs…

  2. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Jaffe, Richard; Liang, Shoudan; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2002-01-01

    We present results from several projects in the new field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. We have developed a procedure for calculating long-range effects in molecular dynamics using a plane wave expansion of the electrostatic potential. This method is expected to be highly efficient for simulating biological systems on massively parallel supercomputers. We have performed genomic analysis on a family of actin binding proteins. We have performed quantum mechanical calculations on carbon nanotubes and nucleic acids; these simulations will allow us to investigate possible sources of organic material on the early Earth. Finally, we have developed a model of protobiological chemistry using neural networks.

  3. Software agents in molecular computational biology.

    PubMed

    Keele, John W; Wray, James E

    2005-12-01

    Progress made in applying agent systems to molecular computational biology is reviewed and strategies by which to exploit agent technology to greater advantage are investigated. Communities of software agents could play an important role in helping genome scientists design reagents for future research. The advent of genome sequencing in cattle and swine increases the complexity of data analysis required to conduct research in livestock genomics. Databases are always expanding and semantic differences among data are common. Agent platforms have been developed to deal with generic issues such as agent communication, life cycle management and advertisement of services (white and yellow pages). This frees computational biologists from the drudgery of having to re-invent the wheel on these common chores, giving them more time to focus on biology and bioinformatics. Agent platforms that comply with the Foundation for Intelligent Physical Agents (FIPA) standards are able to interoperate. In other words, agents developed on different platforms can communicate and cooperate with one another if domain-specific higher-level communication protocol details are agreed upon between different agent developers. Many software agent platforms are peer-to-peer, which means that even if some of the agents and data repositories are temporarily unavailable, a subset of the goals of the system can still be met. Past use of software agents in bioinformatics indicates that an agent approach should prove fruitful. Examination of current problems in bioinformatics indicates that existing agent platforms should be adaptable to novel situations. PMID:16420735
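
    The toy sketch below illustrates the basic agent idea of autonomous components exchanging messages. It deliberately uses only the Python standard library rather than any real FIPA-compliant agent platform, and the agent names and message contents are hypothetical.

    ```python
    import queue

    class Agent:
        """Minimal stand-in for a software agent with an inbox of messages."""
        def __init__(self, name):
            self.name = name
            self.inbox = queue.Queue()

        def send(self, other, content):
            other.inbox.put({"from": self.name, "content": content})

        def process(self):
            while not self.inbox.empty():
                msg = self.inbox.get()
                print(f"{self.name} received from {msg['from']}: {msg['content']}")

    # hypothetical agents cooperating on a reagent-design task
    annotator = Agent("sequence-annotator")
    designer = Agent("primer-designer")

    designer.send(annotator, "request: annotate region chr1:100-200 (hypothetical)")
    annotator.process()
    annotator.send(designer, "reply: 2 candidate exons found (toy data)")
    designer.process()
    ```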

  4. Data-intensive computing laying foundation for biological breakthroughs

    SciTech Connect

    Straatsma, TP

    2007-06-18

    Biological breakthroughs critical to solving society’s most challenging problems require new and innovative tools and a “different way” to analyze the enormous amounts of data being generated. This article for the Breakthroughs magazine focuses on the Data-Intensive Computing for Complex Biological Systems (Biopilot) project—a joint research effort between the Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory funded by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research. The two national laboratories, both of which are world leaders in computing and computational sciences, are teaming to support areas of biological research in urgent need of data-intensive computing capabilities.

  5. Biological databases for human research.

    PubMed

    Zou, Dong; Ma, Lina; Yu, Jun; Zhang, Zhang

    2015-02-01

    The completion of the Human Genome Project lays a foundation for systematically studying the human genome from evolutionary history to precision medicine against diseases. With the explosive growth of biological data, there is an increasing number of biological databases that have been developed in aid of human-related research. Here we present a collection of human-related biological databases and provide a mini-review by classifying them into different categories according to their data types. As human-related databases continue to grow not only in count but also in volume, challenges are ahead in big data storage, processing, exchange and curation. PMID:25712261

  6. Biological Databases for Human Research

    PubMed Central

    Zou, Dong; Ma, Lina; Yu, Jun; Zhang, Zhang

    2015-01-01

    The completion of the Human Genome Project lays a foundation for systematically studying the human genome from evolutionary history to precision medicine against diseases. With the explosive growth of biological data, there is an increasing number of biological databases that have been developed in aid of human-related research. Here we present a collection of human-related biological databases and provide a mini-review by classifying them into different categories according to their data types. As human-related databases continue to grow not only in count but also in volume, challenges are ahead in big data storage, processing, exchange and curation. PMID:25712261

  7. Toward Computational Cumulative Biology by Combining Models of Biological Datasets

    PubMed Central

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations—for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database. PMID:25427176
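
    One simple way to realize the "decompose a new dataset into contributions from earlier relevant models" step is non-negative regression against the earlier models' signatures. The sketch below, assuming NumPy and SciPy are available, is only a schematic stand-in for the authors' actual combination model, and the data are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    # columns = signatures of previously modeled datasets (hypothetical)
    earlier_models = rng.random((200, 5))
    true_weights = np.array([0.6, 0.0, 0.3, 0.0, 0.1])

    # a "new" dataset assumed to be a noisy mixture of the earlier ones
    new_dataset = earlier_models @ true_weights + 0.01 * rng.standard_normal(200)

    # non-negative least squares gives each earlier model's contribution
    weights, _ = nnls(earlier_models, new_dataset)
    weights /= weights.sum()
    print("estimated contributions of earlier datasets:", np.round(weights, 2))
    ```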

  8. Analog Computer Laboratory with Biological Examples.

    ERIC Educational Resources Information Center

    Strebel, Donald E.

    1979-01-01

    The use of biological examples in teaching applications of the analog computer is discussed and several examples from mathematical ecology, enzyme kinetics, and tracer dynamics are described. (Author/GA)
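
    The examples mentioned (mathematical ecology, enzyme kinetics, tracer dynamics) are systems of ordinary differential equations, which an analog computer solves with integrator circuits. The digital sketch below integrates one such example, logistic population growth, with parameter values chosen purely for illustration.

    ```python
    import numpy as np

    def logistic_growth(r, K, n0, t_end, dt=0.01):
        """Numerically integrates dN/dt = r*N*(1 - N/K), the kind of ODE an
        analog computer would solve with integrator circuits."""
        steps = int(t_end / dt)
        n = n0
        traj = np.empty(steps)
        for t in range(steps):
            n += r * n * (1 - n / K) * dt   # Euler step of the growth equation
            traj[t] = n
        return traj

    # illustrative parameters only
    pop = logistic_growth(r=0.5, K=1000, n0=10, t_end=30)
    print("final population size: %.0f" % pop[-1])
    ```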

  9. Computational investigations of HNO in biology

    PubMed Central

    Zhang, Yong

    2013-01-01

    HNO (nitroxyl) has been found to have many physiological effects in numerous biological processes. Computational investigations have been employed to help understand the structural properties of HNO complexes and HNO reactivities in some interesting biologically relevant systems. The following computational aspects were reviewed in this work: 1) structural and energetic properties of HNO isomers; 2) interactions between HNO and non-metal molecules; 3) structural and spectroscopic properties of HNO metal complexes; 4) HNO reactions with biologically important non-metal systems; 5) involvement of HNO in reactions of metal complexes and metalloproteins. Results indicate that computational investigations are very helpful to elucidate interesting experimental phenomena and provide new insights into unique structural, spectroscopic, and mechanistic properties of HNO involvement in biology. PMID:23103077

  10. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology. PMID:25790483
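
    As a small illustration of the kind of logical model students build and simulate in such tools, the sketch below updates a hypothetical three-node Boolean network synchronously; this is a generic toy network written in plain Python, not a model taken from Cell Collective.

    ```python
    # Update rules for a hypothetical 3-gene toy network
    rules = {
        "A": lambda s: not s["C"],          # A is repressed by C
        "B": lambda s: s["A"],              # B is activated by A
        "C": lambda s: s["A"] and s["B"],   # C needs both A and B
    }

    def step(state):
        """Synchronous update: every node recomputed from the previous state."""
        return {node: bool(rule(state)) for node, rule in rules.items()}

    state = {"A": True, "B": False, "C": False}
    for t in range(6):
        print(t, state)
        state = step(state)
    ```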

  11. Space Station Biological Research Project

    NASA Technical Reports Server (NTRS)

    Johnson, C. C.; Wade, C. E.; Givens, J. J.

    1997-01-01

    To meet NASA's objective of using the unique aspects of the space environment to expand fundamental knowledge in the biological sciences, the Space Station Biological Research Project at Ames Research Center is developing, or providing oversight for, two major suites of hardware which will be installed on the International Space Station (ISS). The first, the Gravitational Biology Facility, consists of Habitats to support plants, rodents, cells, aquatic specimens, avian and reptilian eggs, and insects, and the Habitat Holding Rack in which to house them at microgravity; the second, the Centrifuge Facility, consists of a 2.5 m diameter centrifuge that will provide acceleration levels between 0.01 g and 2.0 g and a Life Sciences Glovebox. These two facilities will support the conduct of experiments to: 1) investigate the effect of microgravity on living systems; 2) determine what level of gravity is required to maintain normal form and function; and 3) study the use of artificial gravity as a countermeasure to the deleterious effects of microgravity observed in the crew. Upon completion, the ISS will have three complementary laboratory modules provided by NASA, the European Space Agency and the Japanese space agency, NASDA. Use of all facilities in each of the modules will be available to investigators from participating space agencies. With the advent of the ISS, space-based gravitational biology research will transition from 10-16 day short-duration Space Shuttle flights to 90-day-or-longer ISS increments.

  12. Metacognition: computation, biology and function

    PubMed Central

    Fleming, Stephen M.; Dolan, Raymond J.; Frith, Christopher D.

    2012-01-01

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  13. Metacognition: computation, biology and function.

    PubMed

    Fleming, Stephen M; Dolan, Raymond J; Frith, Christopher D

    2012-05-19

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  14. Filling the gap between biology and computer science.

    PubMed

    Aguilar-Ruiz, Jesús S; Moore, Jason H; Ritchie, Marylyn D

    2008-01-01

    This editorial introduces BioData Mining, a new journal which publishes research articles related to advances in computational methods and techniques for the extraction of useful knowledge from heterogeneous biological data. We outline the aims and scope of the journal, introduce the publishing model and describe the open peer review policy, which fosters interaction within the research community. PMID:18822148

  15. Micro-Computers in Biology Inquiry.

    ERIC Educational Resources Information Center

    Barnato, Carolyn; Barrett, Kathy

    1981-01-01

    Describes the modification of computer programs (BISON and POLLUT) to accommodate species and areas indigenous to the Pacific Coast area. Suggests that these programs, suitable for PET microcomputers, may foster a long-term, ongoing, inquiry-directed approach in biology. (DS)

  16. Aluminium in Biological Environments: A Computational Approach

    PubMed Central

    Mujika, Jon I; Rezabal, Elixabete; Mercero, Jose M; Ruipérez, Fernando; Costa, Dominique; Ugalde, Jesus M; Lopez, Xabier

    2014-01-01

    The increased availability of aluminium in biological environments, due to human intervention in the last century, raises concerns on the effects that this so far “excluded from biology” metal might have on living organisms. Consequently, the bioinorganic chemistry of aluminium has emerged as a very active field of research. This review will focus on our contributions to this field, based on computational studies that can yield an understanding of aluminium biochemistry at a molecular level. Aluminium can interact with and be stabilized in biological environments by complexing with both low molecular mass chelants and high molecular mass peptides. The speciation of the metal is, nonetheless, dictated by the hydrolytic species dominant in each case, which vary according to the pH condition of the medium. In blood, citrate and serum transferrin are identified as the main low molecular mass and high molecular mass molecules interacting with aluminium. The complexation of aluminium to citrate and the subsequent changes exerted on the deprotonation pathways of its titratable groups will be discussed, along with the mechanisms for the intake and release of aluminium in serum transferrin at two pH conditions, physiological neutral and endosomal acidic. Aluminium can substitute for other metals, in particular magnesium, in buried protein sites and trigger conformational disorder and alteration of the protonation states of the protein's sidechains. A detailed account of the interaction of aluminium with protein sidechains will be given. Finally, it will be described how aluminium can exert oxidative stress by stabilizing superoxide radicals either as mononuclear aluminium or clustered in boehmite. The possibility of promotion of the Fenton reaction and production of hydroxyl radicals will also be discussed. PMID:24757505

  17. The "Biologically-Inspired Computing" Column

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike

    2007-01-01

    Self-managing systems, whether viewed from the perspective of Autonomic Computing or from that of another initiative, offer a holistic vision for the development and evolution of biologically-inspired computer-based systems. They aim to bring new levels of automation and dependability to systems, while simultaneously hiding their complexity and reducing costs. A case can certainly be made that all computer-based systems should exhibit autonomic properties [6], and we envisage greater interest in, and uptake of, autonomic principles in future system development.

  18. Multiscale Computational Models of Complex Biological Systems

    PubMed Central

    Walpole, Joseph; Papin, Jason A.; Peirce, Shayn M.

    2014-01-01

    Integration of data across spatial, temporal, and functional scales is a primary focus of biomedical engineering efforts. The advent of powerful computing platforms, coupled with quantitative data from high-throughput experimental platforms, has allowed multiscale modeling to expand as a means to more comprehensively investigate biological phenomena in experimentally relevant ways. This review aims to highlight recently published multiscale models of biological systems while using their successes to propose the best practices for future model development. We demonstrate that coupling continuous and discrete systems best captures biological information across spatial scales by selecting modeling techniques that are suited to the task. Further, we suggest how to best leverage these multiscale models to gain insight into biological systems using quantitative, biomedical engineering methods to analyze data in non-intuitive ways. These topics are discussed with a focus on the future of the field, the current challenges encountered, and opportunities yet to be realized. PMID:23642247

  19. Understanding biological computation: reliable learning and recognition.

    PubMed Central

    Hogg, T; Huberman, B A

    1984-01-01

    We experimentally examine the consequences of the hypothesis that the brain operates reliably, even though individual components may intermittently fail, by computing with dynamical attractors. Specifically, such a mechanism exploits dynamic collective behavior of a system with attractive fixed points in its phase space. In contrast to the usual methods of reliable computation involving a large number of redundant elements, this technique of self-repair only requires collective computation with a few units, and it is amenable to quantitative investigation. Experiments on parallel computing arrays show that this mechanism leads naturally to rapid self-repair, adaptation to the environment, recognition and discrimination of fuzzy inputs, and conditional learning, properties that are commonly associated with biological computation. PMID:6593731
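
    As a rough illustration of computing with dynamical attractors (not the authors' parallel-array experiments), the sketch below stores one pattern in a Hopfield-style network and shows that a state corrupted by several "failing" units relaxes back to the stored attractor. The network size, number of flipped units, and update scheme are assumptions made for illustration.

      # Minimal sketch of attractor-based "self-repair": a Hopfield-style network
      # recalls a stored pattern even after several units are flipped (intermittent
      # faults). Illustration only; not the parallel-array system used in the paper.
      import numpy as np

      rng = np.random.default_rng(0)

      def hebbian_weights(patterns):
          """Symmetric weight matrix for stored +/-1 patterns (Hebb rule)."""
          n = patterns.shape[1]
          w = patterns.T @ patterns / n
          np.fill_diagonal(w, 0.0)
          return w

      def recall(w, state, steps=20):
          """Iterate updates until the state settles onto an attractor."""
          s = state.copy()
          for _ in range(steps):
              s = np.sign(w @ s)
              s[s == 0] = 1
          return s

      stored = rng.choice([-1, 1], size=(1, 64))          # one stored 64-unit pattern
      w = hebbian_weights(stored)

      noisy = stored[0].copy()
      failed = rng.choice(64, size=10, replace=False)     # 10 "failing" units
      noisy[failed] *= -1

      recovered = recall(w, noisy)
      print("units restored:", int((recovered == stored[0]).sum()), "of 64")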

  20. Ranked retrieval of Computational Biology models

    PubMed Central

    2010-01-01

    Background The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Results Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. Conclusions The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models. PMID:20701772
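
    The abstract describes ranking models by relevance using Information Retrieval methods over annotations and meta-information. The sketch below is a generic TF-IDF/cosine-similarity ranking over invented model descriptions; it is not the BioModels Database implementation, and all model names and texts are hypothetical.

      # Hedged sketch of ranked model retrieval: TF-IDF over model meta-information,
      # cosine similarity against the query. Not the actual BioModels search engine.
      import math
      from collections import Counter

      models = {                        # hypothetical annotation / meta-information text
          "M1": "glycolysis kinetic model yeast enzyme reactions",
          "M2": "cell cycle regulation cyclin cdk oscillation model",
          "M3": "calcium signaling oscillation kinetic model",
      }

      def tf_idf_vectors(docs):
          """TF-IDF vector per document, plus document frequencies and corpus size."""
          tokenized = {k: v.split() for k, v in docs.items()}
          df = Counter(t for toks in tokenized.values() for t in set(toks))
          n = len(docs)
          vectors = {
              k: {t: c * math.log((1 + n) / (1 + df[t])) for t, c in Counter(toks).items()}
              for k, toks in tokenized.items()
          }
          return vectors, df, n

      def score(query, vec, df, n):
          """Cosine similarity between the query and one document vector."""
          q = {t: math.log((1 + n) / (1 + df.get(t, 0))) for t in query.split()}
          dot = sum(vec.get(t, 0.0) * w for t, w in q.items())
          norm = math.sqrt(sum(v * v for v in vec.values())) * math.sqrt(sum(v * v for v in q.values()))
          return dot / norm if norm else 0.0

      vectors, df, n = tf_idf_vectors(models)
      query = "cell cycle oscillation"
      ranking = sorted(models, key=lambda m: score(query, vectors[m], df, n), reverse=True)
      print(ranking)   # models ordered by relevance to the query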

  1. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  2. Novel opportunities for computational biology and sociology in drug discovery☆

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  3. Structuring Research Opportunities for All Biology Majors.

    ERIC Educational Resources Information Center

    Lewis, Susan E.; Conley, Lisa K.; Horst, Cynthia J.

    2003-01-01

    Describes a required research experience program for all biology majors instituted in the biology department of Carroll College. Discusses successes and challenges of coordinating a program that involves 20-40 research projects each year. (Author/NB)

  4. [Progress in molecular biology study of DNA computer].

    PubMed

    Zhang, Zhi-Zhou; Zhao, Jian; He, Lin

    2003-09-01

    The DNA (deoxyribonucleic acid) computer is an emerging study area that combines the molecular biology of DNA with computational studies of how to employ these specific molecules to calculate. In 1994 Adleman described his pioneering research on DNA computing in Science, the first experimental report on DNA computer study. In 2001 Benenson et al published a paper in Nature regarding a programmable and autonomous DNA computing device. Because of its Turing-like functions, the device is regarded as another milestone in DNA computer study. The main features of DNA computers are massively parallel computing ability and potentially enormous data storage capacity. Compared with conventional electronic computers, DNA molecules provide conceptually a revolution in computing, and more and more implications have been found in various disciplines. DNA computer studies have brought great progress not only in computing mechanisms themselves, but also in DNA manipulation technologies, especially nanotechnology. This article presents the basic principles of DNA computers, their applications, their important relationship with genomic research, and our comments on all the above issues. PMID:14577383
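
    For orientation, Adleman's original computation solved a directed Hamiltonian path instance by generating candidate paths massively in parallel (random ligation of DNA strands) and then filtering out non-solutions. The sketch below simulates only that generate-and-filter logic in software; the graph and the number of trials are illustrative assumptions, and no DNA chemistry is modeled.

      # Hedged sketch of the filtering logic behind Adleman's DNA computation:
      # generate many random paths (the massively parallel "ligation" step), then
      # keep only those that start and end correctly, have n vertices, and visit
      # every vertex exactly once. The graph is invented for illustration.
      import random

      edges = {(0, 1), (0, 3), (1, 2), (2, 3), (3, 4), (2, 4), (4, 5), (1, 4), (3, 5)}
      n, start, end = 6, 0, 5
      adj = {u: [v for (a, v) in edges if a == u] for u in range(n)}

      def random_path():
          """Random walk from 'start', analogous to random ligation of edge strands."""
          path = [start]
          while len(path) <= n and adj.get(path[-1]):
              path.append(random.choice(adj[path[-1]]))
          return path

      candidates = (random_path() for _ in range(100_000))
      hamiltonian = {
          tuple(p) for p in candidates
          if p[0] == start and p[-1] == end and len(p) == n and len(set(p)) == n
      }
      print(hamiltonian)   # surviving paths are Hamiltonian paths from start to end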

  5. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  6. Space Station Biological Research Project

    NASA Technical Reports Server (NTRS)

    Johnson, Catherine C.; Hargens, Alan R.; Wade, Charles E.

    1995-01-01

    NASA Ames Research Center is responsible for the development of the Space Station Biological Research Project (SSBRP) which will support non-human life sciences research on the International Space Station Alpha (ISSA). The SSBRP is designed to support both basic research to understand the effect of altered gravity fields on biological systems and applied research to investigate the effects of space flight on biological systems. The SSBRP will provide the necessary habitats to support avian and reptile eggs, cells and tissues, plants and rodents. In addition a habitat to support aquatic specimens will be provided by our international partners. Habitats will be mounted in ISSA-compatible racks at microgravity and will also be mounted on a 2.5 m diameter centrifuge, except for the egg incubator, which has an internal centrifuge. The 2.5 m centrifuge will provide artificial gravity levels over the range of 0.01 g to 2 g. The current schedule is to launch the first rack in 1999, the Life Sciences Glovebox and a second rack early in 2001, a 4-habitat 2.5 m centrifuge later the same year in its own module, and to upgrade the centrifuge to 8 habitats in 2004. The rodent habitats will be derived from the Advanced Animal Habitat currently under development for the Shuttle program and will be capable of housing either rats or mice individually or in groups (6 rats/group and at least 12 mice/group). The egg incubator will be an upgraded Avian Development Facility, also developed for the Shuttle program through a Small Business Innovative Research grant. The Space Tissue Loss cell culture apparatus, developed by the Walter Reed Army Institute of Research, is being considered for the cell and tissue culture habitat. The Life Sciences Glovebox is crucial to all life sciences experiments for specimen manipulation and performance of science procedures. It will provide two levels of containment between the work volume and the crew through the use of seals and negative pressure. The glovebox

  7. The Biological Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Johnson, Catherine C.

    1991-01-01

    NASA Ames Research Center is building a research facility, the Biological Flight Research Facility (BFRF), to meet the needs of life scientists to study the long-term effects of variable gravity on living systems. The facility will be housed on Space Station Freedom and is anticipated to operate for the lifetime of the station, approximately 30 years. It will allow plant and animal biologists to study the role of gravity, or its absence, at varying gravity intensities for varying periods of time and with various organisms. The principal difference between current Spacelab missions and those on Space Station Freedom, other than length of mission, will be the capability to perform on-orbit science procedures and the capability to simulate earth gravity. Initially, the facility will house plants and rodents in habitats which can be maintained at microgravity or can be placed on a 2.5-m diam centrifuge. However, the facility is also being designed to accommodate future habitats for small primates, avian, and aquatic specimens. The centrifuge will provide 1 g for controls and will also be able to provide gravity from 0.01 to 2.0 g for threshold gravity studies as well as hypergravity studies. The BFRF will provide the means to conduct basic experiments to gain an understanding of the effects of microgravity on the structure and function of plants and animals, as well as investigate the role of gravity as a potential countermeasure for the physiological changes observed in microgravity.

  8. A New Online Computational Biology Curriculum

    PubMed Central

    Searls, David B.

    2014-01-01

    A recent proliferation of Massive Open Online Courses (MOOCs) and other web-based educational resources has greatly increased the potential for effective self-study in many fields. This article introduces a catalog of several hundred free video courses of potential interest to those wishing to expand their knowledge of bioinformatics and computational biology. The courses are organized into eleven subject areas modeled on university departments and are accompanied by commentary and career advice. PMID:24921255

  9. A new online computational biology curriculum.

    PubMed

    Searls, David B

    2014-06-01

    A recent proliferation of Massive Open Online Courses (MOOCs) and other web-based educational resources has greatly increased the potential for effective self-study in many fields. This article introduces a catalog of several hundred free video courses of potential interest to those wishing to expand their knowledge of bioinformatics and computational biology. The courses are organized into eleven subject areas modeled on university departments and are accompanied by commentary and career advice. PMID:24921255

  10. Catalyzing Inquiry at the Interface of Computing and Biology

    SciTech Connect

    John Wooley; Herbert S. Lin

    2005-10-30

    This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. That is, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly to individuals not trained in computer science the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.

  11. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    The research efforts of University of Virginia students under a NASA sponsored program are summarized and the status of the program is reported. The research includes: testing method evaluations for N version programming; a representation scheme for modeling three dimensional objects; fault tolerant protocols for real time local area networks; performance investigation of Cyber network; XFEM implementation; and vectorizing incomplete Cholesky conjugate gradients.

  12. Computational chemistry research

    NASA Technical Reports Server (NTRS)

    Levin, Eugene

    1987-01-01

    Task 41 is composed of two parts: (1) analysis and design studies related to the Numerical Aerodynamic Simulation (NAS) Extended Operating Configuration (EOC) and (2) computational chemistry. During the first half of 1987, Dr. Levin served as a member of an advanced system planning team to establish the requirements, goals, and principal technical characteristics of the NAS EOC. A paper entitled 'Scaling of Data Communications for an Advanced Supercomputer Network' is included. The high temperature transport properties (such as viscosity, thermal conductivity, etc.) of the major constituents of air (oxygen and nitrogen) were correctly determined. The results of prior ab initio computer solutions of the Schroedinger equation were combined with the best available experimental data to obtain complete interaction potentials for both neutral and ion-atom collision partners. These potentials were then used in a computer program to evaluate the collision cross-sections from which the transport properties could be determined. A paper entitled 'High Temperature Transport Properties of Air' is included.

  13. Global Biology Research Program: Program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Biological processes that play a dominant role in the cycles which transform and transfer material throughout the biosphere are examined. The goal is a greater understanding of planetary biological processes as revealed by the interaction of the biota and the environment. The rationale, scope, research strategy, and research priorities of the Global Biology Research Program are presented.

  14. The Biological Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Johnson, Catherine C.

    1993-01-01

    NASA Ames Research Center (ARC) is building a research facility, the Biological Flight Research Facility (BFRF), to meet the needs of life scientists to study the long-term effects of variable gravity on living systems. The facility will be housed on Space Station Freedom and is anticipated to operate for the lifetime of the station, approximately thirty years. It will allow plant and animal biologists to study the role of gravity, or its absence, at varying gravity intensities for varying periods of time and with various organisms. The principal difference between current Spacelab missions and those on Space Station Freedom, other than length of mission, will be the capability to perform on-orbit science procedures and the capability to simulate earth gravity. Initially the facility will house plants and rodents in habitats which can be maintained at microgravity or can be placed on a 2.5 meter diameter centrifuge. However, the facility is also being designed to accommodate future habitats for small primates, avian, and aquatic specimens. The centrifuge will provide 1 g for controls and will also be able to provide gravity from 0.01 to 2.0 g for threshold gravity studies as well as hypergravity studies. Included in the facility are a service unit for providing clean chambers for the specimens and a glovebox for manipulating the plant and animal specimens and for performing experimental protocols. The BFRF will provide the means to conduct basic experiments to gain an understanding of the effects of microgravity on the structure and function of plants and animals, as well as investigate the role of gravity as a potential countermeasure for the physiological changes observed in microgravity.

  15. Computational Biology Support: RECOMB Conference Series (Conference Support)

    SciTech Connect

    Michael Waterman

    2006-06-15

    This funding provided support for student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: Genomics, Molecular sequence analysis, Recognition of genes and regulatory elements, Molecular evolution, Protein structure, Structural genomics, Gene Expression, Gene Networks, Drug Design, Combinatorial libraries, Computational proteomics, and Structural and functional genomics. The origins of the conference lie on the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and at times exceeding 500. The conference program includes between 30 and 40 contributed papers that are selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedure for top-rate scientific journals. In previous years, paper selection has been made from up to 130-200 submissions from well over a dozen countries. Ten-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer, and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further feature of the program is a lively poster session; 120-300 posters have been presented each year since RECOMB 2000. One of the highlights of each RECOMB conference is a

  16. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory and; (10) performance analyzer for the pisces system.

  17. Computational Materials Research

    NASA Technical Reports Server (NTRS)

    Hinkley, Jeffrey A. (Editor); Gates, Thomas S. (Editor)

    1996-01-01

    Computational Materials aims to model and predict thermodynamic, mechanical, and transport properties of polymer matrix composites. This workshop, the second coordinated by NASA Langley, reports progress in measurements and modeling at a number of length scales: atomic, molecular, nano, and continuum. Assembled here are presentations on quantum calculations for force field development, molecular mechanics of interfaces, molecular weight effects on mechanical properties, molecular dynamics applied to poling of polymers for electrets, Monte Carlo simulation of aromatic thermoplastics, thermal pressure coefficients of liquids, ultrasonic elastic constants, group additivity predictions, bulk constitutive models, and viscoplasticity characterization.

  18. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
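
    The abstract defines each performance indicator as the probability of exposure to a certain dose of an injury mechanism, which is then combined with a dose-response relationship to estimate the likelihood of injury. The sketch below shows that combination in the simplest possible form; the sampled pressure drops, the exposure threshold, and the logistic dose-response curve are illustrative assumptions, not values or methods from the BioPA study.

      # Hedged sketch of the dose/exposure idea described in the abstract:
      # 1) exposure probability = fraction of simulated trajectories whose "dose"
      #    (here, pressure drop) exceeds a threshold;
      # 2) injury likelihood = exposure probability weighted by a dose-response curve.
      # All numbers and the logistic curve are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(1)
      pressure_drop_kpa = rng.normal(loc=60.0, scale=25.0, size=10_000)   # mock CFD samples

      threshold_kpa = 100.0
      exposure_prob = float(np.mean(pressure_drop_kpa > threshold_kpa))

      def injury_fraction(dose_kpa):
          """Assumed logistic dose-response: fraction injured at a given dose."""
          return 1.0 / (1.0 + np.exp(-(dose_kpa - 120.0) / 15.0))

      exposed = pressure_drop_kpa[pressure_drop_kpa > threshold_kpa]
      injury_likelihood = float(np.mean(injury_fraction(exposed))) * exposure_prob

      print(f"exposure probability: {exposure_prob:.3f}")
      print(f"estimated injury likelihood: {injury_likelihood:.4f}")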

  19. Structural Biology and Molecular Applications Research

    Cancer.gov

    Part of NCI's Division of Cancer Biology's research portfolio, research and development in this area focuses on enabling technologies, models, and methodologies to support basic and applied cancer research.

  20. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  1. Towards molecular computers that operate in a biological environment

    NASA Astrophysics Data System (ADS)

    Kahan, Maya; Gil, Binyamin; Adar, Rivka; Shapiro, Ehud

    2008-07-01

    important consequences when performed in a proper context. We envision that molecular computers that operate in a biological environment can be the basis of “smart drugs”, which are potent drugs that activate only if certain environmental conditions hold. These conditions could include abnormalities in the molecular composition of the biological environment that are indicative of a particular disease. Here we review the research direction that set this vision and attempts to realize it.

  2. A complex systems approach to computational molecular biology

    SciTech Connect

    Lapedes, A. |

    1993-09-01

    We report on the ongoing research program at the Santa Fe Institute that applies complex systems methodology to computational molecular biology. Two aspects stressed here are the use of co-evolving adaptive neural networks for determining predictable protein structure classifications, and the use of information theory to elucidate protein structure and function. A "snapshot" of the current state of research in these two topics is presented, representing the present state of two major research thrusts in the program of Genetic Data and Sequence Analysis at the Santa Fe Institute.
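
    One concrete use of information theory in this setting is estimating the mutual information between alignment columns as a signal of residue co-variation. The sketch below computes that quantity for a toy alignment; the sequences and column choice are invented, and this is offered only as an illustration of the general idea, not of the Santa Fe Institute analyses.

      # Hedged sketch of an information-theoretic sequence analysis: mutual
      # information between two columns of a (toy) multiple sequence alignment,
      # often used as a signal of co-variation between residue positions.
      import math
      from collections import Counter

      alignment = ["ACDE", "ACDE", "GCDT", "GHDT", "AHDE", "GCDT"]   # invented sequences
      col_i, col_j = 0, 3                                            # positions to compare

      xs = [s[col_i] for s in alignment]
      ys = [s[col_j] for s in alignment]
      n = len(alignment)

      p_x = Counter(xs)
      p_y = Counter(ys)
      p_xy = Counter(zip(xs, ys))

      mi = sum(
          (c / n) * math.log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
          for (x, y), c in p_xy.items()
      )
      print(f"mutual information between columns {col_i} and {col_j}: {mi:.3f} bits")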

  3. Biology Education Research: Lessons and Future Directions

    ERIC Educational Resources Information Center

    Singer, Susan R.; Nielsen, Natalie R.; Schweingruber, Heidi A.

    2013-01-01

    Biologists have long been concerned about the quality of undergraduate biology education. Over time, however, biology faculty members have begun to study increasingly sophisticated questions about teaching and learning in the discipline. These scholars, often called biology education researchers, are part of a growing field of inquiry called…

  4. [Emphasis of biological research for space radiation].

    PubMed

    Ohnishi, T; Nagaoka, S

    1998-03-01

    The paper summarizes the issues, current status and recent topics in biological research on space radiation. Research to estimate the risk associated with space radiation exposure during long-term manned space flight, such as on the International Space Station, is emphasized because of the large uncertainty of biological effects and the complexity of the radiation environment in space. The issues addressed are: 1) biological effects and end points of low dose radiation, 2) biological effects under low dose rate and long-term radiation exposure, 3) modification of biological responses to radiation under space environments, 4) various aspects of biological end points vs. cellular and molecular mechanisms, 5) estimation of human risk associated with radiation exposure in space flight, and 6) regulations for radiation exposure limits for space workers. The paper also summarizes and introduces recent progress in space-related radiation research with various biological systems. PMID:11541824

  5. Biological and Environmental Research Network Requirements

    SciTech Connect

    Balaji, V.; Boden, Tom; Cowley, Dave; Dart, Eli; Dattoria, Vince; Desai, Narayan; Egan, Rob; Foster, Ian; Goldstone, Robin; Gregurick, Susan; Houghton, John; Izaurralde, Cesar; Johnston, Bill; Joseph, Renu; Kleese-van Dam, Kerstin; Lipton, Mary; Monga, Inder; Pritchard, Matt; Rotman, Lauren; Strand, Gary; Stuart, Cory; Tatusova, Tatiana; Tierney, Brian; Thomas, Brian; Williams, Dean N.; Zurawski, Jason

    2013-09-01

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet be a highly successful enabler of scientific discovery for over 25 years. In November 2012, ESnet and the Office of Biological and Environmental Research (BER) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the BER program office. Several key findings resulted from the review. Among them: 1) The scale of data sets available to science collaborations continues to increase exponentially. This has broad impact, both on the network and on the computational and storage systems connected to the network. 2) Many science collaborations require assistance to cope with the systems and network engineering challenges inherent in managing the rapid growth in data scale. 3) Several science domains operate distributed facilities that rely on high-performance networking for success. Key examples illustrated in this report include the Earth System Grid Federation (ESGF) and the Systems Biology Knowledgebase (KBase). This report expands on these points, and addresses others as well. The report contains a findings section as well as the text of the case studies discussed at the review.

  6. Computational Laser Spectroscopy in a Biological Tissue

    PubMed Central

    Gantri, M.; Trabelsi, H.; Sediki, E.; Ben Salah, R.

    2010-01-01

    We present a numerical spectroscopic study of visible and infrared laser radiation in a biological tissue. We derive a solution of a general two-dimensional time-dependent radiative transfer equation in a tissue-like medium. The model used is suitable for many situations, especially when the external source is time-dependent or continuous. We use a control volume-discrete ordinate method associated with an implicit three-level second-order time differencing scheme. We consider a very thin rectangular biological-tissue-like medium subjected to visible or near-infrared light sources. The RTE is solved for a set of different source wavelengths. All sources are assumed to be monochromatic and collimated. The energetic fluence rate is computed at a set of detector points on the boundaries. According to the source type, we investigate either the steady-state or transient response of the medium. The model is validated in the case of a heterogeneous tissue-like medium using reference experimental results from the literature. The model is also used to study changes in transmitted light in a rat-liver tissue-like medium. Optical properties depend on the source wavelength and are taken from the literature. In particular, light transmission in the medium is studied for continuous-wave and short-pulse sources. PMID:20396377
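
    The abstract does not reproduce the equation being solved; for orientation only, a commonly used form of the time-dependent radiative transfer equation in an absorbing and scattering tissue-like medium is, in standard notation (not taken from the paper),

      \frac{1}{c}\,\frac{\partial I(\mathbf{r},\hat{\mathbf{s}},t)}{\partial t}
        + \hat{\mathbf{s}}\cdot\nabla I(\mathbf{r},\hat{\mathbf{s}},t)
        + (\mu_a + \mu_s)\, I(\mathbf{r},\hat{\mathbf{s}},t)
        = \mu_s \int_{4\pi} p(\hat{\mathbf{s}},\hat{\mathbf{s}}')\, I(\mathbf{r},\hat{\mathbf{s}}',t)\, d\Omega'
        + S(\mathbf{r},\hat{\mathbf{s}},t),

    where I is the radiance, c the speed of light in the medium, \mu_a and \mu_s the absorption and scattering coefficients, p the scattering phase function, and S the source term.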

  7. Strategies for Introducing Computer Technologies into a Biology Laboratory Program

    ERIC Educational Resources Information Center

    Tillotson, Joanne Kivela

    2002-01-01

    Computers have been installed in the General Biology laboratory at Purchase College and incorporated into the laboratory curriculum for all biology majors at the introductory level. The goal is to ensure that all students become familiar with general computer applications in the biological sciences and are comfortable enough to use them regularly.…

  8. Biological data sciences in genome research

    PubMed Central

    Schatz, Michael C.

    2015-01-01

    The last 20 years have been a remarkable era for biology and medicine. One of the most significant achievements has been the sequencing of the first human genomes, which has laid the foundation for profound insights into human genetics, the intricacies of regulation and development, and the forces of evolution. Incredibly, as we look into the future over the next 20 years, we see the very real potential for sequencing more than 1 billion genomes, bringing even deeper insight into human genetics as well as the genetics of millions of other species on the planet. Realizing this great potential for medicine and biology, though, will only be achieved through the integration and development of highly scalable computational and quantitative approaches that can keep pace with the rapid improvements to biotechnology. In this perspective, I aim to chart out these future technologies, anticipate the major themes of research, and call out the challenges ahead. One of the largest shifts will be in the training used to prepare the class of 2035 for their highly interdisciplinary world. PMID:26430150

  9. Biological data sciences in genome research.

    PubMed

    Schatz, Michael C

    2015-10-01

    The last 20 years have been a remarkable era for biology and medicine. One of the most significant achievements has been the sequencing of the first human genomes, which has laid the foundation for profound insights into human genetics, the intricacies of regulation and development, and the forces of evolution. Incredibly, as we look into the future over the next 20 years, we see the very real potential for sequencing more than 1 billion genomes, bringing even deeper insight into human genetics as well as the genetics of millions of other species on the planet. Realizing this great potential for medicine and biology, though, will only be achieved through the integration and development of highly scalable computational and quantitative approaches that can keep pace with the rapid improvements to biotechnology. In this perspective, I aim to chart out these future technologies, anticipate the major themes of research, and call out the challenges ahead. One of the largest shifts will be in the training used to prepare the class of 2035 for their highly interdisciplinary world. PMID:26430150

  10. Initiatives in biological research in Indian psychiatry

    PubMed Central

    Shrivatava, Amresh

    2010-01-01

    Biological psychiatry is an exploratory science for mental health. Biological changes provide some explicit insight into the complex area of ‘brain-mind and behavior’. One major achievement of research in the biological field is explaining how biological factors cause changes in behavior. In India, we have a clear history of initiatives in research from a biological perspective, which goes back to 1958. In the intervening years, this field has seen significant evolution, precision and effective utilization of contemporary technological advances. It is a matter of great pride to see that, in spite of difficult times in terms of challenges of practice and services, administration, resources, funding and manpower, the zest for research was very forthcoming. There was neither dedicated time nor any funding for conducting research. It came from the intellectual insight of our forefathers in the field of mental health, and gradually grew to the state of strategic education in research, training in research, international research collaborations and the setting up of internationally accredited centers. During difficult economic conditions in the past, the hypotheses tested and conclusions derived have not been as important as how the work was done, how it was made possible and how robust traditions were established. Almost the entire spectrum of biological research has been touched upon by Indian researchers. Some of these areas are electroconvulsive therapy, biological markers, neurocognition, neuroimaging, neuroendocrinology, neurochemistry, electrophysiology and genetics. A lot has been published given the limited space in the Indian Journal of Psychiatry and other medical journals published in India. A large body of biological research conducted on Indian patients has also been published in international literature (which I prefer to call non-Indian journals). Newer research questions in biological psychiatry, in keeping with the trend of international standards, are

  11. Systems biology approaches in aging research.

    PubMed

    Chauhan, Anuradha; Liebal, Ulf W; Vera, Julio; Baltrusch, Simone; Junghanß, Christian; Tiedge, Markus; Fuellen, Georg; Wolkenhauer, Olaf; Köhling, Rüdiger

    2015-01-01

    Aging is a systemic process which progressively manifests itself at multiple levels of structural and functional organization, from molecular reactions and cell-cell interactions in tissues to the physiology of an entire organ. There are ever-increasing data on biomedically relevant network interactions in the aging process at different scales of time and space. To connect the aging process across different structural, temporal and spatial scales, extensive systems biology approaches need to be deployed. Systems biology approaches can not only systematically handle large-scale datasets (such as high-throughput data) and the complexity of interactions (feedback loops, cross talk), but can also delve into nonlinear behaviors exhibited by several biological processes which are beyond intuitive reasoning. Several publicly funded agencies have identified the synergistic role of systems biology in aging research. Using one of the notable publicly funded programs (GERONTOSYS), we discuss how systems biology approaches are helping scientists to find new frontiers in aging research. We elaborate on some systems biology approaches deployed in one of the projects of the consortium (ROSage). The systems biology field in aging research is in its infancy. It is open to adapting existing systems biology methodologies from other research fields and devising new aging-specific systems biology methodologies. PMID:25341520

  12. Biologically inspired path to quantum computer

    NASA Astrophysics Data System (ADS)

    Ogryzko, Vasily; Ozhigov, Yuri

    2014-12-01

    We describe an approach to quantum computing inspired by information processing at the molecular level in living cells. It is based on the separation of a small ensemble of qubits inside the living system (e.g., a bacterial cell), such that coherent quantum states of this ensemble remain practically unchanged for a long time. We use the notion of a quantum kernel to describe such an ensemble. The quantum kernel is not strictly tied to particular particles; it permanently exchanges atoms and molecules with the environment, which makes the quantum kernel a virtual notion. There are many reasons to expect that the state of the quantum kernel of a living system can be treated as the stationary state of some Hamiltonian. While the quantum kernel is responsible for the stability of dynamics at the time scale of cellular life, at the longer inter-generation time scale it can change, varying smoothly in the course of biological evolution. To a first approximation, the quantum kernel can be described in the framework of a qubit modification of the Jaynes-Cummings-Hubbard model, in which relaxation corresponds to the exchange of matter between the quantum kernel and the rest of the cell and is represented by Lindblad super-operators.
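
    The abstract refers to relaxation represented by Lindblad super-operators but gives no equation; for orientation only, the standard GKSL (Lindblad) master equation in generic notation (not taken from the paper) is

      \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
        + \sum_k \gamma_k \left( L_k \rho L_k^{\dagger} - \tfrac{1}{2}\,\{ L_k^{\dagger} L_k,\ \rho \} \right),

    where \rho would be the state of the quantum kernel, H a Jaynes-Cummings-Hubbard-type Hamiltonian, and the jump operators L_k would model the exchange of matter with the rest of the cell.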

  13. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer-assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in the geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  14. Computer-Based Semantic Network in Molecular Biology: A Demonstration.

    ERIC Educational Resources Information Center

    Callman, Joshua L.; And Others

    This paper analyzes the hardware and software features that would be desirable in a computer-based semantic network system for representing biology knowledge. It then describes in detail a prototype network of molecular biology knowledge that has been developed using Filevision software and a Macintosh computer. The prototype contains about 100…

  15. Biology Education Research Trends in Turkey

    ERIC Educational Resources Information Center

    Gul, Seyda; Sozbilir, Mustafa

    2015-01-01

    This paper reports on a content analysis of 633 biology education research [BER] papers published by Turkish science educators in national and international journals. The findings indicate that more research has been undertaken on environment and ecology, the cell, and animal form and functions. In addition, learning, teaching and attitudes were in…

  16. Computational Systems Biology in Cancer: Modeling Methods and Applications

    PubMed Central

    Materi, Wayne; Wishart, David S.

    2007-01-01

    In recent years it has become clear that carcinogenesis is a complex process, both at the molecular and cellular levels. Understanding the origins, growth and spread of cancer therefore requires an integrated or system-wide approach. Computational systems biology is an emerging sub-discipline in systems biology that utilizes the wealth of data from genomic, proteomic and metabolomic studies to build computer simulations of intra- and intercellular processes. Several useful descriptive and predictive models of the origin, growth and spread of cancers have been developed in an effort to better understand the disease and potential therapeutic approaches. In this review we describe and assess the practical and theoretical underpinnings of commonly used modeling approaches, including ordinary and partial differential equations, Petri nets, cellular automata, agent-based models and hybrid systems. A number of computer-based formalisms have been implemented to improve the accessibility of the various approaches to researchers whose primary interest lies outside of model development. We discuss several of these and describe how they have led to novel insights into tumor genesis, growth, apoptosis, vascularization and therapy. PMID:19936081
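
    Of the approaches listed, ordinary differential equations are the most widely used; as a minimal, hedged example (not taken from the review), the sketch below integrates a logistic tumor-growth ODE with a constant-rate therapy term. Parameter values are illustrative assumptions.

      # Minimal ODE example in the spirit of the modeling approaches reviewed:
      # logistic tumor growth with a constant-rate kill term for therapy.
      # Parameters are illustrative assumptions, not values from the review.
      from scipy.integrate import solve_ivp
      import numpy as np

      r, K, kill = 0.3, 1e9, 0.1    # growth rate /day, carrying capacity, therapy kill rate /day

      def tumor(t, y):
          n = y[0]
          return [r * n * (1.0 - n / K) - kill * n]

      sol = solve_ivp(tumor, t_span=(0, 100), y0=[1e6], t_eval=np.linspace(0, 100, 11))
      for t, n in zip(sol.t, sol.y[0]):
          print(f"day {t:5.1f}: {n:.3e} cells")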

  17. The biological microprocessor, or how to build a computer with biological parts.

    PubMed

    Moe-Behrens, Gerd Hg

    2013-01-01

    Systemics, a revolutionary paradigm shift in scientific thinking with applications in systems biology and synthetic biology, has led to the idea of using silicon computers and their engineering principles as a blueprint for the engineering of a similar machine made from biological parts. Here we describe these building blocks and how they can be assembled into a general-purpose computer system, a biological microprocessor. Such a system consists of biological parts building an input/output device, an arithmetic logic unit, a control unit, memory, and wires (busses) to interconnect these components. A biocomputer can be used to monitor and control a biological system. PMID:24688733

  18. The biological microprocessor, or how to build a computer with biological parts

    PubMed Central

    Moe-Behrens, Gerd HG

    2013-01-01

    Systemics, a revolutionary paradigm shift in scientific thinking with applications in systems biology and synthetic biology, has led to the idea of using silicon computers and their engineering principles as a blueprint for the engineering of a similar machine made from biological parts. Here we describe these building blocks and how they can be assembled into a general-purpose computer system, a biological microprocessor. Such a system consists of biological parts building an input/output device, an arithmetic logic unit, a control unit, memory, and wires (busses) to interconnect these components. A biocomputer can be used to monitor and control a biological system. PMID:24688733

  19. 78 FR 77111 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy..., notice is hereby given that the Biological and Environmental Research Advisory Committee will be renewed... to the Director, Office of Science on the biological and environmental research...

  20. 76 FR 78908 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-20

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of renewal of the Biological and Environmental Research Advisory Committee. SUMMARY... Biological and Environmental Research Advisory Committee will be renewed for a two- year period...

  1. 76 FR 31319 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-31

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee..., Office of Biological and Environmental Research, SC-23/Germantown Building, 1000 Independence Avenue,...

  2. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser-resolution approaches when simulating large biological systems. High-performance parallel computers have the potential to address this computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Availability and implementation: Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
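
    The abstract does not show Biocellion's programming interface, so the sketch below is only a generic, serial agent-based illustration of the class of model such frameworks target: differential-adhesion cell sorting on a lattice. The adhesion energies, lattice size and greedy swap rule are assumptions, and nothing here reflects the actual Biocellion API.

      # Generic serial agent-based sketch of cell sorting by differential adhesion.
      # This is NOT the Biocellion API; it only illustrates the class of model
      # (discrete agents on a grid) that such frameworks parallelize.
      import numpy as np

      rng = np.random.default_rng(2)
      N = 40
      grid = rng.choice([1, 2], size=(N, N))                  # two cell types on a lattice

      adhesion = {(1, 1): 2.0, (2, 2): 2.0, (1, 2): 0.5, (2, 1): 0.5}   # assumed energies

      def local_adhesion(g, i, j):
          """Summed adhesion between cell (i, j) and its four neighbours (periodic)."""
          total = 0.0
          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              total += adhesion[(g[i, j], g[(i + di) % N, (j + dj) % N])]
          return total

      for _ in range(50_000):                                 # attempt neighbour swaps
          i, j = rng.integers(N, size=2)
          di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
          k, m = (i + di) % N, (j + dj) % N
          before = local_adhesion(grid, i, j) + local_adhesion(grid, k, m)
          grid[i, j], grid[k, m] = grid[k, m], grid[i, j]
          after = local_adhesion(grid, i, j) + local_adhesion(grid, k, m)
          if after < before:                                  # revert swaps that lose adhesion
              grid[i, j], grid[k, m] = grid[k, m], grid[i, j]

      print("like-type vertical-neighbour fraction:",
            float(np.mean(grid == np.roll(grid, 1, axis=0))))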

  3. 2003 Biology and Biotechnology Research Program Overview and Highlights

    SciTech Connect

    Prange, C

    2003-03-01

    LLNL conducts multidisciplinary bioscience to fill national needs. Our primary roles are to: develop knowledge and tools which enhance national security, including biological, chemical and nuclear capabilities, and energy and environmental security; develop understanding of genetic and biochemical processes to enhance disease prevention, detection and treatment; develop unique biochemical measurement and computational modeling capabilities which enable understanding of biological processes; and develop technology and tools which enhance healthcare. We execute our roles through integrated multidisciplinary programs that apply our competencies in: microbial and mammalian genomics--the characterization of DNA, the genes it encodes, their regulation and function and their role in living systems; protein function and biochemistry - the structure, function, and interaction of proteins and other molecules involved in the integrated biochemical function of the processes of life; computational modeling and understanding of biochemical systems--the application of high-speed computing technology to simulate and visualize complex, integrated biological processes; bioinformatics--databasing, networking, and analysis of biological data; and bioinstrumentation--the application of physical and engineering technologies to novel biological and biochemical measurements, laboratory automation, medical device development, and healthcare technologies. We leverage the Laboratory's exceptional capabilities in the physical, computational, chemical, environmental and engineering sciences. We partner with industry and universities to utilize their state-of-the art technology and science and to make our capabilities and discoveries available to the broader research community.

  4. Tutoring in School Biology by Computer Conference.

    ERIC Educational Resources Information Center

    Baggott, Linda; Wright, Bruce

    1997-01-01

    Describes an exploration of the use of digitized images in teaching biology to school students via the new digital communications channel, the Integrated Services Digital Network (ISDN). Contains 23 references. (DDR)

  5. Center for Computing Research Summer Research Proceedings 2015.

    SciTech Connect

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  6. Computational Approaches to Vestibular Research

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data are done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and…

  7. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  8. Community-driven computational biology with Debian Linux

    PubMed Central

    2010-01-01

    Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984

  9. Enhancing Biological Understanding through Undergraduate Field Research.

    ERIC Educational Resources Information Center

    Hammer, Samuel

    2001-01-01

    Describes a PEET (Partnerships for Enhancing Expertise in Taxonomy) project designed for undergraduate biology students at Boston University's College of General Studies. Reports that the project used a small group field research setting, facilitating critical thinking skills and group dynamics. Discusses the issue of how to introduce and…

  10. Arrhythmogenesis Research: A Perspective from Computational Electrophysiology Viewpoint

    PubMed Central

    Trayanova, Natalia; Plank, Gernot

    2012-01-01

    The mechanisms by which arrhythmias are generated in the heart remain a field of intensive research. Recent advances in computational biology and electrophysiology have enabled researchers to use an alternative tool in the study of arrhythmia mechanisms: multi-scale modeling and simulation of cardiac arrhythmogenesis at the organ level. This article reviews the recent advances and achievements using this approach. PMID:18001976
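
    A minimal sketch of the kind of single-cell building block that such multi-scale simulations rest on (not one of the organ-level models reviewed above): the two-variable FitzHugh-Nagumo excitable-cell equations integrated with forward Euler. Parameter values are standard textbook choices assumed here for illustration.

        # FitzHugh-Nagumo excitable-cell model, forward Euler integration (illustrative only).
        # Organ-level arrhythmia simulations couple many such cells, with far more detailed
        # ionic models, through a tissue-level diffusion term.
        a, b, eps, I_stim = 0.7, 0.8, 0.08, 0.5      # assumed textbook parameters
        dt, steps = 0.01, 200_000

        v, w = -1.2, -0.6                            # membrane potential and recovery variable
        for n in range(steps):
            dv = v - v**3 / 3.0 - w + I_stim         # fast (voltage) equation
            dw = eps * (v + a - b * w)               # slow (recovery) equation
            v, w = v + dt * dv, w + dt * dw
            if n % 20_000 == 0:
                print(f"t = {n * dt:7.1f}   v = {v:+.3f}   w = {w:+.3f}")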

  11. Research on computer systems benchmarking

    NASA Technical Reports Server (NTRS)

    Smith, Alan Jay (Principal Investigator)

    1996-01-01

    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance. The performance impact of optimization in the context of our methodology for CPU performance characterization was based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the afore-mentioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
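
    The merging of machine and program characterizations described above amounts to a dot product: a machine is characterized by the time per abstract operation, a program by how many of each operation it performs, and combining the two estimates execution time. The operation classes and numbers in this sketch are invented for illustration, not the grant's actual Fortran abstract-machine parameters.

        # Hedged sketch of abstract-machine benchmark prediction:
        # predicted time = sum over operation classes of (program op count) x (machine time per op).
        # Operation classes and all numbers are illustrative placeholders.
        machine_times_us = {"flop": 0.02, "int_op": 0.01, "mem_ref": 0.05, "branch": 0.015}
        program_counts = {"flop": 4_000_000, "int_op": 1_500_000, "mem_ref": 6_000_000, "branch": 800_000}

        predicted_us = sum(program_counts[op] * machine_times_us[op] for op in program_counts)
        print(f"predicted execution time: {predicted_us / 1e6:.3f} s")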

  12. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950
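
    A toy instance of the executable-model analyses mentioned above: the script exhaustively explores the state space of a three-gene Boolean network under synchronous update and reports its attractors. The update rules are invented for illustration and have no biological source.

        # Exhaustive attractor search in a small Boolean network (invented rules, illustrative only).
        from itertools import product

        def update(state):
            a, b, c = state
            return (b and not c,    # a' = b AND NOT c
                    a or c,         # b' = a OR c
                    not a)          # c' = NOT a

        attractors = set()
        for start in product([False, True], repeat=3):
            seen, s = [], start
            while s not in seen:                       # follow the trajectory until a state repeats
                seen.append(s)
                s = update(s)
            cycle = seen[seen.index(s):]               # the repeating tail is the attractor
            attractors.add(tuple(sorted(cycle)))

        for att in attractors:
            print("attractor:", att)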

  13. Molecular biology research in neuropsychiatry: India's contribution.

    PubMed

    Sathyanarayana Rao, T S; Ramesh, B N; Vasudevaraju, P; Rao, K S J

    2010-01-01

    Neuropsychiatric disorders represent the second largest cause of morbidity worldwide. These disorders have complex etiology and pathophysiology. The major lacunae in the biology of psychiatric disorders lie in genomics, biomarkers and drug discovery, which are needed for early detection of disease and have great application in its clinical management. Indian psychiatrists and scientists have played a significant role in filling these gaps. The present annotation provides in-depth information on research contributions to the molecular biology of neuropsychiatric disorders in India. There is a great need for further research to understand the genetic associations of neuropsychiatric disorders, and molecular biology has a tremendous role to play. Alterations in gene expression are implicated in the pathogenesis of several neuropsychiatric disorders, including drug addiction and depression. The development of transgenic neuropsychiatric animal models is a major thrust area, yet there are no studies from India in this direction. Biomarkers in neuropsychiatric disorders are of great help to clinicians for early diagnosis. Gene-environment interactions, DNA instability and oxidative stress remain understudied in neuropsychiatric disorders, and efforts in these directions would make India a pioneer in these areas of research. In conclusion, we provide an insight into future research directions in the molecular understanding of neuropsychiatric disorders. PMID:21836667

  14. Biologically Inspired Micro-Flight Research

    NASA Technical Reports Server (NTRS)

    Raney, David L.; Waszak, Martin R.

    2003-01-01

    Natural fliers demonstrate a diverse array of flight capabilities, many of which are poorly understood. NASA has established a research project to explore and exploit flight technologies inspired by biological systems. One part of this project focuses on dynamic modeling and control of micro aerial vehicles that incorporate flexible wing structures inspired by natural fliers such as insects, hummingbirds and bats. With a vast number of potential civil and military applications, micro aerial vehicles represent an emerging sector of the aerospace market. This paper describes an ongoing research activity in which mechanization and control concepts for biologically inspired micro aerial vehicles are being explored. Research activities focusing on a flexible fixed-wing micro aerial vehicle design and a flapping-based micro aerial vehicle concept are presented.

  15. Computer Modelling of Biological Molecules: Free Resources on the Internet.

    ERIC Educational Resources Information Center

    Millar, Neil

    1996-01-01

    Describes a three-dimensional computer modeling system for biological molecules which is suitable for sixth-form teaching. Consists of the modeling program "RasMol" together with structure files of proteins, DNA, and small biological molecules. Describes how the whole system can be downloaded from various sites on the Internet. (Author/JRH)

  16. A Descriptive Analysis of Computer-Assisted Teaching and Learning in Molecular Biological Education

    ERIC Educational Resources Information Center

    Li, Guangxing; Yin, Jiechao; Ren, Yudong; Wang, Binjie; Ren, Xiaofeng

    2006-01-01

    The role and importance of computer-assisted teaching and learning in molecular biology-related education and research have been emphasized and pinpointed. In this study, viewpoints on the benefits of computer-assisted teaching and learning are provided, along with discussion of how to apply it more efficiently in the process of knowledge acquisition and…

  17. Modeling Mendel's Laws on Inheritance in Computational Biology and Medical Sciences

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid; Singh, Mankiran; Singh, Satpal

    2011-01-01

    The current research article is based on a simple and practical way of employing the computational power of widely available, versatile software MS Excel 2007 to perform interactive computer simulations for undergraduate/graduate students in biology, biochemistry, biophysics, microbiology, medicine in college and university classroom setting. To…
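
    The same kind of classroom simulation can be sketched outside a spreadsheet; the hedged example below (plain Python, not the authors' Excel worksheets) draws one allele from each parent of an Aa x Aa monohybrid cross at random and recovers the expected roughly 3:1 dominant-to-recessive phenotype ratio.

        # Monte Carlo simulation of a monohybrid cross (Aa x Aa), illustrating Mendel's law of segregation.
        import random

        def offspring():
            # each parent contributes one allele of its Aa pair at random
            return random.choice("Aa") + random.choice("Aa")

        n = 100_000
        dominant = sum(1 for _ in range(n) if "A" in offspring())
        print(f"dominant phenotype:  {dominant / n:.3f}  (expected 0.75)")
        print(f"recessive phenotype: {1 - dominant / n:.3f}  (expected 0.25)")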

  18. The computational linguistics of biological sequences

    SciTech Connect

    Searls, D.

    1995-12-31

    This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology which was held in the United Kingdom from July 16 to 19, 1995. Protein sequences are analogous in many respects, particularly in their folding behavior. Proteins have a much richer variety of interactions, but in theory the same linguistic principles could come to bear in describing dependencies between distant residues that arise by virtue of three-dimensional structure. This tutorial will concentrate on nucleic acid sequences.
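
    A hedged illustration of the linguistic view of nucleic acid sequences (not taken from the tutorial itself): nested base-pair dependencies in RNA behave like the nested dependencies of a context-free grammar, and the classic Nussinov dynamic program below counts the maximum number of nested Watson-Crick or wobble pairs in a toy sequence.

        # Nussinov-style dynamic programming: maximum number of nested base pairs in an RNA sequence.
        # The nested pairing mirrors the context-free dependencies discussed in linguistic treatments
        # of sequences; the example sequence is arbitrary.
        PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

        def max_pairs(seq, min_loop=3):
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i + 1][j]                          # i left unpaired
                    if (seq[i], seq[j]) in PAIRS:
                        best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
                    for k in range(i + 1, j):                    # split into two independent substructures
                        best = max(best, dp[i][k] + dp[k + 1][j])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(max_pairs("GGGAAAUCC"))    # small hairpin-like example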

  19. Computer display and manipulation of biological molecules

    NASA Technical Reports Server (NTRS)

    Coeckelenbergh, Y.; Macelroy, R. D.; Hart, J.; Rein, R.

    1978-01-01

    This paper describes a computer model that was designed to investigate the conformation of molecules, macromolecules and subsequent complexes. Utilizing an advanced 3-D dynamic computer display system, the model is sufficiently versatile to accommodate a large variety of molecular input and to generate data for multiple purposes such as visual representation of conformational changes, and calculation of conformation and interaction energy. Molecules can be built on the basis of several levels of information. These include the specification of atomic coordinates and connectivities and the grouping of building blocks and duplicated substructures using symmetry rules found in crystals and polymers such as proteins and nucleic acids. Called AIMS (Ames Interactive Molecular modeling System), the model is now being used to study pre-biotic molecular evolution toward life.

  20. Coarse-graining methods for computational biology.

    PubMed

    Saunders, Marissa G; Voth, Gregory A

    2013-01-01

    Connecting the molecular world to biology requires understanding how molecular-scale dynamics propagate upward in scale to define the function of biological structures. To address this challenge, multiscale approaches, including coarse-graining methods, become necessary. We discuss here the theoretical underpinnings and history of coarse-graining and summarize the state of the field, organizing key methodologies based on an emerging paradigm for multiscale theory and modeling of biomolecular systems. This framework involves an integrated, iterative approach to couple information from different scales. The primary steps, which coincide with key areas of method development, include developing first-pass coarse-grained models guided by experimental results, performing numerous large-scale coarse-grained simulations, identifying important interactions that drive emergent behaviors, and finally reconnecting to the molecular scale by performing all-atom molecular dynamics simulations guided by the coarse-grained results. The coarse-grained modeling can then be extended and refined, with the entire loop repeated iteratively if necessary. PMID:23451897
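
    A hedged sketch of the first step in the workflow outlined above, mapping an all-atom structure onto coarse-grained beads by replacing each atom group with its center of mass. The atom grouping, masses and coordinates are invented; real coarse-grained models also derive effective interactions between the beads.

        # Map groups of atoms to center-of-mass "beads" (toy data, illustrative only).
        import numpy as np

        coords = np.array([[0.0, 0.0, 0.0],     # one row per atom: x, y, z
                           [1.0, 0.2, 0.1],
                           [1.9, 1.1, 0.0],
                           [3.1, 1.0, 0.4],
                           [4.0, 1.8, 0.2]])
        masses = np.array([12.0, 12.0, 16.0, 12.0, 14.0])
        groups = [[0, 1, 2], [3, 4]]            # which atoms form each coarse-grained bead

        beads = np.array([np.average(coords[g], axis=0, weights=masses[g]) for g in groups])
        print(beads)                             # one mass-weighted coordinate per bead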

  1. Using a Computer Animation to Teach High School Molecular Biology

    ERIC Educational Resources Information Center

    Rotbain, Yosi; Marbach-Ad, Gili; Stavy, Ruth

    2008-01-01

    We present an active way to use a computer animation in secondary molecular genetics class. For this purpose we developed an activity booklet that helps students to work interactively with a computer animation which deals with abstract concepts and processes in molecular biology. The achievements of the experimental group were compared with those…

  2. Structural biological materials: Overview of current research

    NASA Astrophysics Data System (ADS)

    Chen, P.-Y.; Lin, A. Y.-M.; Stokes, A. G.; Seki, Y.; Bodde, S. G.; McKittrick, J.; Meyers, M. A.

    2008-06-01

    Through specific biological examples this article illustrates the complex designs that have evolved in nature to address strength, toughness, and weight optimization. Current research is reviewed, and the structure of some shells, bones, antlers, crab exoskeletons, and avian feathers and beaks is described using the principles of materials science and engineering by correlating the structure with mechanical properties. In addition, the mechanisms of deformation and failure are discussed.

  3. Onchocerciasis control: biological research is still needed.

    PubMed

    Boussinesq, M

    2008-09-01

    Achievements obtained by the onchocerciasis control programmes should not lead to a relaxation in the biological research on Onchocerca volvulus. Issues such as the Loa loa-related post-ivermectin serious adverse events, the uncertainties as to whether onchocerciasis can be eliminated by ivermectin treatments, and the possible emergence of ivermectin-resistant O. volvulus populations should be addressed proactively. Doxycycline, moxidectin and emodepside appear to be promising as alternative drugs against onchocerciasis, but support for research in immunology and genomics should also be increased to develop new control tools, including both vaccines and macrofilaricidal drugs. PMID:18814732

  4. Computational Biology for Drug Discovery and Characterization

    SciTech Connect

    Lightstone, F C; Bennion, B J

    2009-02-24

    We proposed to determine the underpinnings of a high-throughput computational infrastructure that would support future efforts in therapeutics against biothreat pathogens. Existing modeling capabilities focus on pathogen detection, but extending such capabilities to high-throughput molecular docking would lead to a proactive method to guide the development of therapeutics. This project will focus on determining the feasibility of extending current databases to accommodate molecular docking. We will also examine the feasibility of massive parallelization of docking algorithms and the utility of docking libraries. Transferring this new technique to a high-performance computing (HPC) platform at LLNL would result in a unique capability not available elsewhere in government or industry. We have accomplished the proposed work defined in this LDRD FS study. (1) We successfully defined the feasibility of using three different small-molecule databases for high-throughput docking: the NCI diversity set, ZINC and the ACD. (2) We analyzed the accuracy and parallelization capabilities of several docking programs, including DOCK, AutoDock, FlexX, Glide, and eHiTS. Each program is completely amenable to parallel execution. The fastest code was eHiTS, and Glide was the most accurate. (3) Customizing large libraries was cumbersome without the proper software, making the databases difficult to tailor. The ZINC database has some prefiltered versions. (4) Scripts were created for quality and job control functions. Further development is needed for analysis and visualization needs. The successful conclusion of this project enables LLNL to have a high-throughput computational docking capability where we have evaluated the codes against specific docking problems and utilized LLNL's HPC for significant gains in performance. We have established a CRADA with an industrial partner (funded by the National Institutes of Health) that will fully utilize this technology for biodefense therapeutic
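
    The parallelization the study found in the docking codes is essentially independent work per ligand, so the dispatch layer can be very simple. The hedged sketch below shows that pattern with Python's multiprocessing; score_ligand is a placeholder standing in for an invocation of a real docking engine such as DOCK or AutoDock, and the ZINC-style identifiers are fabricated.

        # Embarrassingly parallel docking dispatch: one independent task per ligand (placeholder scoring).
        from multiprocessing import Pool

        def score_ligand(ligand_id):
            # placeholder "docking": a real pipeline would run a docking engine here
            return ligand_id, -float(hash(ligand_id) % 1000) / 100.0

        if __name__ == "__main__":
            library = [f"ZINC{i:08d}" for i in range(1000)]      # fabricated ligand identifiers
            with Pool(processes=8) as pool:
                scores = pool.map(score_ligand, library)
            print(sorted(scores, key=lambda s: s[1])[:5])        # most negative placeholder score first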

  5. 78 FR 12043 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. No... Science, Office of Biological and Environmental Research, SC-23/Germantown Building, 1000...

  6. Using computer algebra and SMT solvers in algebraic biology

    NASA Astrophysics Data System (ADS)

    Pineda Osorio, Mateo

    2014-05-01

    Biological processes are represented as Boolean networks in discrete time. The dynamics of these networks are approached with the help of SMT solvers and computer algebra; software such as Maple and Z3 was used in this case. The number of stationary states for each network was calculated. The network studied here corresponds to the immune system under the effects of drastic mood changes. Mood is considered as a Boolean variable that affects the entire dynamics of the immune system, changing the Boolean satisfiability and the number of stationary states of the immune network. The results show Z3's great potential as an SMT solver. Some of these results were verified in Maple, although Maple proved less suitable for this approach. The solving code was constructed using Z3-Python and Z3-SMT-LiB. The results are important for systems biology and are expected to help in the design of immune therapies. As a future line of research, more complex Boolean network representations of the immune system, as well as of the whole psychological apparatus, are suggested.
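
    A hedged miniature of the approach described above (the three-variable network below is invented, not the immune/mood network of the paper): with the Z3 Python bindings, the stationary states of a Boolean network can be enumerated by asserting that every variable equals its own update function and blocking each model as it is found.

        # Enumerate stationary (fixed-point) states of a toy Boolean network with the Z3 SMT solver.
        from z3 import Bools, Solver, Or, And, Not, is_true, sat

        a, b, c = Bools("a b c")
        s = Solver()
        # stationarity: each variable equals its (invented) update rule
        s.add(a == Or(b, Not(c)))
        s.add(b == And(a, c))
        s.add(c == Not(b))

        while s.check() == sat:
            m = s.model()
            print("stationary state:", {str(v): is_true(m[v]) for v in (a, b, c)})
            s.add(Or(*[v != m[v] for v in (a, b, c)]))   # block this model before searching again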

  7. Biological effectiveness of neutrons: Research needs

    SciTech Connect

    Casarett, G.W.; Braby, L.A.; Broerse, J.J.; Elkind, M.M.; Goodhead, D.T.; Oleinick, N.L.

    1994-02-01

    The goal of this report was to provide a conceptual plan for a research program that would provide a basis for determining more precisely the biological effectiveness of neutron radiation with emphasis on endpoints relevant to the protection of human health. This report presents the findings of the experts for seven particular categories of scientific information on neutron biological effectiveness. Chapter 2 examines the radiobiological mechanisms underlying the assumptions used to estimate human risk from neutrons and other radiations. Chapter 3 discusses the qualitative and quantitative models used to organize and evaluate experimental observations and to provide extrapolations where direct observations cannot be made. Chapter 4 discusses the physical principles governing the interaction of radiation with biological systems and the importance of accurate dosimetry in evaluating radiation risk and reducing the uncertainty in the biological data. Chapter 5 deals with the chemical and molecular changes underlying cellular responses and the LET dependence of these changes. Chapter 6, in turn, discusses those cellular and genetic changes which lead to mutation or neoplastic transformation. Chapters 7 and 8 examine deterministic and stochastic effects, respectively, and the data required for the prediction of such effects at different organizational levels and for the extrapolation from experimental results in animals to risks for man. Gaps and uncertainties in this data are examined relative to data required for establishing radiation protection standards for neutrons and procedures for the effective and safe use of neutron and other high-LET radiation therapy.

  8. Gordon Research Conference on Mammary Gland Biology

    SciTech Connect

    Not Available

    1989-01-01

    The 1989 conference was the tenth in the series of biennial Gordon Research Conferences on Mammary Gland Biology. Traditionally this conference brings together scientists from diverse backgrounds and experience but with a common interest in the biology of the mammary gland. Investigators from agricultural and medical schools, biochemists, cell and molecular biologists, endocrinologists, immunologists, and representatives from the emerging biotechnology industries met to discuss current concepts and results on the function and regulation of the normal and neoplastic mammary gland in a variety of species. Of the participants, approximately three-fourths were engaged in studying normal mammary gland function, whereas the other quarter were studying the neoplastic gland. The interaction among scientists, clinicians, and veterinarians examining both normal and neoplastic cell function serves to foster the multi-disciplinary goals of the conference and has stimulated many cooperative projects among participants in previous years.

  9. Computational Approaches for Translational Clinical Research in Disease Progression

    PubMed Central

    McGuire, Mary F.; Iyengar, M. Sriram; Mercer, David W.

    2011-01-01

    Today, there is an ever-increasing amount of biological and clinical data available that could be used to enhance a systems-based understanding of disease progression through innovative computational analysis. In this paper we review a selection of published research regarding computational methodologies, primarily from systems biology, that support translational research from the molecular level to the bedside, with a focus on applications in trauma and critical care. Trauma is the leading cause of mortality in Americans under 45 years of age, and its rapid progression offers both opportunities and challenges for computational analysis of trends in molecular patterns associated with outcomes and therapeutic interventions. This review presents methods and domain-specific examples that may inspire the development of new algorithms and computational methods that utilize both molecular and clinical data for diagnosis, prognosis and therapy in disease progression. PMID:21712727

  10. Biology of Aging: Research Today for a Healthier Tomorrow

    MedlinePlus

    Biology of Aging: Research Today for a Healthier Tomorrow. An overview from the National Institute on Aging, which supports research on the biology of aging at major institutions across the United States and internationally.

  11. DOE EPSCoR Initiative in Structural and computational Biology/Bioinformatics

    SciTech Connect

    Wallace, Susan S.

    2008-02-21

    The overall goal of the DOE EPSCoR Initiative in Structural and Computational Biology was to enhance the competitiveness of Vermont research in these scientific areas. To develop self-sustaining infrastructure, we increased the critical mass of faculty, developed shared resources that made junior researchers more competitive for federal research grants, implemented programs to train graduate and undergraduate students who participated in these research areas, and provided seed money for research projects. During the time period funded by this DOE initiative: (1) four new faculty were recruited to the University of Vermont using DOE resources, three in Computational Biology and one in Structural Biology; (2) technical support was provided for the Computational and Structural Biology facilities; (3) twenty-two graduate students were directly funded by fellowships; (4) fifteen undergraduate students were supported during the summer; and (5) twenty-eight pilot projects were supported. Taken together, these dollars resulted in a plethora of published papers, many in high-profile journals in the fields, and directly impacted competitive extramural funding based on structural or computational biology, resulting in 49 million dollars awarded in grants (Appendix I), a 600% return on the investment by DOE, the State and the University.

  12. Computation and graphics in mathematical research

    SciTech Connect

    Hoffman, D.A.; Spruck, J.

    1993-06-01

    Current research is described on: grain boundaries and dislocations in compound polymers, boundary value problems for hypersurfaces of constant Gaussian curvature, and discrete computational geometry. 19 refs, 4 figs.

  13. A systems biology approach to infectious disease research: innovating the pathogen-host research paradigm.

    PubMed

    Aderem, Alan; Adkins, Joshua N; Ansong, Charles; Galagan, James; Kaiser, Shari; Korth, Marcus J; Law, G Lynn; McDermott, Jason G; Proll, Sean C; Rosenberger, Carrie; Schoolnik, Gary; Katze, Michael G

    2011-01-01

    The twentieth century was marked by extraordinary advances in our understanding of microbes and infectious disease, but pandemics remain, food and waterborne illnesses are frequent, multidrug-resistant microbes are on the rise, and the needed drugs and vaccines have not been developed. The scientific approaches of the past-including the intense focus on individual genes and proteins typical of molecular biology-have not been sufficient to address these challenges. The first decade of the twenty-first century has seen remarkable innovations in technology and computational methods. These new tools provide nearly comprehensive views of complex biological systems and can provide a correspondingly deeper understanding of pathogen-host interactions. To take full advantage of these innovations, the National Institute of Allergy and Infectious Diseases recently initiated the Systems Biology Program for Infectious Disease Research. As participants of the Systems Biology Program, we think that the time is at hand to redefine the pathogen-host research paradigm. PMID:21285433

  14. A first course in computing with applications to biology.

    PubMed

    Libeskind-Hadas, Ran; Bush, Eliot

    2013-09-01

    We believe that undergraduate biology students must acquire a foundational background in computing including how to formulate a computational problem; develop an algorithmic solution; implement their solution in software and then test, document and use their code to explore biological phenomena. Moreover, by learning these skills in the first year, students acquire a powerful tool set that they can use and build on throughout their studies. To address this need, we have developed a first-year undergraduate course that teaches students the foundations of computational thinking and programming in the context of problems in biology. This article describes the structure and content of the course and summarizes assessment data on both affective and learning outcomes. PMID:23449003

  15. Radioisotopic methods for biological and medical research

    SciTech Connect

    Knoche, H.W.

    1991-01-01

    This book provides a theoretical basis for the effective and safe use of radioactive materials in research. Particular attention is given to the four major topic areas specified in NRC's license application forms: (1) principles and practices of radiation protection; (2) radioactivity measurement, standardization and monitoring techniques, and instruments; (3) mathematics and calculations basic to the use and measurement of radioactivity; (4) biological effects of radiation. Overview and background information, including a section reviewing nuclear physics, is used where needed throughout the text, and problem sets are included in many of the chapters. Appendices for physical constants and conversion factors and for answers to problems are added.

  16. Biological research on a Space Station

    NASA Technical Reports Server (NTRS)

    Krikorian, A. D.; Johnson, Catherine C.

    1990-01-01

    A Space Station can provide reliable, long duration access to µg environments for basic and applied biological research. The uniqueness of access to near-weightless environments to probe fundamental questions of significance to gravitational and Space biologists can be exploited from many vantage points. Access to centrifuge facilities that can provide 1 g and hypo-g controls will permit identification of gravity-dependent or primary effects. Understanding secondary effects of the µg environment as well will allow a fuller exploitation of the Space environment.

  17. Race in Biological and Biomedical Research

    PubMed Central

    Cooper, Richard S.

    2013-01-01

    The concept of race has had a significant influence on research in human biology since the early 19th century. But race was given its meaning and social impact in the political sphere and subsequently intervened in science as a foreign concept, not grounded in the dominant empiricism of modern biology. The uses of race in science were therefore often disruptive and controversial; at times, science had to be retrofitted to accommodate race, and science in turn was often used to explain and justify race. This relationship was unstable in large part because race was about a phenomenon that could not be observed directly, being based on claims about the structure and function of genomic DNA. Over time, this relationship has been characterized by distinct phases, evolving from the inference of genetic effects based on the observed phenotype to the measurement of base-pair variation in DNA. Despite this fundamental advance in methodology, liabilities imposed by the dual political-empirical origins of race persist. On the one hand, an optimistic prediction can be made that just as geology made it possible to overturn the myth of the recent creation of the earth and evolution told us where the living world came from, molecular genetics will end the use of race in biology. At the same time, because race is fundamentally a political and not a scientific idea, it is possible that only a political intervention will relieve us of the burden of race. PMID:24186487

  18. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on constantly increasing sequence data. The most critical problems that have caught researchers' attention include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, and analysis of microarray gene expression data. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying these algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy, and decreasing the running time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
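
    A toy version of the feature-encoding-plus-classifier pipeline surveyed above: each protein sequence is encoded as its amino-acid composition vector and labelled by its nearest neighbour in that feature space. The sequences and labels are fabricated placeholders; real studies use far richer encodings and the learning algorithms listed in the paper.

        # Amino-acid composition features + 1-nearest-neighbour classification (fabricated toy data).
        import math

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

        def composition(seq):
            return [seq.count(aa) / len(seq) for aa in AMINO_ACIDS]

        train = [("MKKLLPTAAAGLLLLAAQPAMA", "membrane"),     # placeholder sequences and labels
                 ("MSDNGPQNQRNAPRITFGGPSD", "soluble"),
                 ("MFVFLVLLPLVSSQCVNLTTRT", "membrane")]

        def classify(seq):
            x = composition(seq)
            return min(train, key=lambda item: math.dist(x, composition(item[0])))[1]

        print(classify("MKTAYIAKQRQISFVKSHFSRQ"))             # label of the nearest training sequence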

  19. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology

    PubMed Central

    Fong, Stephen S.

    2014-01-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design. PMID:25379141
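
    One widely used prospective strain-design calculation of the kind referred to above is flux balance analysis, which poses cell behaviour as a linear program: maximize a biomass flux subject to steady-state mass balance and flux bounds. The sketch below solves a deliberately tiny, invented three-reaction network with scipy; it illustrates the formulation, not a model from the review.

        # Tiny flux balance analysis (FBA): maximize "biomass" flux subject to S v = 0 and flux bounds.
        import numpy as np
        from scipy.optimize import linprog

        # reactions: v1 = uptake -> A,  v2 = A -> B,  v3 = B -> biomass   (invented network)
        S = np.array([[1, -1,  0],     # metabolite A balance
                      [0,  1, -1]])    # metabolite B balance
        c = [0, 0, -1]                 # linprog minimizes, so negate to maximize v3
        bounds = [(0, 10), (0, None), (0, None)]    # uptake capped at 10 (arbitrary units)

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal fluxes:", res.x, "  biomass flux:", -res.fun)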

  20. 2010 Plant Molecular Biology Gordon Research Conference

    SciTech Connect

    Michael Sussman

    2010-07-23

    The Plant Molecular Biology Conference has traditionally covered a breadth of exciting topics and the 2010 conference will continue in that tradition. Emerging concerns about food security have inspired a program with three main themes: (1) genomics, natural variation and breeding to understand adaptation and crop improvement, (2) hormonal cross talk, and (3) plant/microbe interactions. There are also sessions on epigenetics and proteomics/metabolomics. Thus this conference will bring together a range of disciplines, will foster the exchange of ideas and enable participants to learn of the latest developments and ideas in diverse areas of plant biology. The conference provides an excellent opportunity for individuals to discuss their research because additional speakers in each session will be selected from submitted abstracts. There will also be a poster session each day for a two-hour period prior to dinner. In particular, this conference plays a key role in enabling students and postdocs (the next generation of research leaders) to mingle with pioneers in multiple areas of plant science.

  1. The Learning of Biology: A Structural Basis for Future Research

    ERIC Educational Resources Information Center

    Murray, Darrel L.

    1977-01-01

    This article reviews recent research studies and experiences relating the learning theories of Ausubel to biology instruction. Also some suggestions are made for future research on the learning of biology. (MR)

  2. Trademark Research with the Computer.

    ERIC Educational Resources Information Center

    Jordan, Anne S.

    1985-01-01

    Discusses computer use in practice of trademark law by following adoption, filing, and prior use of the trademark "Aspen" (for a fruit juice drink). Databases searched to track previous use of mark, trace possible conflicts, and assist in their resolution are mentioned. Database chart and list of vendors is included. (EJS)

  3. Research on Computers and Problem Solving.

    ERIC Educational Resources Information Center

    Burton, John K.; And Others

    1988-01-01

    Eight articles review and report on research involving computers and problem solving skills. Topics discussed include research design; problem solving skills and programing languages, including BASIC and LOGO; computer anxiety; diagnostic programs for arithmetic problems; and relationships between ability and problem solving scores and between…

  4. Division of Biological and Medical Research annual research summary, 1983

    SciTech Connect

    Barr, S.H.

    1984-08-01

    This research summary contains brief descriptions of research in the following areas: (1) mechanisms of hepatocarcinogenesis; (2) role of metals in cocarcinogenesis and the use of liposomes for metal mobilization; (3) control of mutagenesis and cell differentiation in cultured cells by tumor promoters; (4) radiation effects in mammalian cells; (5) radiation carcinogenesis and radioprotectors; (6) life shortening, tumor induction, and tissue dose for fission-neutron and gamma-ray irradiations; (7) mammalian genetics and biostatistics; (8) radiation toxicity studies; (9) hematopoiesis in chronic toxicity; (10) molecular biology studies; (11) chemical toxicology; (12) carcinogen identification and metabolism; (13) metal metabolism and toxicity; and (14) neurobehavioral chronobiology. (ACR)

  5. Graphics supercomputer for computational fluid dynamics research

    NASA Astrophysics Data System (ADS)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (PC-486 DX2) with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room has been converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  6. Space Station Biological Research Project Habitat: Incubator

    NASA Technical Reports Server (NTRS)

    Nakamura, G. J.; Kirven-Brooks, M.; Scheller, N. M.

    2001-01-01

    Developed as part of the suite of Space Station Biological Research Project (SSBRP) hardware to support research aboard the International Space Station (ISS), the Incubator is a temperature-controlled chamber for conducting life science research with small animal, plant and microbial specimens. The Incubator is designed for use only on the ISS and is transported to/from the ISS, unpowered and without specimens, in the Multi-Purpose Logistics Module (MPLM) of the Shuttle. The Incubator interfaces with the three SSBRP Host Systems: the Habitat Holding Racks (HHR), the Life Sciences Glovebox (LSG) and the 2.5 m Centrifuge Rotor (CR), providing investigators with the ability to conduct research in microgravity and at variable gravity levels of up to 2 g. The temperature within the Specimen Chamber can be controlled between 4 and 45 °C. Cabin air is recirculated within the Specimen Chamber and can be exchanged with the ISS cabin at a rate of approximately 50 cc/min. The humidity of the Specimen Chamber is monitored. The Specimen Chamber has a usable volume of approximately 19 liters and contains two (2) connectors at 28 V dc (60 W) for science equipment; 5 dedicated thermometers for science; ports to support analog and digital signals from experiment unique sensors or other equipment; an Ethernet port; and a video port. It is currently manifested for UF-3 and will be launched integrated within the first SSBRP Habitat Holding Rack.

  7. Evaluating Computer Lab Modules for Large Biology Courses.

    ERIC Educational Resources Information Center

    Eichinger, David C.; And Others

    This paper describes the first phase of a study to investigate students' evaluations of computer laboratory modules in a university-level, non-majors biology course. The National Science Foundation-funded project has two primary goals: (1) to develop programmable, multifunctional Bio LabStations for data collection and analysis, lab extensions,…

  8. Biology Students Building Computer Simulations Using StarLogo TNG

    ERIC Educational Resources Information Center

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  9. Chaste: using agile programming techniques to develop computational biology software.

    PubMed

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we will begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. A case study is a project of our own, Cancer, Heart and Soft Tissue Environment, which is a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models--of the heart and other organs--and more efficient numerical techniques that are currently being developed by many research groups worldwide. PMID:18565813
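
    A hedged miniature of the test-driven style advocated above (plain Python rather than code from the Chaste library itself): the tests state the intended behaviour of a small model component first, here a logistic-growth step, and the implementation exists to make them pass.

        # Test-driven development in miniature: tests express the intended behaviour of a model component.
        import unittest

        def logistic_step(n, r, k, dt):
            """Advance population n by one forward-Euler step of dn/dt = r * n * (1 - n / k)."""
            return n + dt * r * n * (1.0 - n / k)

        class TestLogisticStep(unittest.TestCase):
            def test_carrying_capacity_is_a_fixed_point(self):
                self.assertAlmostEqual(logistic_step(100.0, r=0.5, k=100.0, dt=0.1), 100.0)

            def test_small_population_grows(self):
                self.assertGreater(logistic_step(1.0, r=0.5, k=100.0, dt=0.1), 1.0)

        if __name__ == "__main__":
            unittest.main()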

  10. Space plant biology research in Lithuania.

    PubMed

    Ričkienė, Aurika

    2012-09-01

    In 1957, the Soviet Union launched the first artificial Earth satellite, initiating its space exploration programs. Throughout the rest of the twentieth century, the development of these space programs received special attention from Soviet Union authorities. Scientists from the former Soviet Republics, including Lithuania, participated in these programs. From 1971 to 1990, Lithuanians designed more than 20 experiments on higher plant species during space flight. Some of these experiments had never before been attempted and, therefore, made scientific history. However, the formation and development of space plant biology research in Lithuania or its origins, context of formation, and placement in a worldwide context have not been explored from a historical standpoint. By investigating these topics, this paper seeks to construct an image of the development of a very specific field of science in a small former Soviet republic. PMID:22613222

  11. A Systems Biology Approach to Infectious Disease Research: Innovating the Pathogen-Host Research Paradigm

    SciTech Connect

    Aderem, Alan; Adkins, Joshua N.; Ansong, Charles; Galagan, James; Kaiser, Shari; Korth, Marcus J.; Law, G. L.; McDermott, Jason E.; Proll, Sean; Rosenberger, Carrie; Schoolnik, Gary; Katze, Michael G.

    2011-02-01

    The 20th century was marked by extraordinary advances in our understanding of microbes and infectious disease, but pandemics remain, food and water borne illnesses are frequent, multi-drug resistant microbes are on the rise, and the needed drugs and vaccines have not been developed. The scientific approaches of the past—including the intense focus on individual genes and proteins typical of molecular biology—have not been sufficient to address these challenges. The first decade of the 21st century has seen remarkable innovations in technology and computational methods. These new tools provide nearly comprehensive views of complex biological systems and can provide a correspondingly deeper understanding of pathogen-host interactions. To take full advantage of these innovations, the National Institute of Allergy and Infectious Diseases recently initiated the Systems Biology Program for Infectious Disease Research. As participants of the Systems Biology Program we think that the time is at hand to redefine the pathogen-host research paradigm.

  12. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  13. Effects of Computer Assisted Instruction (CAI) on Secondary School Students' Performance in Biology

    ERIC Educational Resources Information Center

    Yusuf, Mudasiru Olalere; Afolabi, Adedeji Olufemi

    2010-01-01

    This study investigated the effects of computer assisted instruction (CAI) on secondary school students' performance in biology. The influence of gender on the performance of students exposed to the CAI package in individualised or cooperative learning settings was also examined. The research was quasi-experimental, involving a 3 x 2 factorial…

  14. Research Guidelines for Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Hickey, Albert E.

    Prepared for the Defense Advanced Research Projects Agency (ARPA), this report contains 59 recommendations for research and development in support of computer-assisted instruction (CAI). The guidelines were derived from interviews with 14 leading education researchers. They cover the following learning and instruction variables: (1) learning…

  15. Has Modern Biology Entered the Mouth? The Clinical Impact of Biological Research.

    ERIC Educational Resources Information Center

    Baum, Bruce J.

    1991-01-01

    Three areas of biological research that are beginning to have an impact on clinical medicine are examined, including molecular biology, cell biology, and biotechnology. It is concluded that oral biologists and educators must work cooperatively to bring rapid biological and biomedical advances into dental training in a meaningful way. (MSE)

  16. Computational Proteomics: High-throughput Analysis for Systems Biology

    SciTech Connect

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scales of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems level investigations are relying more and more on computational analyses, especially in the field of proteomics generating large-scale global data.

  17. Beyond moore computing research challenge workshop report.

    SciTech Connect

    Huey, Mark C.; Aidun, John Bahram

    2013-10-01

    We summarize the presentations and break out session discussions from the in-house workshop that was held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.

  18. Computational design of digital and memory biological devices.

    PubMed

    Rodrigo, Guillermo; Jaramillo, Alfonso

    2007-12-01

    The use of combinatorial optimization techniques with computational design allows the development of automated methods to design biological systems. Automatic design integrates design principles in an unsupervised algorithm to sample a larger region of the biological network space, at the topology and parameter levels. The design of novel synthetic transcriptional networks with targeted behaviors will be key to understand the design principles underlying biological networks. In this work, we evolve transcriptional networks towards a targeted dynamics, by using a library of promoters and coding sequences, to design a complex biological memory device. The designed sequential transcription network implements a JK-Latch, which is fully predictable and richer than other memory devices. Furthermore, we present designs of transcriptional devices behaving as logic gates, and we show how to create digital behavior from analog promoters. Our procedure allows us to propose a scenario for the evolution of multi-functional genetic networks. In addition, we discuss the decomposability of regulatory networks in terms of genetic modules to develop a given cellular function. Summary. We show how to use an automated procedure to design logic and sequential transcription circuits. This methodology will allow advancing the rational design of biological devices to more complex systems, and we propose the first design of a biological JK-latch memory device. PMID:19003443
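
    For readers unfamiliar with the targeted behaviour, the hedged sketch below simulates an idealized clocked JK latch in ordinary software: J=K=0 holds the state, J=1/K=0 sets it, J=0/K=1 resets it, and J=K=1 toggles it. This is the input-output specification the designed transcriptional network is evolved to reproduce, not a genetic circuit itself.

        # Reference behaviour of an idealized JK latch (software model of the targeted specification).
        def jk_step(q, j, k):
            if j and k:
                return not q     # toggle
            if j:
                return True      # set
            if k:
                return False     # reset
            return q             # hold

        q = False
        for j, k in [(1, 0), (0, 0), (1, 1), (1, 1), (0, 1), (0, 0)]:
            q = jk_step(q, bool(j), bool(k))
            print(f"J={j} K={k} -> Q={int(q)}")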

  19. Argonne's Magellan Cloud Computing Research Project

    ScienceCinema

    Beckman, Pete

    2013-04-19

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  20. Argonne's Magellan Cloud Computing Research Project

    SciTech Connect

    Beckman, Pete

    2009-01-01

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  1. Natural computing for mechanical systems research: A tutorial overview

    NASA Astrophysics Data System (ADS)

    Worden, Keith; Staszewski, Wieslaw J.; Hensman, James J.

    2011-01-01

    A great many computational algorithms developed over the past half-century have been motivated or suggested by biological systems or processes, the most well-known being the artificial neural networks. These algorithms are commonly grouped together under the terms soft or natural computing. A property shared by most natural computing algorithms is that they allow exploration of, or learning from, data. This property has proved extremely valuable in the solution of many diverse problems in science and engineering. The current paper is intended as a tutorial overview of the basic theory of some of the most common methods of natural computing as they are applied in the context of mechanical systems research. The application of some of the main algorithms is illustrated using case studies. The paper also attempts to give some indication as to which of the algorithms emerging now from the machine learning community are likely to be important for mechanical systems research in the future.
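
    As a reminder of the learning-from-data property emphasized above, the sketch below trains the simplest artificial neural network, a single perceptron, on an invented two-class dataset. It is a generic textbook illustration, assumed data and all, not one of the paper's mechanical-systems case studies.

        # Single perceptron trained with the classic perceptron rule on a toy, linearly separable dataset.
        data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.8, 0.9), 1), ((0.9, 0.7), 1)]   # invented points

        w, bias, lr = [0.0, 0.0], 0.0, 0.1
        for epoch in range(20):
            for (x1, x2), target in data:
                out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
                err = target - out
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                bias += lr * err

        print("weights:", w, "bias:", bias)
        print([1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0 for (x1, x2), _ in data])   # matches labels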

  2. Optical computing at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reid, Max B.; Bualat, Maria G.; Downie, John D.; Galant, David; Gary, Charles K.; Hine, Butler P.; Ma, Paul W.; Pryor, Anna H.; Spirkovska, Lilly

    1991-01-01

    Optical computing research at NASA Ames Research Center seeks to utilize the capability of analog optical processing, involving free-space propagation between components, to produce natural implementations of algorithms requiring large degrees of parallel computation. Potential applications being investigated include robotic vision, planetary lander guidance, aircraft engine exhaust analysis, analysis of remote sensing satellite multispectral images, control of space structures, and autonomous aircraft inspection.

  3. pClone: Synthetic Biology Tool Makes Promoter Research Accessible to Beginning Biology Students

    ERIC Educational Resources Information Center

    Campbell, A. Malcolm; Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason

    2014-01-01

    The "Vision and Change" report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area…

  4. Computational Neuroscience: Modeling the Systems Biology of Synaptic Plasticity

    PubMed Central

    Kotaleski, Jeanette Hellgren; Blackwell, Kim T.

    2016-01-01

    Synaptic plasticity is a mechanism proposed to underlie learning and memory. The complexity of the interactions between ion channels, enzymes, and genes involved in synaptic plasticity impedes a deep understanding of this phenomenon. Computer modeling is an approach to investigate the information processing that is performed by signaling pathways underlying synaptic plasticity. In the past few years, new software developments that blend computational neuroscience techniques with systems biology techniques have allowed large-scale, quantitative modeling of synaptic plasticity in neurons. We highlight significant advancements produced by these modeling efforts and introduce promising approaches that utilize advancements in live cell imaging. PMID:20300102

  5. Systems Biology in Immunology – A Computational Modeling Perspective

    PubMed Central

    Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.

    2011-01-01

    Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182

  6. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  7. Biology of an Enzyme: A Research-Like Experience for Introductory Biology Students.

    ERIC Educational Resources Information Center

    Towle, David W.

    1992-01-01

    Presents a series of laboratory exercises designed to introduce students to a realistic experience in biological research that is feasible with large numbers of beginning biology majors. The exercises center on the study of alkaline phosphatase. (DDR)

  8. Exploiting graphics processing units for computational biology and bioinformatics.

    PubMed

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700. PMID:20658333
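
    As an illustration of the all-pairs distance computation used as the running example above, the following sketch (an illustration added for this summary, not the authors' CUDA code) computes the full Euclidean distance matrix for a small dataset in Python/NumPy; the same per-pair pattern is what a GPU kernel would assign to individual threads.

        import numpy as np

        def all_pairs_distances(data):
            # data: (n_instances, n_features) array; returns the (n, n) Euclidean distance matrix.
            # Expanding ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y avoids an explicit double loop,
            # mirroring how a GPU kernel would assign one (i, j) pair to each thread.
            sq = np.sum(data ** 2, axis=1)
            d2 = sq[:, None] + sq[None, :] - 2.0 * data @ data.T
            return np.sqrt(np.maximum(d2, 0.0))

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            X = rng.normal(size=(1000, 32))
            D = all_pairs_distances(X)
            print(D.shape, D[0, :3])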

  9. Computational problems in magnetic fusion research

    SciTech Connect

    Killeen, J.

    1981-08-31

    Numerical calculations have had an important role in fusion research since its beginning, but the application of computers to plasma physics has advanced rapidly in the last few years. One reason for this is the increasing sophistication of the mathematical models of plasma behavior, and another is the increased speed and memory of the computers which made it reasonable to consider numerical simulation of fusion devices. The behavior of a plasma is simulated by a variety of numerical models. Some models used for short times give detailed knowledge of the plasma on a microscopic scale, while other models used for much longer times compute macroscopic properties of the plasma dynamics. The computer models used in fusion research are surveyed. One of the most active areas of research is in time-dependent, three-dimensional, resistive magnetohydrodynamic models. These codes are reviewed briefly.

  10. Plant biology research and training for the 21st century

    SciTech Connect

    Kelly, K.

    1992-01-01

    The committee was assembled in response to a request from the National Science Foundation (NSF), the US Department of Agriculture (USDA), and the US Department of Energy (DoE). The leadership of these agencies asked the National Academy of Sciences through the National Research Council (NRC) to assess the status of plant-science research in the United States in light of the opportunities arising from advances in other areas of biology. NRC was asked to suggest ways of accelerating the application of these new biologic concepts and tools to research in plant science with the aim of enhancing the acquisition of new knowledge about plants. The charge to the committee was to examine the following: Organizations, departments, and institutions conducting plant biology research; human resources involved in plant biology research; graduate training programs in plant biology; federal, state, and private sources of support for plant-biology research; the role of industry in conducting and supporting plant-biology research; the international status of US plant-biology research; and the relationship of plant biology to leading-edge research in biology.

  11. Plant biology research and training for the 21st century

    SciTech Connect

    Kelly, K.

    1992-12-31

    The committee was assembled in response to a request from the National Science Foundation (NSF), the US Department of Agriculture (USDA), and the US Department of Energy (DoE). The leadership of these agencies asked the National Academy of Sciences through the National Research Council (NRC) to assess the status of plant-science research in the United States in light of the opportunities arising from advances in other areas of biology. NRC was asked to suggest ways of accelerating the application of these new biologic concepts and tools to research in plant science with the aim of enhancing the acquisition of new knowledge about plants. The charge to the committee was to examine the following: Organizations, departments, and institutions conducting plant biology research; human resources involved in plant biology research; graduate training programs in plant biology; federal, state, and private sources of support for plant-biology research; the role of industry in conducting and supporting plant-biology research; the international status of US plant-biology research; and the relationship of plant biology to leading-edge research in biology.

  12. Advanced Computer Simulations Of Nanomaterials And Stochastic Biological Processes

    NASA Astrophysics Data System (ADS)

    Minakova, Maria S.

    This dissertation consists of several parts. The first two chapters are devoted to the study of dynamic processes in cellular organelles called filopodia. A stochastic kinetics approach is used to describe non-equilibrium evolution of the filopodial system from nano- to micro scales. Dynamic coupling between chemistry and mechanics is also taken into account in order to investigate the influence of focal adhesions on cell motility. The second chapter explores the possibilities and effects of motor enhanced delivery of actin monomers to the polymerizing tips of filopodia, and how the steady-state filopodial length can exceed the limit set by pure diffusion. Finally, we also challenge the currently existing view of active transport and propose a new theoretical model that accurately describes the motor dynamics and concentration profiles seen in experiments in a physically meaningful way. The third chapter is a result of collaboration between three laboratories, as a part of Energy Frontier Research Center at the University of North Carolina at Chapel Hill. The work presented here unified the fields of synthetic chemistry, photochemistry, and computational physical chemistry in order to investigate a novel bio-synthetic compound and its energy transfer capabilities. This particular peptide-based design has never been studied via Molecular Dynamics with high precision, and it is the first attempt known to us to simulate the whole chromophore-peptide complex in solution in order to gain detailed information about its structural and dynamic features. The fourth chapter deals with the non-equilibrium relaxation induced transport of water molecules in a microemulsion. This problem required a different set of methodologies and a more detailed, all-atomistic treatment of the system. We found interesting water clustering effects and elucidated the most probable mechanism of water transfer through oil under the condition of saturated Langmuir monolayers. Together these

  13. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues (understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales) cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  14. 78 FR 6087 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat.... Department of Energy, Office of Science, Office of Biological and Environmental Research,...

  15. 78 FR 34088 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat..., Office of Science, Office of Biological and Environmental Research, SC-23/Germantown Building,...

  16. 76 FR 57028 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... Biological and Environmental Research Advisory Committee AGENCY: Department of Energy; Office of Science... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat....S. Department of Energy, Office of Science, Office of Biological and Environmental Research,...

  17. 77 FR 28368 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat..., Office of Biological and Environmental Research, SC-23/Germantown Building, 1000 Independence Avenue...

  18. 75 FR 6651 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... Biological and Environmental Research Advisory Committee AGENCY: Department of Energy; Office of Science... Environmental Research Advisory Committee (BERAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770... of Biological and Environmental Research, SC-23/Germantown Building, 1000 Independence Avenue,...

  19. 78 FR 63170 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-23

    ... Biological and Environmental Research Advisory Committee AGENCY: Office of Science, Department of Energy... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... Energy, Office of Science, Office of Biological and Environmental Research, SC-23/Germantown...

  20. 75 FR 53685 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... Biological and Environmental Research Advisory Committee AGENCY: Department of Energy, Office of Science... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... of Biological and Environmental Research, SC-23/Germantown Building, 1000 Independence Avenue,...

  1. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    SciTech Connect

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore avenues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  2. Accelerating cancer systems biology research through Semantic Web technology.

    PubMed

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. PMID:23188758

  3. Systems Biology - A Pivotal Research Methodology for Understanding the Mechanisms of Traditional Medicine

    PubMed Central

    Lee, Soojin

    2015-01-01

    Objectives: Systems biology is a novel subject in the field of life science that aims at a systems' level understanding of biological systems. Because of the significant progress in high-throughput technologies and molecular biology, systems biology occupies an important place in research during the post-genome era. Methods: The characteristics of systems biology and its applicability to traditional medicine research have been discussed from three points of view: data and databases, network analysis and inference, and modeling and systems prediction. Results: The existing databases are mostly associated with medicinal herbs and their activities, but new databases reflecting clinical situations and platforms to extract, visualize and analyze data easily need to be constructed. Network pharmacology is a key element of systems biology, so addressing the multi-component, multi-target aspect of pharmacology is important. Studies of network pharmacology highlight the drug target network and network target. Mathematical modeling and simulation are just in their infancy, but mathematical modeling of dynamic biological processes is a central aspect of systems biology. Computational simulations allow structured systems and their functional properties to be understood and the effects of herbal medicines in clinical situations to be predicted. Conclusion: Systems biology based on a holistic approach is a pivotal research methodology for understanding the mechanisms of traditional medicine. If systems biology is to be incorporated into traditional medicine, computational technologies and holistic insights need to be integrated. PMID:26388998

  4. Research in mathematical theory of computation. [computer programming applications

    NASA Technical Reports Server (NTRS)

    Mccarthy, J.

    1973-01-01

    Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first-order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special-purpose and domain-independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of ideas in the first-order checker.

  5. Next generation distributed computing for cancer research.

    PubMed

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539
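
    For readers unfamiliar with the MapReduce style this review describes, a minimal Hadoop Streaming job can be written as a pair of small Python scripts that read standard input and write standard output. The per-chromosome read-count task below is a hypothetical example chosen for this summary, not the NGS alignment benchmark reported in the paper, and the command line is only schematic.

        #!/usr/bin/env python
        # Toy Hadoop Streaming job (schematic): the mapper and reducer each read stdin and
        # write tab-separated key/value pairs to stdout, suitable for hadoop-streaming's
        # -mapper and -reducer options. Input is assumed to be SAM-like alignment lines.
        import sys

        def mapper():
            for line in sys.stdin:
                if line.startswith("@"):                 # skip SAM header lines
                    continue
                fields = line.rstrip("\n").split("\t")
                if len(fields) > 2:
                    print(f"{fields[2]}\t1")             # key = reference name, value = 1

        def reducer():
            # Hadoop delivers mapper output grouped and sorted by key, so a running total works.
            current, total = None, 0
            for line in sys.stdin:
                key, value = line.rstrip("\n").split("\t")
                if key != current:
                    if current is not None:
                        print(f"{current}\t{total}")
                    current, total = key, 0
                total += int(value)
            if current is not None:
                print(f"{current}\t{total}")

        if __name__ == "__main__":
            mapper() if sys.argv[1] == "map" else reducer()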

  6. Next Generation Distributed Computing for Cancer Research

    PubMed Central

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539

  7. Computational modeling of in vitro biological responses on polymethacrylate surfaces

    PubMed Central

    Ghosh, Jayeeta; Lewitus, Dan Y; Chandra, Prafulla; Joy, Abraham; Bushman, Jared; Knight, Doyle; Kohn, Joachim

    2011-01-01

    The objective of this research was to examine the capabilities of QSPR (Quantitative Structure Property Relationship) modeling to predict specific biological responses (fibrinogen adsorption, cell attachment and cell proliferation index) on thin films of different polymethacrylates. Using 33 commercially available monomers it is theoretically possible to construct a library of over 40,000 distinct polymer compositions. A subset of these polymers was synthesized, and solvent-cast surfaces were prepared in 96 well plates for the measurement of fibrinogen adsorption. NIH 3T3 cell attachment and proliferation index were measured on spin-coated thin films of these polymers. Based on the experimental results for these polymers, separate models were built for homo-, co-, and terpolymers in the library, with good correlation between experimental and predicted values. The ability to predict biological responses by simple QSPR models for large numbers of polymers has important implications in designing biomaterials for specific biological or medical applications. PMID:21779132
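
    In its simplest form, a QSPR model of the kind described reduces to regressing a measured response on computed molecular descriptors. The sketch below, with made-up descriptor and response arrays standing in for the polymer library data, shows that workflow with scikit-learn; it illustrates the general approach rather than the authors' actual models or descriptors.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import cross_val_score

        # Hypothetical data: rows are polymers, columns are computed descriptors
        # (e.g. hydrophobicity, molar volume); y stands in for measured fibrinogen adsorption.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 5))                       # 60 polymers, 5 descriptors
        y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.3, size=60)

        model = Ridge(alpha=1.0)
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
        print("cross-validated R^2:", r2.mean())

        model.fit(X, y)
        print("descriptor coefficients:", model.coef_)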

  8. Publication Bias in Methodological Computational Research

    PubMed Central

    Boulesteix, Anne-Laure; Stierle, Veronika; Hapfelmeier, Alexander

    2015-01-01

    The problem of publication bias has long been discussed in research fields such as medicine. There is a consensus that publication bias is a reality and that solutions should be found to reduce it. In methodological computational research, including cancer informatics, publication bias may also be at work. The publication of negative research findings is certainly also a relevant issue, but has attracted very little attention to date. The present paper aims at providing a new formal framework to describe the notion of publication bias in the context of methodological computational research, facilitate and stimulate discussions on this topic, and increase awareness in the scientific community. We report an exemplary pilot study that aims at gaining experience with the collection and analysis of information on unpublished research efforts with respect to publication bias, and we outline the problems encountered. Based on these experiences, we try to formalize the notion of publication bias. PMID:26508827

  9. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  10. MORT: a powerful foundational library for computational biology and CADD

    PubMed Central

    2014-01-01

    Background: A foundational library called MORT (Molecular Objects and Relevant Templates) for the development of new software packages and tools employed in computational biology and computer-aided drug design (CADD) is described here. Results: MORT offers several advantages compared with other libraries. Firstly, MORT, written in C++, natively supports the paradigm of object-oriented design, and thus it can be understood and extended easily. Secondly, MORT employs the relational model to represent a molecule, which is more convenient and flexible than the traditional hierarchical model employed by many other libraries. Thirdly, many functions have been included in this library, and a molecule can be manipulated easily at different levels. For example, it can parse a variety of popular molecular formats (MOL/SDF, MOL2, PDB/ENT, SMILES/SMARTS, etc.), create the topology and coordinate files for the simulations supported by AMBER, calculate the energy of a specific molecule based on the AMBER force fields, etc. Conclusions: We believe that MORT can be used as a foundational library for programmers to develop new programs and applications for computational biology and CADD. Source code of MORT is available at http://cadd.suda.edu.cn/MORT/index.htm.

  11. Computational Fluid Dynamics Framework for Turbine Biological Performance Assessment

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Carlson, Thomas J.; Ebner, Laurie L.; Sick, Mirjam; Cada, G. F.

    2011-05-04

    In this paper, a method for turbine biological performance assessment is introduced to bridge the gap between field and laboratory studies on fish injury and turbine design. Using this method, a suite of biological performance indicators is computed based on simulated data from a computational fluid dynamics (CFD) model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. If the relationship between the dose of an injury mechanism and frequency of injury (dose-response) is known from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from various turbine designs, the engineer can identify the more-promising designs. Discussion here is focused on Kaplan-type turbines, although the method could be extended to other designs. Following the description of the general methodology, we will present sample risk assessment calculations based on CFD data from a model of the John Day Dam on the Columbia River in the USA.
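
    The core calculation described here, combining an exposure distribution obtained from CFD with a laboratory dose-response curve, can be written down compactly. The sketch below uses invented exposure probabilities and an invented logistic dose-response for a single injury mechanism; it illustrates the calculation pattern only and does not reproduce the paper's data or turbine geometry.

        import numpy as np

        # Hypothetical dose bins for one injury mechanism (e.g. shear stress, arbitrary units)
        dose_bins = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

        # P(exposure to each dose bin), e.g. estimated from particle tracks in a CFD model
        p_exposure = np.array([0.70, 0.15, 0.08, 0.05, 0.02])

        def dose_response(dose, d50=2.5, slope=2.0):
            # Hypothetical logistic dose-response: probability of injury given a dose
            return 1.0 / (1.0 + np.exp(-slope * (dose - d50)))

        # Overall injury likelihood = sum over doses of P(exposure) * P(injury | dose)
        p_injury = np.sum(p_exposure * dose_response(dose_bins))
        print(f"estimated injury probability: {p_injury:.3f}")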

  12. Study on global cloud computing research trend

    NASA Astrophysics Data System (ADS)

    Ma, Feicheng; Zhan, Nan

    2014-01-01

    Since "cloud computing" was put forward by Google , it quickly became the most popular concept in IT industry and widely permeated into various areas promoted by IBM, Microsoft and other IT industry giants. In this paper the methods of bibliometric analysis were used to investigate the global cloud computing research trend based on Web of Science (WoS) database and the Engineering Index (EI) Compendex database. In this study, the publication, countries, institutes, keywords of the papers was deeply studied in methods of quantitative analysis, figures and tables are used to describe the production and the development trends of cloud computing.

  13. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background: In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results: We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion: By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  14. Synthetic biology: An emerging research field in China

    PubMed Central

    Pei, Lei; Schmidt, Markus; Wei, Wei

    2011-01-01

    Synthetic biology is considered an emerging research field that will bring new opportunities to biotechnology. There is an expectation that synthetic biology will not only enhance knowledge in basic science, but will also have great potential for practical applications. Synthetic biology is still in an early developmental stage in China. We provide here a review of current Chinese research activities in synthetic biology and its different subfields, such as research on genetic circuits, minimal genomes, chemical synthetic biology, protocells and DNA synthesis, using literature reviews and personal communications with Chinese researchers. To meet the increasing demand for sustainable development, research on genetic circuits to harness biomass is the line of work most pursued among Chinese researchers. Environmental concerns are the driving force of research on genetic circuits for bioremediation. Research on minimal genomes focuses on identifying the smallest set of genes needed for engineering minimal cell factories, and research on chemical synthetic biology focuses on artificial proteins and an expanded genetic code. Research on protocells is largely carried out in combination with research on molecular-scale motors. Research on DNA synthesis and its commercialisation is also reviewed. Potential future Chinese R&D activities are discussed on the basis of research capacity and government policy. PMID:21729747

  15. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  16. GUI to Facilitate Research on Biological Damage from Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, Frances A.; Ponomarev, Artem Lvovich

    2010-01-01

    A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic-process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.

  17. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    SciTech Connect

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
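
    The idea of an algorithmically expressed verification test is that any implementation, in any language or on any hardware, can be checked against a trusted reference answer within a tolerance. The sketch below shows that shape for a dense matrix-multiply kernel; it is a generic illustration for this summary, not one of the TORCH kernels or tests themselves.

        import numpy as np

        def candidate_matmul(a, b):
            # Stand-in for a co-designed implementation under test (here a naive triple loop)
            n, k = a.shape
            k2, m = b.shape
            assert k == k2
            out = np.zeros((n, m))
            for i in range(n):
                for j in range(m):
                    out[i, j] = np.dot(a[i, :], b[:, j])
            return out

        def verify(candidate, n=64, rtol=1e-10, seed=0):
            # Verification test: compare against a trusted reference on random inputs
            rng = np.random.default_rng(seed)
            a, b = rng.normal(size=(n, n)), rng.normal(size=(n, n))
            reference = a @ b
            err = np.max(np.abs(candidate(a, b) - reference)) / np.max(np.abs(reference))
            return err <= rtol

        print("verification passed:", verify(candidate_matmul))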

  18. Structures of Biological Minerals in Dental Research

    PubMed Central

    Mathew, Mathai; Takagi, Shozo

    2001-01-01

    Structural features of some calcium phosphates of biological interest are described. The structure of hydroxyapatite (OHAp), considered the prototype for the inorganic component of bones and teeth, is discussed with respect to the kinds and locations of ionic substitutions. Octacalcium phosphate (OCP) is a probable precursor in biological mineralization. OCP has a layer-type structure, with one layer quite similar to that of OHAp and the other a hydrated layer consisting of more widely spaced Ca and PO4 ions and water molecules. The closeness of fit in the apatitic layers of OCP and OHAp accounts for the epitaxial, interlayered mixtures formed by these compounds and the in situ conversion of OCP to OHAp. Possible roles of OCP in biological mineralization are discussed. PMID:27500063

  19. Parameter Estimation and Model Selection in Computational Biology

    PubMed Central

    Lillacci, Gabriele; Khammash, Mustafa

    2010-01-01

    A central challenge in computational modeling of biological systems is the determination of the model parameters. Typically, only a fraction of the parameters (such as kinetic rate constants) are experimentally measured, while the rest are often fitted. The fitting process is usually based on experimental time course measurements of observables, which are used to assign parameter values that minimize some measure of the error between these measurements and the corresponding model prediction. The measurements, which can come from immunoblotting assays, fluorescent markers, etc., tend to be very noisy and taken at a limited number of time points. In this work we present a new approach to the problem of parameter selection of biological models. We show how one can use a dynamic recursive estimator, known as extended Kalman filter, to arrive at estimates of the model parameters. The proposed method proceeds as follows. First, we use a variation of the Kalman filter that is particularly well suited to biological applications to obtain a first guess for the unknown parameters. Secondly, we employ an a posteriori identifiability test to check the reliability of the estimates. Finally, we solve an optimization problem to refine the first guess in case it should not be accurate enough. The final estimates are guaranteed to be statistically consistent with the measurements. Furthermore, we show how the same tools can be used to discriminate among alternate models of the same biological process. We demonstrate these ideas by applying our methods to two examples, namely a model of the heat shock response in E. coli, and a model of a synthetic gene regulation system. The methods presented are quite general and may be applied to a wide class of biological systems where noisy measurements are used for parameter estimation or model selection. PMID:20221262
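
    To make the joint state-and-parameter estimation idea concrete, the sketch below runs an extended Kalman filter on a toy first-order decay model dx/dt = -k x, augmenting the state with the unknown rate k. It is a minimal illustration under assumed noise levels and a deliberately wrong initial guess for k; it is not the authors' filter variant, heat-shock model, or gene-regulation model.

        import numpy as np

        dt, steps = 0.1, 100
        k_true = 0.5
        rng = np.random.default_rng(2)

        # Simulate noisy measurements of x(t) for dx/dt = -k*x, x(0) = 5
        x_true = 5.0 * np.exp(-k_true * dt * np.arange(steps))
        y = x_true + rng.normal(scale=0.1, size=steps)

        # Augmented state z = [x, k]; the initial guess for k is deliberately wrong
        z = np.array([y[0], 0.1])
        P = np.diag([0.1, 1.0])                 # state covariance
        Q = np.diag([1e-4, 1e-5])               # process noise (lets k drift toward its true value)
        R = 0.1 ** 2                            # measurement noise variance
        H = np.array([[1.0, 0.0]])              # we observe x only

        for t in range(1, steps):
            # Predict: Euler step of the model and its Jacobian at the previous estimate
            x, k = z
            z = np.array([x - k * x * dt, k])
            F = np.array([[1.0 - k * dt, -x * dt],
                          [0.0, 1.0]])
            P = F @ P @ F.T + Q
            # Update with the measurement y[t]
            S = H @ P @ H.T + R
            K = P @ H.T / S
            z = z + (K * (y[t] - z[0])).ravel()
            P = (np.eye(2) - K @ H) @ P

        print(f"estimated k = {z[1]:.3f} (true value {k_true})")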

  20. Accelerating Cancer Systems Biology Research through Semantic Web Technology

    PubMed Central

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S.

    2012-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG®, so users can not only interact with the DMR through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. PMID:23188758

  1. Marketing and commercialization of computational research services.

    SciTech Connect

    Toevs, J. W.

    2001-01-01

    Physical and computational scientists and mathematicians in Russia's nuclear cities are turning their work toward generating profits from Western markets. Successful ventures require an understanding of the marketing of contract research as well as Western expectations regarding contract execution, quality, and performance. This paper will address fundamentals in business structure, marketing, and contract performance for organizations engaging in the marketing and commercialization of research services. Considerable emphasis will be placed on developing adequate communication within the organization.

  2. Computations and algorithms in physical and biological problems

    NASA Astrophysics Data System (ADS)

    Qin, Yu

    This dissertation presents the applications of state-of-the-art computation techniques and data analysis algorithms in three physical and biological problems: assembling DNA pieces, optimizing self-assembly yield, and identifying correlations from large multivariate datasets. In the first topic, in-depth analysis of using Sequencing by Hybridization (SBH) to reconstruct target DNA sequences shows that a modified reconstruction algorithm can overcome the theoretical boundary without the need for different types of biochemical assays and is robust to error. In the second topic, consistent with theoretical predictions, simulations using Graphics Processing Unit (GPU) demonstrate how controlling the short-ranged interactions between particles and controlling the concentrations optimize the self-assembly yield of a desired structure, and nonequilibrium behavior when optimizing concentrations is also unveiled by leveraging the computation capacity of GPUs. In the last topic, a methodology to incorporate existing categorization information into the search process to efficiently reconstruct the optimal true correlation matrix for multivariate datasets is introduced. Simulations on both synthetic and real financial datasets show that the algorithm is able to detect signals below the Random Matrix Theory (RMT) threshold. These three problems are representative of the use of massive computation techniques and data analysis algorithms to tackle optimization problems and to outperform theoretical boundaries when prior information is incorporated into the computation.
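
    The Random Matrix Theory threshold mentioned in the last topic is usually taken as the Marchenko-Pastur upper edge, lambda_max = (1 + sqrt(N/T))^2, for the eigenvalues of a correlation matrix built from T observations of N uncorrelated variables; eigenvalues above it are treated as signal. The sketch below applies this standard filter to synthetic data and is meant only to illustrate the threshold, not the dissertation's search algorithm.

        import numpy as np

        N, T = 50, 500                       # variables, observations
        rng = np.random.default_rng(3)

        # Synthetic data: mostly noise plus one common factor that creates real correlation
        common = rng.normal(size=T)
        data = rng.normal(size=(T, N)) + 0.4 * common[:, None]

        corr = np.corrcoef(data, rowvar=False)
        eigvals = np.linalg.eigvalsh(corr)

        # Marchenko-Pastur upper edge for a pure-noise correlation matrix
        lam_max = (1.0 + np.sqrt(N / T)) ** 2
        signal = eigvals[eigvals > lam_max]

        print(f"RMT threshold: {lam_max:.2f}")
        print(f"eigenvalues above threshold (signal): {signal}")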

  3. Cloud Computing Technologies Facilitate Earth Research

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  4. Computer Modeling and Research in the Classroom

    ERIC Educational Resources Information Center

    Ramos, Maria Joao; Fernandes, Pedro Alexandrino

    2005-01-01

    We report on a computational chemistry course for undergraduate students that successfully incorporated a research project on the design of new contrast agents for magnetic resonance imaging and shift reagents for in vivo NMR. Course outcomes were positive: students were quite motivated during the whole year--they learned what was required of…

  5. Handheld Computers in Education. Research Brief

    ERIC Educational Resources Information Center

    Education Partnerships, Inc., 2003

    2003-01-01

    For the last 20 years, educators have been trying to find best practices in using technology for student learning. Some of the most widely used applications with computers have been student learning of programming, word processing, Web research, spreadsheets, games, and Web design. The difficulty with integrating many of these activities…

  6. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  7. Biology Teacher and Expert Opinions about Computer Assisted Biology Instruction Materials: A Software Entitled Nucleic Acids and Protein Synthesis

    ERIC Educational Resources Information Center

    Hasenekoglu, Ismet; Timucin, Melih

    2007-01-01

    The aim of this study is to collect and evaluate the opinions of CAI experts and biology teachers about a high-school-level Computer Assisted Biology Instruction Material presenting computer-made modelling and simulations. It is a case study. A material covering the "Nucleic Acids and Protein Synthesis" topic was developed as the "case". The goal of the…

  8. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer. PMID:21879517

  9. Research computing in a distributed cloud environment

    NASA Astrophysics Data System (ADS)

    Fransham, K.; Agarwal, A.; Armstrong, P.; Bishop, A.; Charbonneau, A.; Desmarais, R.; Hill, N.; Gable, I.; Gaudet, S.; Goliath, S.; Impey, R.; Leavett-Brown, C.; Ouellete, J.; Paterson, M.; Pritchet, C.; Penfold-Brown, D.; Podaima, W.; Schade, D.; Sobie, R. J.

    2010-11-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
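
    The core loop of such a virtual-machine resource manager can be caricatured in a few lines: watch the batch queue and boot user-specified VM images on whichever cloud has spare capacity. The sketch below is a deliberately simplified toy with invented cloud capacities and a stubbed boot_vm call; it is an illustration of the scheduling idea, not the actual Cloud Scheduler implementation.

        from collections import deque

        # Toy model of clouds with fixed VM slots (hypothetical capacities)
        clouds = {"site-A": 4, "site-B": 2}
        running = {name: 0 for name in clouds}

        # Jobs waiting in the batch system, each requesting a user-customized VM image
        job_queue = deque([{"id": i, "image": "analysis-vm"} for i in range(5)])

        def boot_vm(cloud, image):
            # Stub standing in for an IaaS API call that starts an instance from 'image'
            print(f"booting {image} on {cloud}")

        def schedule_once():
            # One pass of the scheduling loop: place queued jobs on clouds with free slots
            while job_queue:
                free = [c for c in clouds if running[c] < clouds[c]]
                if not free:
                    break                        # all clouds full; wait for VMs to finish
                cloud = max(free, key=lambda c: clouds[c] - running[c])
                job = job_queue.popleft()
                boot_vm(cloud, job["image"])
                running[cloud] += 1

        schedule_once()
        print("still queued:", len(job_queue))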

  10. Biological Structures, Interactions, Function and Behavior: Research Opportunities for Physicists

    NASA Astrophysics Data System (ADS)

    Concepcion, Gisela P.

    2008-06-01

    Studies on marine biomolecules at the Marine Natural Products Laboratory (MNPL) and studies on biomedically relevant proteins at the Virtual Laboratory of Biomolecular Structures (VIRLS) of the University of the Philippines Marine Science Institute (UPMSI) are presented. These serve to illustrate some underlying principles of biological structures, interactions, function and behavior, and also to draw out some unresolved questions in biology of possible interest to non-biologists. The Biological Structures course offered at UPMSI, which aims to introduce underlying biological principles to non-biology majors and to promote trans-disciplinary research efforts, is also presented.

  11. Computational Approaches for Predicting Biomedical Research Collaborations

    PubMed Central

    Zhang, Qing; Yu, Hong

    2014-01-01

    Biomedical research is increasingly collaborative, and successful collaborations often produce high impact work. Computational approaches can be developed for automatically predicting biomedical research collaborations. Previous works of collaboration prediction mainly explored the topological structures of research collaboration networks, leaving out rich semantic information from the publications themselves. In this paper, we propose supervised machine learning approaches to predict research collaborations in the biomedical field. We explored both the semantic features extracted from author research interest profile and the author network topological features. We found that the most informative semantic features for author collaborations are related to research interest, including similarity of out-citing citations, similarity of abstracts. Of the four supervised machine learning models (naïve Bayes, naïve Bayes multinomial, SVMs, and logistic regression), the best performing model is logistic regression with an ROC ranging from 0.766 to 0.980 on different datasets. To our knowledge we are the first to study in depth how research interest and productivities can be used for collaboration prediction. Our approach is computationally efficient, scalable and yet simple to implement. The datasets of this study are available at https://github.com/qingzhanggithub/medline-collaboration-datasets. PMID:25375164
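
    To show the flavor of the supervised approach described above, the sketch below trains a logistic regression on two invented features for author pairs (an abstract-similarity score and a shared co-author count) and reports an ROC AUC. The feature values and labels are synthetic placeholders for this summary, not the MEDLINE-derived datasets released with the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(4)
        n_pairs = 400

        # Hypothetical features per author pair: [abstract similarity, shared co-authors]
        X = np.column_stack([rng.uniform(0, 1, n_pairs), rng.poisson(1.0, n_pairs)])

        # Synthetic labels: similar interests and shared co-authors make collaboration likelier
        logit = 3.0 * X[:, 0] + 0.8 * X[:, 1] - 2.5
        y = rng.uniform(size=n_pairs) < 1.0 / (1.0 + np.exp(-logit))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = LogisticRegression().fit(X_tr, y_tr)
        print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))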

  12. Structural biology research at the National Synchroton Light Source

    SciTech Connect

    1996-05-01

    The world's foremost facility for scientific research using x-rays and ultraviolet and infrared radiation is operated by the National Synchrotron Light Source Department. This year alone, a total of 2200 guest researchers performed experiments at the world's largest source of synchrotron light. Researchers are trying to define the three-dimensional structures of biological macromolecules to create a map of life, a guide for exploring the biological and chemical interactions of the vast variety of molecules found in living organisms. Studies in structural biology may lead to new insights into how biological systems are formed and nourished, how they survive and grow, how they are damaged and die. This document discusses some of the structural biological research done at the National Synchrotron Light Source.

  13. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a serious threat and a major research focus in network information security. New viruses keep emerging, the total number of viruses keeps growing, and virus classification is becoming increasingly complex. Because different agencies capture samples at different times, virus naming cannot be unified. Each agency maintains its own virus database, but the databases rarely communicate with one another, and their records are often incomplete or based on only a small number of samples. This paper reviews the current state of virus database construction domestically and abroad, analyzes how to standardize and complete the description of virus characteristics, and then presents a computer virus database design scheme that addresses information integrity, storage security, and manageability.
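
    One possible way to reconcile differing agency names, sketched here with an invented toy schema (this is not the design scheme proposed in the paper): a canonical virus record plus an alias table keyed by agency.

      import sqlite3

      # Toy schema (invented for illustration): one canonical virus record, plus an
      # alias table so that differing agency names map to the same entry.
      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE virus (
          id          INTEGER PRIMARY KEY,
          canonical   TEXT UNIQUE NOT NULL,
          family      TEXT,
          first_seen  TEXT,
          sample_hash TEXT            -- e.g. SHA-256 of the captured sample
      );
      CREATE TABLE alias (
          virus_id INTEGER REFERENCES virus(id),
          agency   TEXT NOT NULL,
          name     TEXT NOT NULL,
          UNIQUE (agency, name)
      );
      """)
      vid = con.execute(
          "INSERT INTO virus (canonical, family, first_seen, sample_hash) VALUES (?,?,?,?)",
          ("Worm.Example.A", "worm", "2011-10-03", "deadbeef" * 8)).lastrowid
      con.executemany("INSERT INTO alias VALUES (?,?,?)",
                      [(vid, "AgencyX", "W32/Example.worm"), (vid, "AgencyY", "Worm:Example-A")])

      # Resolve any agency-specific name back to the canonical record.
      row = con.execute("""SELECT v.canonical, v.family FROM virus v
                           JOIN alias a ON a.virus_id = v.id WHERE a.name = ?""",
                        ("W32/Example.worm",)).fetchone()
      print(row)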

  14. Conduction pathways in microtubules, biological quantum computation, and consciousness.

    PubMed

    Hameroff, Stuart; Nip, Alex; Porter, Mitchell; Tuszynski, Jack

    2002-01-01

    Technological computation is entering the quantum realm, focusing attention on biomolecular information processing systems such as proteins, as presaged by the work of Michael Conrad. Protein conformational dynamics and pharmacological evidence suggest that protein conformational states-fundamental information units ('bits') in biological systems-are governed by quantum events, and are thus perhaps akin to quantum bits ('qubits') as utilized in quantum computation. 'Real time' dynamic activities within cells are regulated by the cell cytoskeleton, particularly microtubules (MTs) which are cylindrical lattice polymers of the protein tubulin. Recent evidence shows signaling, communication and conductivity in MTs, and theoretical models have predicted both classical and quantum information processing in MTs. In this paper we show conduction pathways for electron mobility and possible quantum tunneling and superconductivity among aromatic amino acids in tubulins. The pathways within tubulin match helical patterns in the microtubule lattice structure, which lend themselves to topological quantum effects resistant to decoherence. The Penrose-Hameroff 'Orch OR' model of consciousness is reviewed as an example of the possible utility of quantum computation in MTs. PMID:11755497

  15. A research program in empirical computer science

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  16. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    SciTech Connect

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Micheal J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing more than 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
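
    A quick back-of-the-envelope check, using the brain and supercomputer figures exactly as quoted above, confirms that the efficiency gap is indeed of order 10^12.

      # Back-of-the-envelope check of the ~10^12 efficiency gap quoted above,
      # using the brain and supercomputer figures exactly as stated in the abstract.
      brain_ops, brain_watts, brain_cm3 = 1e16, 20.0, 1200.0
      super_ops, super_watts, super_cm3 = 1e15, 3e6, 1500.0 * 1e6   # 1500 m^3 -> cm^3

      brain_eff = brain_ops / (brain_watts * brain_cm3)   # ops/s per W per cm^3
      super_eff = super_ops / (super_watts * super_cm3)

      print(f"brain: {brain_eff:.2e} ops/s/W/cm^3")
      print(f"super: {super_eff:.2e} ops/s/W/cm^3")
      print(f"ratio: {brain_eff / super_eff:.1e}")       # ~2e12, i.e. order 10^12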

  17. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  18. Incorporating computational resources in a cancer research program.

    PubMed

    Woods, Nicholas T; Jhuraney, Ankita; Monteiro, Alvaro N A

    2015-05-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effective. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  19. Plant seeds in biological research in space

    NASA Technical Reports Server (NTRS)

    Miller, A. T.

    1982-01-01

    Data from 15 years of space flight and laboratory tests of plant seeds of 20 species, mainly on the combined and separate effects of launch vibration, ionizing radiation and weightlessness, are surveyed. It is concluded that plants do not show a pronounced response to space flight factors; that the conditions of return to Earth, the number of heavy cosmic ray particles striking biological targets, and the effects of changes in magnetic and electromagnetic fields have been little studied; and that more study of growing plants in space is needed.

  20. How to integrate biological research into society and exclude errors in biomedical publications? Progress in theoretical and systems biology releases pressure on experimental research

    PubMed Central

    Volkov, Vadim

    2014-01-01

    This brief opinion proposes measures to increase efficiency and exclude errors in biomedical research under the existing dynamic situation. Rapid changes in biology began with the description of the three-dimensional structure of DNA 60 years ago; today biology has progressed by interacting with computer science and nanoscience, together with the introduction of robotic stations for the acquisition of large-scale arrays of data. These changes have had an increasing influence on the entire research and scientific community. Future advances demand short-term measures to ensure error-proof and efficient development. These can include the fast publishing of negative results, publishing detailed methods papers, and removing the strict link between career progression and publication activity, especially for younger researchers. Further development of theoretical and systems biology, together with the use of multiple experimental methods for biological experiments, could also be helpful over the coming years and decades. With regard to the links between science and society, it is reasonable to compare both systems, to find and describe features specific to biology, and to integrate it into the existing stream of social life and financial fluxes. This will increase the level of scientific research and have mutually positive effects for both biology and society. Several examples are given for further discussion. PMID:24748913

  1. A general framework for application of prestrain to computational models of biological materials.

    PubMed

    Maas, Steve A; Erdemir, Ahmet; Halloran, Jason P; Weiss, Jeffrey A

    2016-08-01

    It is often important to include prestress in computational models of biological tissues. The prestress can represent residual stresses (stresses that exist after the tissue is excised from the body) or in situ stresses (stresses that exist in vivo, in the absence of loading). A prestressed reference configuration may also be needed when modeling the reference geometry of biological tissues in vivo. This research developed a general framework for representing prestress in finite element models of biological materials. It is assumed that the material is elastic, allowing the prestress to be represented via a prestrain. For prestrain fields that are not compatible with the reference geometry, the computational framework provides an iterative algorithm for updating the prestrain until equilibrium is satisfied. The iterative framework allows for enforcement of two different constraints: elimination of distortion in order to address the incompatibility issue, and enforcing a specified in situ fiber strain field while allowing for distortion. The framework was implemented as a plugin in FEBio (www.febio.org), making it easy to maintain the software and to extend the framework if needed. Several examples illustrate the application and effectiveness of the approach, including the application of in situ strains to ligaments in the Open Knee model (simtk.org/home/openknee). A novel method for recovering the stress-free configuration from the prestrain deformation gradient is also presented. This general purpose theoretical and computational framework for applying prestrain will allow analysts to overcome the challenges in modeling this important aspect of biological tissue mechanics. PMID:27131609
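
    A toy scalar analogue of the iterative enforcement described above (not the FEBio plugin itself): two linear springs in series between fixed walls, where the natural length of one spring plays the role of the prestrain and is updated until its in situ strain reaches a target value.

      # Toy analogue (not the FEBio plugin) of iteratively enforcing a target in situ
      # strain: two linear springs in series between fixed walls; the natural length of
      # spring 1 (its "prestrain") is updated until its in situ strain hits the target.
      L_total, k1, k2 = 2.0, 1.0, 3.0
      n1, n2 = 1.0, 1.0            # natural (stress-free) lengths
      target_strain = 0.05         # desired in situ strain in spring 1

      for it in range(100):
          # Equilibrium of the interior node: k1*(x - n1) = k2*((L_total - x) - n2)
          x = (k1 * n1 + k2 * (L_total - n2)) / (k1 + k2)
          strain1 = (x - n1) / n1
          if abs(strain1 - target_strain) < 1e-10:
              break
          n1 = x / (1.0 + target_strain)   # prestrain update (fixed-point step)

      print(f"converged in {it} iterations: in situ strain of spring 1 = {strain1:.4f}")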

  2. The Development of Computational Biology in South Africa: Successes Achieved and Lessons Learnt

    PubMed Central

    Mulder, Nicola J.; Christoffels, Alan; de Oliveira, Tulio; Gamieldien, Junaid; Hazelhurst, Scott; Joubert, Fourie; Kumuthini, Judit; Pillay, Ché S.; Snoep, Jacky L.; Tastan Bishop, Özlem; Tiffin, Nicki

    2016-01-01

    Bioinformatics is now a critical skill in many research and commercial environments as biological data are increasing in both size and complexity. South African researchers recognized this need in the mid-1990s and responded by working with the government as well as international bodies to develop initiatives to build bioinformatics capacity in the country. Significant injections of support from these bodies provided a springboard for the establishment of computational biology units at multiple universities throughout the country, which took on teaching, basic research and support roles. Several challenges were encountered, for example with unreliability of funding, lack of skills, and lack of infrastructure. However, the bioinformatics community worked together to overcome these, and South Africa is now arguably the leading country in bioinformatics on the African continent. Here we discuss how the discipline developed in the country, highlighting the challenges, successes, and lessons learnt. PMID:26845152

  3. The Development of Computational Biology in South Africa: Successes Achieved and Lessons Learnt.

    PubMed

    Mulder, Nicola J; Christoffels, Alan; de Oliveira, Tulio; Gamieldien, Junaid; Hazelhurst, Scott; Joubert, Fourie; Kumuthini, Judit; Pillay, Ché S; Snoep, Jacky L; Tastan Bishop, Özlem; Tiffin, Nicki

    2016-02-01

    Bioinformatics is now a critical skill in many research and commercial environments as biological data are increasing in both size and complexity. South African researchers recognized this need in the mid-1990s and responded by working with the government as well as international bodies to develop initiatives to build bioinformatics capacity in the country. Significant injections of support from these bodies provided a springboard for the establishment of computational biology units at multiple universities throughout the country, which took on teaching, basic research and support roles. Several challenges were encountered, for example with unreliability of funding, lack of skills, and lack of infrastructure. However, the bioinformatics community worked together to overcome these, and South Africa is now arguably the leading country in bioinformatics on the African continent. Here we discuss how the discipline developed in the country, highlighting the challenges, successes, and lessons learnt. PMID:26845152

  4. CFD Research, Parallel Computation and Aerodynamic Optimization

    NASA Technical Reports Server (NTRS)

    Ryan, James S.

    1995-01-01

    During the last five years, CFD has matured substantially. Pure CFD research remains to be done, but much of the focus has shifted to integration of CFD into the design process. The work under these cooperative agreements reflects this trend. The recent work, and work which is planned, is designed to enhance the competitiveness of the US aerospace industry. CFD and optimization approaches are being developed and tested, so that the industry can better choose which methods to adopt in their design processes. The range of computer architectures has been dramatically broadened, as the assumption that only huge vector supercomputers could be useful has faded. Today, researchers and industry can trade off time, cost, and availability, choosing vector supercomputers, scalable parallel architectures, networked workstations, or heterogeneous combinations of these to complete required computations efficiently.

  5. CERR: a computational environment for radiotherapy research.

    PubMed

    Deasy, Joseph O; Blanco, Angel I; Clark, Vanessa H

    2003-05-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced "sir"). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
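
    CERR itself is MATLAB-based; purely as an illustration of one capability listed above, the short Python sketch below computes a cumulative dose-volume histogram from a synthetic dose grid and structure mask.

      import numpy as np

      # Minimal illustration (in Python rather than CERR's MATLAB) of one capability
      # listed above: a cumulative dose-volume histogram (DVH) for one structure.
      rng = np.random.default_rng(0)
      dose = rng.gamma(shape=4.0, scale=10.0, size=(40, 40, 20))   # synthetic dose grid (Gy)
      mask = np.zeros_like(dose, dtype=bool)
      mask[10:30, 10:30, 5:15] = True                              # synthetic "structure"

      struct_dose = dose[mask]
      bins = np.linspace(0, struct_dose.max(), 101)
      # Fraction of the structure volume receiving at least each dose level.
      dvh = [(struct_dose >= d).mean() * 100.0 for d in bins]

      for d, v in zip(bins[::20], dvh[::20]):
          print(f"V(dose >= {d:5.1f} Gy) = {v:5.1f}% of structure volume")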

  6. Biological insertion of computationally designed short transmembrane segments.

    PubMed

    Baeza-Delgado, Carlos; von Heijne, Gunnar; Marti-Renom, Marc A; Mingarro, Ismael

    2016-01-01

    The great majority of helical membrane proteins are inserted co-translationally into the ER membrane through a continuous ribosome-translocon channel. The efficiency of membrane insertion depends on transmembrane (TM) helix amino acid composition, the helix length and the position of the amino acids within the helix. In this work, we conducted a computational analysis of the composition and location of amino acids in transmembrane helices found in membrane proteins of known structure to obtain an extensive set of designed polypeptide segments with naturally occurring amino acid distributions. Then, using an in vitro translation system in the presence of biological membranes, we experimentally validated our predictions by analyzing its membrane integration capacity. Coupled with known strategies to control membrane protein topology, these findings may pave the way to de novo membrane protein design. PMID:26987712

  7. Biological insertion of computationally designed short transmembrane segments

    PubMed Central

    Baeza-Delgado, Carlos; von Heijne, Gunnar; Marti-Renom, Marc A.; Mingarro, Ismael

    2016-01-01

    The great majority of helical membrane proteins are inserted co-translationally into the ER membrane through a continuous ribosome-translocon channel. The efficiency of membrane insertion depends on transmembrane (TM) helix amino acid composition, the helix length and the position of the amino acids within the helix. In this work, we conducted a computational analysis of the composition and location of amino acids in transmembrane helices found in membrane proteins of known structure to obtain an extensive set of designed polypeptide segments with naturally occurring amino acid distributions. Then, using an in vitro translation system in the presence of biological membranes, we experimentally validated our predictions by analyzing its membrane integration capacity. Coupled with known strategies to control membrane protein topology, these findings may pave the way to de novo membrane protein design. PMID:26987712

  8. Computational procedures for optimal experimental design in biological systems.

    PubMed

    Balsa-Canto, E; Alonso, A A; Banga, J R

    2008-07-01

    Mathematical models of complex biological systems, such as metabolic or cell-signalling pathways, usually consist of sets of nonlinear ordinary differential equations which depend on several non-measurable parameters that can be hopefully estimated by fitting the model to experimental data. However, the success of this fitting is largely conditioned by the quantity and quality of data. Optimal experimental design (OED) aims to design the scheme of actuations and measurements which will result in data sets with the maximum amount and/or quality of information for the subsequent model calibration. New methods and computational procedures for OED in the context of biological systems are presented. The OED problem is formulated as a general dynamic optimisation problem where the time-dependent stimuli profiles, the location of sampling times, the duration of the experiments and the initial conditions are regarded as design variables. Its solution is approached using the control vector parameterisation method. Since the resultant nonlinear optimisation problem is in most of the cases non-convex, the use of a robust global nonlinear programming solver is proposed. For the sake of comparing among different experimental schemes, a Monte-Carlo-based identifiability analysis is then suggested. The applicability and advantages of the proposed techniques are illustrated by considering an example related to a cell-signalling pathway. PMID:18681746
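
    The core idea can be shown far more simply than the dynamic optimisation described above: for a hypothetical one-parameter decay model y(t) = exp(-k t), the sketch below picks the subset of candidate sampling times that maximizes the Fisher information about k (D-optimality reduces to a scalar in this case).

      import numpy as np
      from itertools import combinations

      # Minimal sketch of optimal experimental design for a one-parameter model
      # y(t) = exp(-k t): pick the subset of candidate sampling times that maximizes
      # the Fisher information about k (D-optimality reduces to a scalar here).
      k_nominal, sigma = 0.5, 0.05
      candidates = np.linspace(0.2, 10.0, 25)

      def sensitivity(t, k):
          # dy/dk for y = exp(-k t)
          return -t * np.exp(-k * t)

      def fisher_information(times, k):
          s = sensitivity(np.asarray(times), k)
          return np.sum(s**2) / sigma**2

      best = max(combinations(candidates, 4), key=lambda ts: fisher_information(ts, k_nominal))
      print("most informative sampling times:", np.round(best, 2))
      print("Fisher information:", round(fisher_information(best, k_nominal), 1))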

  9. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science, and only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases, and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe
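
    One of the mappings mentioned above can be made concrete in a few lines: Number Partitioning is equivalent to finding the ground state of an Ising-type energy, here searched by brute force on a tiny, made-up instance.

      from itertools import product

      # Number Partitioning as an Ising-type ground-state search (one of the mappings
      # mentioned above): assign a spin s_i = +/-1 to each number a_i and minimize the
      # "energy" E = (sum_i s_i * a_i)^2, i.e. the squared imbalance of the two subsets.
      numbers = [8, 7, 6, 5, 4]

      best_spins, best_energy = None, float("inf")
      for spins in product([+1, -1], repeat=len(numbers)):
          energy = sum(s * a for s, a in zip(spins, numbers)) ** 2
          if energy < best_energy:
              best_spins, best_energy = spins, energy

      subset_plus = [a for s, a in zip(best_spins, numbers) if s > 0]
      subset_minus = [a for s, a in zip(best_spins, numbers) if s < 0]
      print("partition:", subset_plus, "vs", subset_minus, "| squared imbalance:", best_energy)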

  10. Amphipols: Polymeric surfactants for membrane biology research.

    SciTech Connect

    Popot, J.-L.; Berry, E.A.; Charvolin, D.; Creuzenet, C.; Ebel, C.; Engelman, D.M.; Flotenmeyer, M.; Giusti, F.; Gohon, Y.; Hong, Q.; Lakey, J.H.; Leonard, K.; Shuman, H.A.; Timmins, P.; Warschawski, D.E.; Zito, F.; Zoonens, M.; Pucci, B.; Tribet, C.

    2003-06-20

    Membrane proteins classically are handled in aqueous solutions as complexes with detergents. The dissociating character of detergents, combined with the need to maintain an excess of them, frequently results in more or less rapid inactivation of the protein under study. Over the past few years, we have endeavored to develop a novel family of surfactants, dubbed amphipols (APs). APs are amphiphilic polymers that bind to the transmembrane surface of the protein in a noncovalent but, in the absence of a competing surfactant, quasi-irreversible manner. Membrane proteins complexed by APs are in their native state, stable, and they remain water soluble in the absence of detergent or free APs. An update is presented of the current knowledge about these compounds and their demonstrated or putative uses in membrane biology.

  11. Self Organizing Systems and the Research Implications for Biological Systems

    NASA Astrophysics Data System (ADS)

    Denkins-Taffe, Lauren R.; Alfred, Marcus; Lindesay, James

    2008-03-01

    The knowledge gained from the human genome project has provided an added opportunity to study the dynamical relationships within biological systems and can lead to increased knowledge of diseases and subsequent drug discovery. Through computation, methods by which to rebuild these systems are being studied. These methods, which were first applied to simpler systems such as predator-prey models and self-sustaining ecosystems, can be applied to the study of microscopic biological systems.

  12. High performance computing applications in neurobiological research

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Cheng, Rei; Doshay, David G.; Linton, Samuel W.; Montgomery, Kevin; Parnas, Bruce R.

    1994-01-01

    The human nervous system is a massively parallel processor of information. The vast numbers of neurons, synapses and circuits are daunting to those seeking to understand the neural basis of consciousness and intellect. Pervading obstacles are the lack of knowledge of the detailed, three-dimensional (3-D) organization of even a simple neural system and the paucity of large scale, biologically relevant computer simulations. We use high performance graphics workstations and supercomputers to study the 3-D organization of gravity sensors as a prototype architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, three-dimensional versions run on the Cray Y-MP and CM5 supercomputers.

  13. 76 FR 8357 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-14

    ... Biological and Environmental Research Advisory Committee AGENCY: Department of Energy; Office of Science... Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... Research, SC-23/Germantown Building, 1000 Independence Avenue, SW., Washington, DC 20585-1290; phone:...

  14. 77 FR 55201 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ...This notice announces a teleconference of the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires that public notice of these meetings be announced in the Federal...

  15. 77 FR 55200 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ...This notice announces a meeting of the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires that public notice of these meetings be announced in the Federal...

  16. 77 FR 4028 - Biological and Environmental Research Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-26

    ...This notice announces a meeting of the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires that public notice of these meetings be announced in the Federal...

  17. Computational biology: plus c'est la même chose, plus ça change

    PubMed Central

    2011-01-01

    A report on the joint 19th Annual International Conference on Intelligent Systems for Molecular Biology (ISMB)/10th Annual European Conference on Computational Biology (ECCB) meetings and the 7th International Society for Computational Biology Student Council Symposium, Vienna, Austria, 15-19 July 2011. PMID:21861851

  18. Stochastic Effects in Computational Biology of Space Radiation Cancer Risk

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Pluth, Janis; Harper, Jane; O'Neill, Peter

    2007-01-01

    Estimating risk from space radiation poses important questions on the radiobiology of protons and heavy ions. We are considering systems biology models to study radiation induced repair foci (RIRF) at low doses, in which, on average, less than one track traverses the cell, and the subsequent DNA damage processing and signal transduction events. Computational approaches for describing protein regulatory networks coupled to DNA and oxidative damage sites include systems of differential equations, stochastic equations, and Monte-Carlo simulations. We review recent developments in the mathematical description of protein regulatory networks and possible approaches to radiation effects simulation. These include robustness, which states that regulatory networks maintain their functions against external and internal perturbations due to compensating properties of redundancy and molecular feedback controls, and modularity, which leads to general theorems for considering molecules that interact through a regulatory mechanism without exchange of matter leading to a block diagonal reduction of the connecting pathways. Identifying rate-limiting steps, robustness, and modularity in pathways perturbed by radiation damage is shown to be a valid technique for reducing large molecular systems to realistic computer simulations. Other techniques studied are the use of steady-state analysis, and the introduction of composite molecules or rate-constants to represent small collections of reactants. Applications of these techniques to describe spatial and temporal distributions of RIRF and cell populations following low dose irradiation are described.
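
    As a purely illustrative example of the stochastic approaches mentioned (not the authors' model), a Gillespie simulation of a birth-death process for repair foci shows how strongly fluctuations dominate when counts are low.

      import random

      # Illustrative Gillespie simulation (not the authors' model) of a birth-death
      # process for radiation-induced repair foci: foci form at rate k_form and each
      # resolves at rate k_res, so stochastic fluctuations dominate at low counts.
      k_form, k_res = 0.3, 0.1       # foci per hour; per-focus resolution rate per hour
      t, n, t_end = 0.0, 0, 24.0
      random.seed(1)

      trajectory = [(t, n)]
      while t < t_end:
          rate_total = k_form + k_res * n
          t += random.expovariate(rate_total)          # time to next event
          if random.random() < k_form / rate_total:
              n += 1                                   # a new focus forms
          else:
              n -= 1                                   # an existing focus resolves
          trajectory.append((t, n))

      print(f"foci at t ~ {trajectory[-1][0]:.1f} h: {trajectory[-1][1]}")
      print(f"steady-state mean expected around k_form/k_res = {k_form / k_res:.0f}")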

  19. Energy and time determine scaling in biological and computer designs.

    PubMed

    Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-08-19

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy-time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. PMID:27431524
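
    For readers unfamiliar with how such scaling exponents are estimated, the snippet below fits a power law to metabolic-rate data by least squares in log-log space; the data are synthetic, generated around an assumed 3/4 exponent purely for illustration.

      import numpy as np

      # How a scaling exponent of the kind discussed above is typically estimated:
      # a least-squares fit of log(metabolic rate) against log(body mass).
      # The data here are synthetic, generated around a 3/4 exponent for illustration.
      rng = np.random.default_rng(42)
      mass = np.logspace(0, 6, 30)                                  # grams, 6 orders of magnitude
      rate = 0.02 * mass**0.75 * rng.lognormal(0, 0.1, mass.size)   # watts, with noise

      slope, intercept = np.polyfit(np.log10(mass), np.log10(rate), 1)
      print(f"fitted scaling exponent: {slope:.3f} (generated with 0.75)")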

  20. Biological Research in Canisters (BRIC) - Light Emitting Diode (LED)

    NASA Technical Reports Server (NTRS)

    Levine, Howard G.; Caron, Allison

    2016-01-01

    The Biological Research in Canisters - LED (BRIC-LED) is a biological research system that is being designed to complement the capabilities of the existing BRIC-Petri Dish Fixation Unit (PDFU) for the Space Life and Physical Sciences (SLPS) Program. A diverse range of organisms can be supported, including plant seedlings, callus cultures, Caenorhabditis elegans, microbes, and others. In the event of a launch scrub, the entire assembly can be replaced with an identical back-up unit containing freshly loaded specimens.

  1. Connecting Biology and Organic Chemistry Introductory Laboratory Courses through a Collaborative Research Project

    ERIC Educational Resources Information Center

    Boltax, Ariana L.; Armanious, Stephanie; Kosinski-Collins, Melissa S.; Pontrello, Jason K.

    2015-01-01

    Modern research often requires collaboration of experts in fields, such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an…

  2. 78 FR 20924 - Center for Biologics Evaluation and Research eSubmitter Pilot Evaluation Program for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ...The Food and Drug Administration's (FDA's) Center for Biologics Evaluation and Research (CBER) is announcing an invitation to sponsors of investigational new drug (IND) applications to participate in a pilot evaluation program for CBER's eSubmitter Program (eSubmitter). CBER's eSubmitter is a computer-assisted automated program that has been customized to facilitate the creation of IND......

  3. Computational Nanotechnology at NASA Ames Research Center, 1996

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Langhoff, Steve; Pohorille, Andrew; Levit, Creon; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Some forms of nanotechnology appear to have enormous potential to improve aerospace and computer systems; computational nanotechnology, the design and simulation of programmable molecular machines, is crucial to progress. NASA Ames Research Center has begun a computational nanotechnology program including in-house work, external research grants, and grants of supercomputer time. Four goals have been established: (1) Simulate a hypothetical programmable molecular machine replicating itself and building other products. (2) Develop molecular manufacturing CAD (computer aided design) software and use it to design molecular manufacturing systems and products of aerospace interest, including computer components. (3) Characterize nanotechnologically accessible materials of aerospace interest. Such materials may have excellent strength and thermal properties. (4) Collaborate with experimentalists. Current in-house activities include: (1) Development of NanoDesign, software to design and simulate a nanotechnology based on functionalized fullerenes. Early work focuses on gears. (2) A design for high density atomically precise memory. (3) Design of nanotechnology systems based on biology. (4) Characterization of diamondoid mechanosynthetic pathways. (5) Studies of the laplacian of the electronic charge density to understand molecular structure and reactivity. (6) Studies of entropic effects during self-assembly. Characterization of properties of matter for clusters up to sizes exhibiting bulk properties. In addition, the NAS (NASA Advanced Supercomputing) supercomputer division sponsored a workshop on computational molecular nanotechnology on March 4-5, 1996 held at NASA Ames Research Center. Finally, collaborations with Bill Goddard at CalTech, Ralph Merkle at Xerox Parc, Don Brenner at NCSU (North Carolina State University), Tom McKendree at Hughes, and Todd Wipke at UCSC are underway.

  4. COMPUTATIONAL METHODS FOR STUDYING THE INTERACTION BETWEEN POLYCYCLIC AROMATIC HYDROCARBONS AND BIOLOGICAL MACROMOLECULES

    EPA Science Inventory

    Computational Methods for Studying the Interaction between Polycyclic Aromatic Hydrocarbons and Biological Macromolecules .

    The mechanisms for the processes that result in significant biological activity of PAHs depend on the interaction of these molecules or their metabol...

  5. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of these data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
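
    As a minimal example of the kind of embarrassingly parallel task such environments accelerate (the reads are synthetic; a real workflow would stream millions of sequences from FASTQ), per-read GC content can be computed in a local process pool.

      from multiprocessing import Pool

      def gc_content(seq: str) -> float:
          """Fraction of G and C bases in one sequence."""
          seq = seq.upper()
          return (seq.count("G") + seq.count("C")) / len(seq)

      if __name__ == "__main__":
          # Synthetic stand-ins for reads; a real run would stream millions from FASTQ.
          reads = ["ACGTACGTGGCC", "TTTTAAAACCGG", "GGGCGCGCATAT", "ACACACACGTGT"] * 1000

          with Pool(processes=4) as pool:               # embarrassingly parallel map
              gcs = pool.map(gc_content, reads, chunksize=500)

          print(f"{len(gcs)} reads, mean GC = {sum(gcs) / len(gcs):.3f}")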

  6. Characterization of an orthovoltage biological irradiator used for radiobiological research

    PubMed Central

    Azimi, Rezvan; Alaei, Parham; Spezi, Emiliano; Hui, Susanta K.

    2015-01-01

    Orthovoltage irradiators are routinely used to irradiate specimens and small animals in biological research. There are several reports on the characteristics of these units for small field irradiations. However, there is limited knowledge about use of these units for large fields, which are essential for emerging large-field irregular shape irradiations, namely total marrow irradiation used as a conditioning regimen for hematological malignancies. This work describes characterization of a self-contained Orthovoltage biological irradiator for large fields using measurements and Monte Carlo simulations that could be used to compute the dose for in vivo or in vitro studies for large-field irradiation using this or a similar unit. Percentage depth dose, profiles, scatter factors, and half-value layers were measured and analyzed. A Monte Carlo model of the unit was created and used to generate depth dose and profiles, as well as scatter factors. An ion chamber array was also used for profile measurements of flatness and symmetry. The output was determined according to AAPM Task Group 61 guidelines. The depth dose measurements compare well with published data for similar beams. The Monte Carlo–generated depth dose and profiles match our measured doses to within 2%. Scatter factor measurements indicate gradual variation of these factors with field size. Dose rate measured by placing the ion chamber atop the unit's steel plate or solid water indicate enhanced readings of 5 to 28% compared with those measured in air. The stability of output over a 5-year period is within 2% of the 5-year average. PMID:25694476

  7. Characterization of an orthovoltage biological irradiator used for radiobiological research.

    PubMed

    Azimi, Rezvan; Alaei, Parham; Spezi, Emiliano; Hui, Susanta K

    2015-05-01

    Orthovoltage irradiators are routinely used to irradiate specimens and small animals in biological research. There are several reports on the characteristics of these units for small field irradiations. However, there is limited knowledge about use of these units for large fields, which are essential for emerging large-field irregular shape irradiations, namely total marrow irradiation used as a conditioning regimen for hematological malignancies. This work describes characterization of a self-contained Orthovoltage biological irradiator for large fields using measurements and Monte Carlo simulations that could be used to compute the dose for in vivo or in vitro studies for large-field irradiation using this or a similar unit. Percentage depth dose, profiles, scatter factors, and half-value layers were measured and analyzed. A Monte Carlo model of the unit was created and used to generate depth dose and profiles, as well as scatter factors. An ion chamber array was also used for profile measurements of flatness and symmetry. The output was determined according to AAPM Task Group 61 guidelines. The depth dose measurements compare well with published data for similar beams. The Monte Carlo-generated depth dose and profiles match our measured doses to within 2%. Scatter factor measurements indicate gradual variation of these factors with field size. Dose rate measured by placing the ion chamber atop the unit's steel plate or solid water indicate enhanced readings of 5 to 28% compared with those measured in air. The stability of output over a 5-year period is within 2% of the 5-year average. PMID:25694476

  8. Computing fuzzy associations for the analysis of biological literature.

    PubMed

    Perez-Iratxeta, Carolina; Keer, Harindar S; Bork, Peer; Andrade, Miguel A

    2002-06-01

    The increase of information in biology makes it difficult for researchers in any field to keep current with the literature. The MEDLINE database of scientific abstracts can be quickly scanned using electronic mechanisms. Potentially interesting abstracts can be selected by matching words joined by Boolean operators. However, this means of selecting documents is not optimal. Nonspecific queries have to be issued, resulting in large numbers of irrelevant abstracts that have to be manually scanned. To facilitate this analysis, we have developed a system that compiles a summary of subjects and related documents from the results of a MEDLINE query. For this, we have applied a fuzzy binary relation formalism that deduces relations between words present in a set of abstracts preprocessed with a standard grammatical tagger. Those relations are used to derive ensembles of related words and their associated subsets of abstracts. The algorithm can be used publicly at http://www.bork.embl-heidelberg.de/xplormed/. PMID:12074170
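
    A toy version of the idea, not the XplorMed formalism itself: grade the association between word pairs by their normalized co-occurrence across abstracts (a stop list stands in here for the grammatical tagger), then keep pairs whose degree passes a threshold.

      from itertools import combinations
      from collections import defaultdict

      # Toy degree-of-association between words (not the XplorMed formalism): normalized
      # co-occurrence across abstracts, thresholded to group related words.
      abstracts = [
          "insulin receptor signaling in liver cells",
          "insulin resistance and receptor expression in muscle",
          "tumor suppressor p53 pathway in cancer cells",
          "p53 mutations drive tumor growth",
      ]
      stop = {"in", "and", "the", "of"}          # stands in for the grammatical tagger
      docs = [set(a.split()) - stop for a in abstracts]

      count = defaultdict(int)    # word -> number of abstracts containing it
      cooc = defaultdict(int)     # (word1, word2) -> number of abstracts containing both
      for words in docs:
          for w in words:
              count[w] += 1
          for w1, w2 in combinations(sorted(words), 2):
              cooc[(w1, w2)] += 1

      # Fuzzy degree of association: co-occurrences relative to the rarer word's frequency.
      degree = {pair: n / min(count[pair[0]], count[pair[1]]) for pair, n in cooc.items()}
      related = [(pair, d) for pair, d in degree.items()
                 if d >= 1.0 and min(count[pair[0]], count[pair[1]]) > 1]
      print(sorted(related))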

  9. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  10. Bridging Emotion Research: From Biology to Social Structure

    ERIC Educational Resources Information Center

    Rogers, Kimberly B.; Kavanagh, Liam

    2010-01-01

    Emotion research demonstrates that problems of theoretical interest or practical significance are not divided neatly along disciplinary boundaries. Researchers acknowledge both organic and social underpinnings of emotion, but the intersections between biological and structural processes can be difficult to negotiate. In this article, the authors…

  11. Subject Didactic Studies of Research Training in Biology and Physics.

    ERIC Educational Resources Information Center

    Lybeck, Leif

    1984-01-01

    The objectives and design of a 3-year study of research training and supervision in biology and physics are discussed. Scientific problems arising from work on the thesis will be a focus for the postgraduate students and their supervisors. Attention will be focused on supervisors' and students' conceptions of science, subject range, research,…

  12. Reduction of dynamical biochemical reactions networks in computational biology

    PubMed Central

    Radulescu, O.; Gorban, A. N.; Zinovyev, A.; Noel, V.

    2012-01-01

    Biochemical networks are used in computational biology, to model mechanistic details of systems involved in cell signaling, metabolism, and regulation of gene expression. Parametric and structural uncertainty, as well as combinatorial explosion are strong obstacles against analyzing the dynamics of large models of this type. Multiscaleness, an important property of these networks, can be used to get past some of these obstacles. Networks with many well separated time scales, can be reduced to simpler models, in a way that depends only on the orders of magnitude and not on the exact values of the kinetic parameters. The main idea used for such robust simplifications of networks is the concept of dominance among model elements, allowing hierarchical organization of these elements according to their effects on the network dynamics. This concept finds a natural formulation in tropical geometry. We revisit, in the light of these new ideas, the main approaches to model reduction of reaction networks, such as quasi-steady state (QSS) and quasi-equilibrium approximations (QE), and provide practical recipes for model reduction of linear and non-linear networks. We also discuss the application of model reduction to the problem of parameter identification, via backward pruning machine learning techniques. PMID:22833754
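
    The textbook instance of the quasi-steady-state reduction mentioned above is the Michaelis-Menten approximation; the sketch below compares the full mass-action enzyme mechanism with the reduced rate law numerically, under assumed rate constants.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Textbook QSS reduction: full mass-action kinetics E + S <-> ES -> E + P versus
      # the reduced Michaelis-Menten rate v = Vmax*S/(Km + S), valid when ES equilibrates
      # fast relative to catalysis and E0 << S0 + Km. Rate constants are assumed values.
      k1, km1, k2 = 100.0, 100.0, 10.0
      E0, S0 = 1.0, 50.0
      Km, Vmax = (km1 + k2) / k1, k2 * E0

      def full(t, y):
          S, ES, P = y
          E = E0 - ES
          return [-k1*E*S + km1*ES, k1*E*S - (km1 + k2)*ES, k2*ES]

      def reduced(t, y):
          S = y[0]
          return [-Vmax * S / (Km + S)]

      t_eval = np.linspace(0, 10, 50)
      sol_full = solve_ivp(full, (0, 10), [S0, 0.0, 0.0], method="LSODA",
                           t_eval=t_eval, rtol=1e-8)
      sol_red = solve_ivp(reduced, (0, 10), [S0], method="LSODA",
                          t_eval=t_eval, rtol=1e-8)

      err = np.max(np.abs(sol_full.y[0] - sol_red.y[0]))
      print(f"max |S_full - S_reduced| over 10 time units: {err:.3f}  (S0 = {S0})")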

  13. The dilemma of dual use biological research: Polish perspective.

    PubMed

    Czarkowski, Marek

    2010-03-01

    Biological research with legitimate scientific purpose that may be misused to pose a biological threat to public health and/or national security is termed dual use. In Poland there are adequate conditions for conducting experiments that could be qualified as dual use research, and therefore, a risk of attack on Poland or other countries exists. Optimal solutions for limiting such threats are required, and the national system of biosecurity should enable early, reliable, and complete identification of this type of research. Scientists should have a fundamental role in this process, their duty being to immediately, upon identification, report research with dual use potential. An important entity in the identification system of dual use research should also be the Central Register of Biological and Biomedical Research, which gathers information about all biological and biomedical research being conducted in a given country. Publishers, editors, and review committees of journals and other scientific publications should be involved in evaluating results of clinical trials. The National Council of Biosecurity should be the governmental institution responsible for developing a system of dual use research threat prevention. Its role would be to develop codes of conduct, form counsel of expertise, and monitor the problem at national level, while the Dual Use Research Committee would be responsible for individual cases. In Poland, current actions aiming to provide biological safety were based on developing and passing an act about genetically modified organisms (GMO's) and creating a GMO Committee. Considering experiences of other nations, one should view these actions as fragmentary, and thus insufficient protection against dual use research threats. PMID:18546061

  14. Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition

    NASA Astrophysics Data System (ADS)

    Fitch, W. Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a
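
    The computational point about supra-regular patterns can be illustrated with the classic a^n b^n language: recognizing it requires a push-down (stack-like) memory, which a few lines of code make explicit.

      def accepts_anbn(s: str) -> bool:
          """Recognize strings of the form a^n b^n (n >= 1) using a single stack.

          The point made above: this language lies beyond any finite-state machine,
          but becomes trivial once a push-down (stack-like) memory is available.
          Here the stack only ever holds one symbol type, so a counter suffices.
          """
          stack = 0
          seen_b = False
          for ch in s:
              if ch == "a":
                  if seen_b:
                      return False          # an 'a' after 'b's breaks the pattern
                  stack += 1                # push
              elif ch == "b":
                  seen_b = True
                  if stack == 0:
                      return False          # nothing left to pop
                  stack -= 1                # pop
              else:
                  return False
          return stack == 0 and seen_b

      for s in ["ab", "aaabbb", "aabbb", "abab", ""]:
          print(f"{s!r:10} -> {accepts_anbn(s)}")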

  15. Biological and Physical Space Research Laboratory 2002 Science Review

    NASA Technical Reports Server (NTRS)

    Curreri, P. A. (Editor); Robinson, M. B. (Editor); Murphy, K. L. (Editor)

    2003-01-01

    With the International Space Station Program approaching core complete, our NASA Headquarters sponsor, the new Code U Enterprise, Biological and Physical Research, is shifting its research emphasis from purely fundamental microgravity and biological sciences to strategic research aimed at enabling human missions beyond Earth orbit. Although we anticipate supporting microgravity research on the ISS for some time to come, our laboratory has been vigorously engaged in developing these new strategic research areas.This Technical Memorandum documents the internal science research at our laboratory as presented in a review to Dr. Ann Whitaker, MSFC Science Director, in July 2002. These presentations have been revised and updated as appropriate for this report. It provides a snapshot of the internal science capability of our laboratory as an aid to other NASA organizations and the external scientific community.

  16. NASA Space Biology Plant Research for 2010-2020

    NASA Technical Reports Server (NTRS)

    Levine, H. G.; Tomko, D. L.; Porterfield, D. M.

    2012-01-01

    The U.S. National Research Council (NRC) recently published "Recapturing a Future for Space Exploration: Life and Physical Sciences Research for a New Era" (http://www.nap.edu/catalog.php?record_id=13048), and NASA completed a Space Biology Science Plan to develop a strategy for implementing its recommendations (http://www.nasa.gov/exploration/library/esmd documents.html). The most important recommendations of the NRC report on plant biology in space were that NASA should: (1) investigate the roles of microbial-plant systems in long-term bioregenerative life support systems, and (2) establish a robust spaceflight program of research analyzing plant growth and physiological responses to the multiple stimuli encountered in spaceflight environments. These efforts should take advantage of recently emerged analytical technologies (genomics, transcriptomics, proteomics, metabolomics) and apply modern cellular and molecular approaches in the development of a vigorous flight-based and ground-based research program. This talk will describe NASA's strategy and plans for implementing these NRC Plant Space Biology recommendations. New research capabilities for Plant Biology, optimized by providing state-of-the-art automated technology and analytical techniques to maximize scientific return, will be described. Flight experiments will use the most appropriate platform to achieve science results (e.g., ISS, free flyers, sub-orbital flights) and NASA will work closely with its international partners and other U.S. agencies to achieve its objectives. One of NASA's highest priorities in Space Biology is the development of research capabilities for use on the International Space Station and other flight platforms for studying multiple generations of large plants. NASA will issue recurring NASA Research Announcements (NRAs) that include a rapid turn-around model to more fully engage the biology community in designing experiments to respond to the NRC recommendations. In doing so, NASA

  17. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. PMID:25490932

  18. An open investigation of the reproducibility of cancer biology research

    PubMed Central

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. DOI: http://dx.doi.org/10.7554/eLife.04333.001 PMID:25490932

  19. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    SciTech Connect

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high throughput determinations of interactomes, together with high resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  20. CFD research, parallel computation and aerodynamic optimization

    NASA Technical Reports Server (NTRS)

    Ryan, James S.

    1995-01-01

    Over five years of research in Computational Fluid Dynamics and its applications are covered in this report. Using CFD as an established tool, aerodynamic optimization on parallel architectures is explored. The objective of this work is to provide better tools to vehicle designers. Submarine design requires accurate force and moment calculations in flow with thick boundary layers and large separated vortices. Low noise production is critical, so flow into the propulsor region must be predicted accurately. The High Speed Civil Transport (HSCT) has been the subject of recent work. This vehicle is to be a passenger vehicle with the capability of cutting overseas flight times by more than half. A successful design must surpass the performance of comparable planes. Fuel economy, other operational costs, environmental impact, and range must all be improved substantially. For all these reasons, improved design tools are required, and these tools must eventually integrate optimization, external aerodynamics, propulsion, structures, heat transfer and other disciplines.

  1. Theoretical and computational models of biological ion channels

    NASA Astrophysics Data System (ADS)

    Roux, Benoit

    2004-03-01

    A theoretical framework for describing ion conduction through biological molecular pores is established and explored. The framework is based on a statistical mechanical formulation of the transmembrane potential (1) and of the equilibrium multi-ion potential of mean forces through selective ion channels (2). On the basis of these developments, it is possible to define computational schemes to address questions about the non-equilibrium flow of ions through ion channels. In the case of narrow channels (gramicidin or KcsA), it is possible to characterize the ion conduction in terms of the potential of mean force of the ions along the channel axis (i.e., integrating out the off-axis motions). This has been used for gramicidin (3) and for KcsA (4,5). In the case of wide pores (i.e., OmpF porin), this is no longer a good idea, but it is possible to use a continuum solvent approximation. In this case, a grand canonical Monte Carlo Brownian dynamics algorithm was constructed for simulating the non-equilibrium flow of ions through wide pores. The results were compared with those from the Poisson-Nernst-Planck mean-field electrodiffusion theory (6-8). References: 1. B. Roux, Biophys. J. 73:2980-2989 (1997); 2. B. Roux, Biophys. J. 77, 139-153 (1999); 3. Allen, Andersen and Roux, PNAS (2004, in press); 4. Berneche and Roux. Nature, 414:73-77 (2001); 5. Berneche and Roux. PNAS, 100:8644-8648 (2003); 6. W. Im and S. Seefeld and B. Roux, Biophys. J. 79:788-801 (2000); 7. W. Im and B. Roux, J. Chem. Phys. 115:4850-4861 (2001); 8. W. Im and B. Roux, J. Mol. Biol. 322:851-869 (2002).
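
    For orientation only: the Poisson-Nernst-Planck mean-field electrodiffusion theory cited above (refs. 6-8) couples a drift-diffusion flux for each ion species to the electrostatic potential. A minimal textbook statement, in LaTeX and with generic symbols that are not taken from the cited papers, is:

        % Nernst-Planck flux for ion species i: D_i diffusion coefficient, c_i concentration,
        % z_i valence, phi electrostatic potential
        \mathbf{J}_i = -D_i \left( \nabla c_i + \frac{z_i e}{k_B T}\, c_i \nabla \phi \right)

        % coupled to the Poisson equation (varepsilon: permittivity, rho_f: fixed charge density)
        \nabla \cdot \left( \varepsilon \nabla \phi \right) = -\, e \sum_i z_i c_i - \rho_f

        % and steady-state continuity for each species
        \nabla \cdot \mathbf{J}_i = 0

    Solving this coupled system self-consistently under applied voltage and concentration boundary conditions yields the ionic currents that are compared against the grand canonical Monte Carlo Brownian dynamics simulations.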

  2. Computational approaches to selecting and optimising targets for structural biology.

    PubMed

    Overton, Ian M; Barton, Geoffrey J

    2011-09-01

    Selection of protein targets for study is central to structural biology and may be influenced by numerous factors. A key aim is to maximise returns for effort invested by identifying proteins with the balance of biophysical properties that are conducive to success at all stages (e.g. solubility, crystallisation) in the route towards a high resolution structural model. Selected targets can be optimised through construct design (e.g. to minimise protein disorder), switching to a homologous protein, and selection of experimental methodology (e.g. choice of expression system) to prime for efficient progress through the structural proteomics pipeline. Here we discuss computational techniques in target selection and optimisation, with more detailed focus on tools developed within the Scottish Structural Proteomics Facility (SSPF); namely XANNpred, ParCrys, OB-Score (target selection) and TarO (target optimisation). TarO runs a large number of algorithms, searching for homologues and annotating the pool of possible alternative targets. This pool of putative homologues is presented in a ranked, tabulated format and results are also visualised as an automatically generated and annotated multiple sequence alignment. The target selection algorithms each predict the propensity of a selected protein target to progress through the experimental stages leading to diffracting crystals. This single predictor approach has advantages for target selection, when compared with an approach using two or more predictors that each predict for success at a single experimental stage. The tools described here helped SSPF achieve a high (21%) success rate in progressing cloned targets to diffraction-quality crystals. PMID:21906678
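
    As a purely hypothetical illustration of the single-score idea described above (this is not the XANNpred, ParCrys, or OB-Score algorithm; the features, weights, and names below are invented for demonstration), a toy propensity ranker might look like this in Python:

        # Hypothetical toy example: rank protein targets by a crude propensity score
        # built from sequence-derived features. Not an implementation of any SSPF tool.
        from dataclasses import dataclass

        @dataclass
        class TargetFeatures:
            length: int                 # sequence length (residues)
            predicted_disorder: float   # fraction of residues predicted disordered (0-1)
            gravy: float                # grand average of hydropathy
            low_complexity: float       # fraction of low-complexity sequence (0-1)

        def toy_propensity(f: TargetFeatures) -> float:
            """Return a 0-1 score; higher = more promising target (illustrative weights only)."""
            score = 1.0
            score -= 0.4 * f.predicted_disorder          # disorder tends to hinder crystallisation
            score -= 0.2 * f.low_complexity              # low-complexity regions penalised
            score -= 0.2 * min(abs(f.gravy), 1.0)        # extreme hydropathy penalised
            score -= 0.2 * min(f.length / 2000.0, 1.0)   # very long constructs penalised
            return max(0.0, min(1.0, score))

        candidates = {
            "targetA": TargetFeatures(length=310, predicted_disorder=0.05, gravy=-0.2, low_complexity=0.02),
            "targetB": TargetFeatures(length=950, predicted_disorder=0.35, gravy=0.6, low_complexity=0.15),
        }
        for name, feats in sorted(candidates.items(), key=lambda kv: -toy_propensity(kv[1])):
            print(f"{name}: {toy_propensity(feats):.2f}")

    Real predictors of this kind are trained on experimental outcomes from structural genomics pipelines rather than hand-set weights; the sketch only shows why a single combined score is convenient for ranking candidate targets.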

  3. Computational fire modeling for aircraft fire research

    SciTech Connect

    Nicolette, V.F.

    1996-11-01

    This report summarizes work performed by Sandia National Laboratories for the Federal Aviation Administration. The technical issues involved in fire modeling for aircraft fire research are identified, as well as computational fire tools for addressing those issues, and the research which is needed to advance those tools in order to address long-range needs. Fire field models are briefly reviewed, and the VULCAN model is selected for further evaluation. Calculations are performed with VULCAN to demonstrate its applicability to aircraft fire problems, and also to gain insight into the complex problem of fires involving aircraft. Simulations are conducted to investigate the influence of fire on an aircraft in a cross-wind. The interaction of the fuselage, wind, fire, and ground plane is investigated. Calculations are also performed utilizing a large eddy simulation (LES) capability to describe the large-scale turbulence instead of the more common k-epsilon turbulence model. Additional simulations are performed to investigate the static pressure and velocity distributions around a fuselage in a cross-wind, with and without fire. The results of these simulations provide qualitative insight into the complex interaction of a fuselage, fire, wind, and ground plane. Reasonable quantitative agreement is obtained in the few cases for which data or other modeling results exist. Finally, VULCAN is used to quantify the impact of simplifying assumptions inherent in a risk assessment compatible fire model developed for open pool fire environments. The assumptions are seen to be of minor importance for the particular problem analyzed. This work demonstrates the utility of using a fire field model for assessing the limitations of simplified fire models. In conclusion, the application of computational fire modeling tools herein provides both qualitative and quantitative insights into the complex problem of aircraft in fires.
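
    For reference, the k-epsilon model mentioned above closes the Reynolds-averaged equations with an eddy viscosity built from the turbulent kinetic energy k and its dissipation rate epsilon; a standard textbook relation (not specific to VULCAN) is:

        % eddy viscosity in the standard k-epsilon model
        \nu_t = C_\mu \frac{k^2}{\epsilon}, \qquad C_\mu \approx 0.09

    In large eddy simulation, by contrast, the energy-containing large scales are resolved directly on the grid and only the sub-grid scales are modelled.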

  4. Fundamental Biological Research on the International Space Station

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Yost, Bruce; Fletcher, L.; Dalton, Bonnie P. (Technical Monitor)

    2000-01-01

    The Fundamental Biology Program of NASA's Life Sciences Division is chartered with enabling and sponsoring research on the International Space Station (ISS) in order to understand the effects of the space flight environment, particularly microgravity, on living systems. To accomplish this goal, NASA Ames Research Center (ARC) has been tasked with managing the development of a number of biological habitats, along with their support systems infrastructure. This integrated suite of habitats and support systems is being designed to support research requirements identified by the scientific community. As such, it will support investigations using cells and tissues, avian eggs, insects, plants, aquatic organisms and rodents. Studies following organisms through complete life cycles and over multiple generations will eventually be possible. As an adjunct to the development of these basic habitats, specific analytical and monitoring technologies are being targeted for maturation to complete the research cycle by transferring existing or emerging analytical techniques, sensors, and processes from the laboratory bench to the ISS research platform.

  5. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  6. Animats: computer-simulated animals in behavioral research.

    PubMed

    Watts, J M

    1998-10-01

    The term animat refers to a class of simulated animals. This article is intended as a nontechnical introduction to animat research. Animats can be robots interacting with the real world or computer simulations. In this article, the use of computer-generated animats is emphasized. The scientific use of animats has been pioneered by artificial intelligence and artificial life researchers. Behavior-based artificial intelligence uses animats capable of autonomous and adaptive activity as conceptual tools in the design of usefully intelligent systems. Artificial life proponents view some human artifacts, including informational structures that show adaptive behavior and self-replication, as animats may do, as analogous to biological organisms. Animat simulations may be used for rapid and inexpensive evaluation of new livestock environments or management techniques. The animat approach is a powerful heuristic for understanding the mechanisms that underlie behavior. The simple rules and capabilities of animat models generate emergent and sometimes unpredictable behavior. Adaptive variability in animat behavior may be exploited using artificial neural networks. These have computational properties similar to natural neurons and are capable of learning. Artificial neural networks can control behavior at all levels of an animat's functional organization. Improving the performance of animats often requires genetic programming. Genetic algorithms are computer programs that are capable of self-replication, simulating biological reproduction. Animats may thus evolve over generations. Selective forces may be provided by a human overseer or be part of the simulated environment. Animat techniques allow researchers to culture behavior outside the organism that usually produces it. This approach could contribute new insights in theoretical ethology on questions including the origins of social behavior and cooperation, adaptation, and the emergent nature of complex behavior. Animat
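
    A minimal sketch of the genetic-algorithm idea described above, in Python (the genome, fitness function, and parameters are placeholders; a real animat study would score behaviour in a simulated environment):

        # Toy genetic algorithm evolving a parameter vector that could control an animat.
        # The fitness function is a stand-in for behavioural performance in a simulation.
        import random

        GENOME_LEN = 8        # e.g. weights of a tiny neural controller
        POP_SIZE = 30
        GENERATIONS = 50

        def fitness(genome):
            # Placeholder objective; replace with foraging success, survival time, etc.
            return -sum((g - 0.5) ** 2 for g in genome)

        def mutate(genome, rate=0.1):
            return [g + random.gauss(0, 0.1) if random.random() < rate else g for g in genome]

        def crossover(a, b):
            cut = random.randrange(1, GENOME_LEN)
            return a[:cut] + b[cut:]

        population = [[random.random() for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
        for generation in range(GENERATIONS):
            population.sort(key=fitness, reverse=True)
            parents = population[: POP_SIZE // 2]          # truncation selection
            offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                         for _ in range(POP_SIZE - len(parents))]
            population = parents + offspring

        print("best fitness:", fitness(max(population, key=fitness)))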

  7. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  8. Applying the community partnership approach to human biology research.

    PubMed

    Ravenscroft, Julia; Schell, Lawrence M; Cole, Tewentahawih'tha'

    2015-01-01

    Contemporary human biology research employs a unique skillset for biocultural analysis. This skillset is highly appropriate for the study of health disparities because disparities result from the interaction of social and biological factors over one or more generations. Health disparities research almost always involves disadvantaged communities owing to the relationship between social position and health in stratified societies. Successful research with disadvantaged communities involves a specific approach, the community partnership model, which creates a relationship beneficial for researcher and community. Paramount is the need for trust between partners. With trust established, partners share research goals, agree on research methods and produce results of interest and importance to all partners. Results are shared with the community as they are developed; community partners also provide input on analyses and interpretation of findings. This article describes a partnership-based, 20-year relationship between community members of the Akwesasne Mohawk Nation and researchers at the University at Albany. As with many communities facing health disparity issues, research with Native Americans and indigenous peoples generally is inherently politicized. For Akwesasne, the contamination of their lands and waters is an environmental justice issue in which the community has faced unequal exposure to, and harm by, environmental toxicants. As human biologists engage in more partnership-type research, it is important to understand the long-term goals of the community and what is at stake so the research circle can be closed and 'helicopter' style research avoided. PMID:25380288

  9. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  10. A proposal for augmenting biological model construction with a semi-intelligent computational modeling assistant

    PubMed Central

    Christley, Scott; An, Gary

    2013-01-01

    The translational challenge in biomedical research lies in the effective and efficient transfer of mechanistic knowledge from one biological context to another. Implicit in this process is the establishment of causality from correlation in the form of mechanistic hypotheses. Effectively addressing the translational challenge requires the use of automated methods, including the ability to computationally capture the dynamic aspect of putative hypotheses such that they can be evaluated in a high throughput fashion. Ontologies provide structure and organization to biomedical knowledge; converting these representations into executable models/simulations is the next necessary step. Researchers need the ability to map their conceptual models into a model specification that can be transformed into an executable simulation program. We suggest this mapping process, which approximates certain steps in the development of a computational model, can be expressed as a set of logical rules, and a semi-intelligent computational agent, the Computational Modeling Assistant (CMA), can perform reasoning to develop a plan to achieve the construction of an executable model. Presented herein is a description and implementation for a model construction reasoning process between biomedical and simulation ontologies that is performed by the CMA to produce the specification of an executable model that can be used for dynamic knowledge representation. PMID:23990750
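
    A hypothetical sketch of the general idea (not the authors' CMA implementation; the rule names and constructs below are invented): a small rule table maps ontology-style annotations of a conceptual model onto executable simulation constructs, producing a build plan for the model specification.

        # Hypothetical illustration of rule-based mapping from biomedical annotations
        # to simulation constructs; not the published Computational Modeling Assistant.
        RULES = [
            # (annotation predicate, simulation construct to instantiate)
            (lambda ann: ann["type"] == "cell_population",     "agent_class"),
            (lambda ann: ann["type"] == "secreted_mediator",   "diffusible_field"),
            (lambda ann: ann["type"] == "receptor_activation", "state_transition_rule"),
        ]

        def plan_model(annotations):
            """Return a build plan: which executable construct realises each annotated entity."""
            plan = []
            for ann in annotations:
                for predicate, construct in RULES:
                    if predicate(ann):
                        plan.append({"entity": ann["name"], "construct": construct})
                        break
            return plan

        conceptual_model = [
            {"name": "macrophage", "type": "cell_population"},
            {"name": "TNF",        "type": "secreted_mediator"},
            {"name": "TNFR1",      "type": "receptor_activation"},
        ]
        print(plan_model(conceptual_model))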

  11. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals

  12. Biomedical Research Experiences for Biology Majors at a Small College

    ERIC Educational Resources Information Center

    Stover, Shawn K.; Mabry, Michelle L.

    2010-01-01

    A program-level assessment of the biology curriculum at a small liberal arts college validates a previous study demonstrating success in achieving learning outcomes related to content knowledge and communication skills. Furthermore, research opportunities have been provided to complement pedagogical strategies and give students a more complete…

  13. Glimpses of Biological Research and Education in Cuba.

    ERIC Educational Resources Information Center

    Margulis, Lynn; Kunz, Thomas H.

    1984-01-01

    Discusses Cuban medical facilities, biological research (focusing on sugarcane tissue culture, interferon, hybrid cattle, tropical fruits, and yeast biosynthetic pathways), science education programs at all levels, and institutions of higher education. Also examines such concerns as the Cuban literacy rate and efforts to improve the environment.…

  14. Biologically Enhanced Carbon Sequestration: Research Needs and Opportunities

    SciTech Connect

    Oldenburg, Curtis; Oldenburg, Curtis M.; Torn, Margaret S.

    2008-03-21

    Fossil fuel combustion, deforestation, and biomass burning are the dominant contributors to increasing atmospheric carbon dioxide (CO2) concentrations and global warming. Many approaches to mitigating CO2 emissions are being pursued, and among the most promising are terrestrial and geologic carbon sequestration. Recent advances in ecology and microbial biology offer promising new possibilities for enhancing terrestrial and geologic carbon sequestration. A workshop was held October 29, 2007, at Lawrence Berkeley National Laboratory (LBNL) on Biologically Enhanced Carbon Sequestration (BECS). The workshop participants (approximately 30 scientists from California, Illinois, Oregon, Montana, and New Mexico) developed a prioritized list of research needed to make progress in the development of biological enhancements to improve terrestrial and geologic carbon sequestration. The workshop participants also identified a number of areas of supporting science that are critical to making progress in the fundamental research areas. The purpose of this position paper is to summarize and elaborate upon the findings of the workshop. The paper considers terrestrial and geologic carbon sequestration separately. First, we present a summary in outline form of the research roadmaps for terrestrial and geologic BECS. This outline is elaborated upon in the narrative sections that follow. The narrative sections start with the focused research priorities in each area followed by critical supporting science for biological enhancements as prioritized during the workshop. Finally, Table 1 summarizes the potential significance or 'materiality' of advances in these areas for reducing net greenhouse gas emissions.

  15. Trends in Computing for Climate Research

    NASA Astrophysics Data System (ADS)

    Lawrence, B.

    2014-12-01

    The grand challenges of climate science will stress our informatics infrastructure severely in the next decade. Our drive for ever greater simulation resolution/complexity/length/repetition, coupled with new remote and in-situ sensing platforms present us with problems in computation, data handling, and information management, to name but three. These problems are compounded by the background trends: Moore's Law is no longer doing us any favours: computing is getting harder to exploit as we have to bite the parallelism bullet, and Kryder's Law (if it ever existed) isn't going to help us store the data volumes we can see ahead. The variety of data, the rate it arrives, and the complexity of the tools we need and use, all strain our ability to cope. The solutions, as ever, will revolve around more and better software, but "more" and "better" will require some attention. In this talk we discuss how these issues have played out in the context of CMIP5, and might be expected to play out in CMIP6 and successors. Although the CMIPs will provide the thread, we will digress into modelling per se, regional climate modelling (CORDEX), observations from space (Obs4MIPs and friends), climate services (as they might play out in Europe), and the dependency of progress on how we manage people in our institutions. It will be seen that most of the issues we discuss apply to the wider environmental sciences, if not science in general. They all have implications for the need for both sustained infrastructure and ongoing research into environmental informatics.

  16. Spacecraft computer technology at Southwest Research Institute

    NASA Technical Reports Server (NTRS)

    Shirley, D. J.

    1993-01-01

    Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.

  17. X-38 research aircraft deorbit - computer animation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In the mid-1990's researchers at the NASA Dryden Flight Research Center, Edwards, California, and Johnson Space Center in Houston, Texas, began working actively with the sub-scale X-38 prototype crew return vehicle (CRV). This was an unpiloted lifting body designed at 80 percent of the size of a projected emergency crew return vehicle for the International Space Station. The X-38 and the actual CRV are patterned after a lifting-body shape first employed in the Air Force X-23 (SV-5) program in the mid-1960's and the Air Force-NASA X-24A lifting-body project in the early to mid-1970's. Built by Scaled Composites, Inc., in Mojave, California, and outfitted with avionics, computer systems, and other hardware at Johnson Space Center, two X-38 aircraft were involved in flight research at Dryden beginning in July of 1997. Before that, however, Dryden conducted some 13 flights at a drop zone near California City, California. These tests were done with a 1/6-scale model of the X-38 to test the parafoil concept that would be employed on the X-38 and the actual CRV. The basic concept is that the actual CRV will use an inertial navigation system together with the Global Positioning System of satellites to guide it from the International Space Station into the earth's atmosphere. A deorbit engine module will redirect the vehicle from orbit into the atmosphere where a series of parachutes and a parafoil will deploy in sequence to bring the vehicle to a landing, possibly in a field next to a hospital. Flight research at NASA Dryden for the X-38 began with an unpiloted captive carry flight in which the vehicle remained attached to its future launch vehicle, the Dryden B-52 008. There were four captive flights in 1997 and three in 1998, plus the first drop test on March 12, 1998, using the parachutes and parafoil. Further captive and drop tests occurred in 1999. Although the X-38 landed safely on the lakebed at Edwards after the March 1998 drop test, there had been some problems

  18. X-38 research aircraft landing - computer animation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In the mid-1990's researchers at the NASA Dryden Flight Research Center, Edwards, California, and Johnson Space Center in Houston, Texas, began working actively with the sub-scale X-38 prototype crew return vehicle (CRV). This was an unpiloted lifting body designed at 80 percent of the size of a projected emergency crew return vehicle for the International Space Station. The X-38 and the actual CRV are patterned after a lifting-body shape first employed in the Air Force X-23 (SV-5) program in the mid-1960's and the Air Force-NASA X-24A lifting-body project in the early to mid-1970's. Built by Scaled Composites, Inc., in Mojave, California, and outfitted with avionics, computer systems, and other hardware at Johnson Space Center, two X-38 aircraft were involved in flight research at Dryden beginning in July of 1997. Before that, however, Dryden conducted some 13 flights at a drop zone near California City, California. These tests were done with a 1/6-scale model of the X-38 aircraft to test the parafoil concept that would be employed on the X-38 aircraft and the actual CRV. The basic concept is that the actual CRV will use an inertial navigation system together with the Global Positioning System of satellites to guide it from the International Space Station into the earth's atmosphere. A deorbit engine module will redirect the vehicle from orbit into the atmosphere where a series of parachutes and a parafoil will deploy in sequence to bring the vehicle to a landing, possibly in a field next to a hospital. Flight research at NASA Dryden for the X-38 began with an unpiloted captive carry flight in which the vehicle remained attached to its future launch vehicle, the Dryden B-52 008. There were four captive flights in 1997 and three in 1998, plus the first drop test on March 12, 1998, using the parachutes and parafoil. Further captive and drop tests occurred in 1999. Although the X-38 landed safely on the lakebed at Edwards after the March 1998 drop test, there had

  19. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services on demand. It concentrates all computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious operational details and lets them concentrate on their business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services, and applications as a public utility, so that people can use computing resources just as they use water, electricity, gas, and the telephone. The understanding of cloud computing is still developing and changing, and cloud computing has no universally accepted definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and discusses key technologies such as data storage, data management, virtualization, and the programming model.
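
    To make the "programming model" item concrete, a toy word count expressed as map and reduce steps in plain Python (no real cloud framework or API is assumed) looks like this:

        # Toy MapReduce-style computation: the programming-model idea behind many
        # cloud platforms, here run locally with no framework assumed.
        from collections import defaultdict
        from itertools import chain

        documents = ["cloud computing provides storage", "cloud computing provides computation"]

        def map_phase(doc):
            return [(word, 1) for word in doc.split()]

        def reduce_phase(pairs):
            counts = defaultdict(int)
            for word, n in pairs:
                counts[word] += n
            return dict(counts)

        print(reduce_phase(chain.from_iterable(map_phase(d) for d in documents)))

    In a real cloud deployment the map and reduce phases would be distributed across the resource pool, which is exactly what lets applications obtain computing power on demand.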

  20. Space Station Biological Research Project: Reference Experiment Book

    NASA Technical Reports Server (NTRS)

    Johnson, Catherine (Editor); Wade, Charles (Editor)

    1996-01-01

    The Space Station Biological Research Project (SSBRP), which is the combined efforts of the Centrifuge Facility (CF) and the Gravitational Biology Facility (GBF), is responsible for the development of life sciences hardware to be used on the International Space Station to support cell, developmental, and plant biology research. The SSBRP Reference Experiment Book was developed to use as a tool for guiding this development effort. The reference experiments characterize the research interests of the international scientific community and serve to identify the hardware capabilities and support equipment needed to support such research. The reference experiments also serve as a tool for understanding the operational aspects of conducting research on board the Space Station. This material was generated by the science community by way of their responses to reference experiment solicitation packages sent to them by SSBRP scientists. The solicitation process was executed in two phases. The first phase was completed in February of 1992 and the second phase completed in November of 1995. Representing these phases, the document is subdivided into a Section 1 and a Section 2. The reference experiments contained in this document are only representative microgravity experiments. They are not intended to define actual flight experiments. Ground and flight experiments will be selected through the formal NASA Research Announcement (NRA) and Announcement of Opportunity (AO) experiment solicitation, review, and selection process.

  1. The Increasing Urgency for Standards in Basic Biological Research

    PubMed Central

    Freedman, Leonard P.; Inglese, James

    2016-01-01

    Research advances build upon the validity and reproducibility of previously published data and findings. Yet irreproducibility in basic biological and preclinical research is pervasive in both academic and commercial settings. Lack of reproducibility has led to invalidated research breakthroughs, retracted papers, and aborted clinical trials. Concerns and requirements for transparent, reproducible, and translatable research are accelerated by the rapid growth of “post-publication peer review,” open access publishing, and data sharing that facilitate the identification of irreproducible data/studies; they are magnified by the explosion of high-throughput technologies, genomics, and other data-intensive disciplines. Collectively, these changes and challenges are decreasing the effectiveness of traditional research quality mechanisms and are contributing to unacceptable—and unsustainable—levels of irreproducibility. The global oncology and basic biological research communities can no longer tolerate or afford widespread irreproducible research. This article discusses (1) how irreproducibility in preclinical research can ultimately be traced to an absence of a unifying life science standards framework, and (2) makes an urgent case for the expanded development and use of consensus-based standards to both enhance reproducibility and drive innovations in cancer research. PMID:25035389

  2. Connecting biology and organic chemistry introductory laboratory courses through a collaborative research project.

    PubMed

    Boltax, Ariana L; Armanious, Stephanie; Kosinski-Collins, Melissa S; Pontrello, Jason K

    2015-01-01

    Modern research often requires collaboration of experts in fields such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an interdisciplinary, medically relevant project intended to help students see connections between chemistry and biology. Second-term organic chemistry laboratory students designed and synthesized potential polymer inhibitors or inducers of polyglutamine protein aggregation. The use of novel target compounds added the uncertainty of scientific research to the project. Biology laboratory students then tested the novel potential pharmaceuticals in Huntington's disease model assays, using in vitro polyglutamine peptide aggregation and in vivo lethality studies in Drosophila. Students read articles from the primary literature describing the system from both chemical and biological perspectives. Assessment revealed that students emerged from both courses with a deeper understanding of the interdisciplinary nature of biology and chemistry and a heightened interest in basic research. The design of this collaborative project for introductory biology and organic chemistry labs demonstrated how the local interests and expertise at a university can be drawn from to create an effective way to integrate these introductory courses. Rather than simply presenting a series of experiments to be replicated, we hope that our efforts will inspire other scientists to think about how some aspect of authentic work can be brought into their own courses, and we also welcome additional collaborations to extend the scope of the scientific exploration. PMID:26148149

  3. DNASU plasmid and PSI:Biology-Materials repositories: resources to accelerate biological research

    PubMed Central

    Seiler, Catherine Y.; Park, Jin G.; Sharma, Amit; Hunter, Preston; Surapaneni, Padmini; Sedillo, Casey; Field, James; Algar, Rhys; Price, Andrea; Steel, Jason; Throop, Andrea; Fiacco, Michael; LaBaer, Joshua

    2014-01-01

    The mission of the DNASU Plasmid Repository is to accelerate research by providing high-quality, annotated plasmid samples and online plasmid resources to the research community through the curated DNASU database, website and repository (http://dnasu.asu.edu or http://dnasu.org). The collection includes plasmids from grant-funded, high-throughput cloning projects performed in our laboratory, plasmids from external researchers, and large collections from consortia such as the ORFeome Collaboration and the NIGMS-funded Protein Structure Initiative: Biology (PSI:Biology). Through DNASU, researchers can search for and access detailed information about each plasmid such as the full length gene insert sequence, vector information, associated publications, and links to external resources that provide additional protein annotations and experimental protocols. Plasmids can be requested directly through the DNASU website. DNASU and the PSI:Biology-Materials Repositories were previously described in the 2010 NAR Database Issue (Cormier, C.Y., Mohr, S.E., Zuo, D., Hu, Y., Rolfs, A., Kramer, J., Taycher, E., Kelley, F., Fiacco, M., Turnbull, G. et al. (2010) Protein Structure Initiative Material Repository: an open shared public resource of structural genomics plasmids for the biological community. Nucleic Acids Res., 38, D743–D749.). In this update we will describe the plasmid collection and highlight the new features in the website redesign, including new browse/search options, plasmid annotations and a dynamic vector mapping feature that was developed in collaboration with LabGenius. Overall, these plasmid resources continue to enable research with the goal of elucidating the role of proteins in both normal biological processes and disease. PMID:24225319

  4. Factors associated with computer and Internet technology implementation in biology, chemistry, and physics education in Turkish secondary schools

    NASA Astrophysics Data System (ADS)

    Ozer, Melike

    The main purposes of the research were to identify computer and Internet use by biology, chemistry and physics teachers in Turkish secondary schools and identify factors associated with computer and Internet technology. To this end, survey documents were sent by the Provincial Directorate of National Education to 250 selected schools' administrators for further distribution. Administrators were asked to complete the "Computer and Internet Use: School Survey," and to distribute the "Science Teacher Computer and Internet Use" surveys to the two teachers who teach science classes. Surveys were then returned to the General Directorate of Educational Technologies. Research findings showed that computer and Internet use has not been implemented effectively. Computers were first introduced to Turkish schools in 1984; unfortunately, the current level of computer and Internet use in science education has not reached the level projected at that time. Considering that science teachers' participation in technology-related professional development programs is higher than that of other subject teachers, the use of computer and Internet technologies in Turkish secondary schools is still at its early stages. Lack of computer knowledge and not knowing how to integrate computers into education were the major factors reported. With regard to computer and Internet use, a regression model for Turkish schools, which includes access and knowledge, explains a large part of the variance in study results. There was a significant relationship between computer attitude (computer liking, usefulness, and confidence) and computer and Internet use. Although there was a significant negative relationship between computer and Internet use and the attitudinal component of computer anxiety, this did not deter individuals from expressing a desire to engage in computer use in education.
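
    As an illustration of the kind of regression model described (the numbers below are invented for demonstration and are not the study's data), fitting "use" on "access" and "knowledge" and reporting the variance explained can be sketched as:

        # Illustrative only: linear model of computer/Internet use on access and knowledge,
        # with fabricated example data (not the survey data from this study).
        import numpy as np

        access    = np.array([1, 2, 2, 3, 4, 4, 5, 5], dtype=float)  # availability of computers/Internet
        knowledge = np.array([1, 1, 2, 3, 3, 4, 4, 5], dtype=float)  # self-reported computer knowledge
        use       = np.array([1, 2, 2, 3, 4, 4, 5, 5], dtype=float)  # reported classroom use

        X = np.column_stack([np.ones_like(access), access, knowledge])
        coef, *_ = np.linalg.lstsq(X, use, rcond=None)
        pred = X @ coef
        r_squared = 1 - np.sum((use - pred) ** 2) / np.sum((use - use.mean()) ** 2)
        print("coefficients (intercept, access, knowledge):", np.round(coef, 2))
        print("R^2:", round(float(r_squared), 2))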

  5. PathCase-SB: integrating data sources and providing tools for systems biology research

    PubMed Central

    2012-01-01

    Background Integration of metabolic pathway resources and metabolic network models, and deploying new tools on the integrated platform can help perform more effective and more efficient systems biology research on understanding the regulation of metabolic networks. Therefore, the tasks of (a) integrating under a single database environment regulatory metabolic networks and existing models, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Results PathCase Systems Biology (PathCase-SB) is built and released. This paper describes PathCase-SB user interfaces developed to date. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate systems biology model data and metabolic network data from selected biological data sources on the web (currently, BioModels Database and KEGG, respectively), and to provide more powerful and/or new capabilities via the new web-based integrative framework. Conclusions Each of the current four PathCase-SB interfaces, namely, Browser, Visualization, Querying, and Simulation interfaces, has expanded and new capabilities as compared with the original data sources. PathCase-SB is already available on the web and being used by researchers across the globe. PMID:22697505

  6. LifeSat - A satellite for space biological research

    NASA Technical Reports Server (NTRS)

    Halstead, Thora W.; Morey-Holton, Emily R.

    1990-01-01

    The LifeSat Program addresses the need for continuing access by biological scientists to space experimentation by accommodating a wide range of experiments involving animals and plants for durations up to 60 days in an unmanned satellite. The program will encourage interdisciplinary and international cooperation at both the agency and scientist levels, and will provide a recoverable, reusable facility for low-cost missions addressing key scientific issues that can only be answered by space experimentation. It will provide opportunities for research in gravitational biology and on the effects of cosmic radiation on life systems. The scientific aspects of LifeSat are addressed here.

  7. Stem Cells: A Renaissance in Human Biology Research.

    PubMed

    Wu, Jun; Izpisua Belmonte, Juan Carlos

    2016-06-16

    The understanding of human biology and how it relates to that of other species represents an ancient quest. Limited access to human material, particularly during early development, has restricted researchers to only scratching the surface of this inherently challenging subject. Recent technological innovations, such as single cell "omics" and human stem cell derivation, have now greatly accelerated our ability to gain insights into uniquely human biology. The opportunities afforded to delve molecularly into scarce material and to model human embryogenesis and pathophysiological processes are leading to new insights of human development and are changing our understanding of disease and choice of therapy options. PMID:27315475

  8. Computers in aeronautics and space research at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.

  9. Chemical Master Equation Closure for Computer-Aided Synthetic Biology

    PubMed Central

    Smadbeck, Patrick; Kaznessis, Yiannis N.

    2016-01-01

    With inexpensive DNA synthesis technologies, we can now construct biological systems by quickly piecing together DNA sequences. Synthetic biology is the promising discipline that focuses on the construction of these new biological systems. Synthetic biology is an engineering discipline, and as such, it can benefit from mathematical modeling. This chapter focuses on mathematical models of biological systems. These models take the form of chemical reaction networks. The importance of stochasticity is discussed and methods to simulate stochastic reaction networks are reviewed. A closure scheme solution is also presented for the master equation of chemical reaction networks. The master equation is a complete model of randomly evolving molecular populations. Because of its ambitious character, the master equation remained unsolved for all but the simplest of molecular interaction networks for over seventy years. With the first complete solution of chemical master equations, a wide range of experimental observations of biomolecular interactions may be mathematically conceptualized. We anticipate that models based on the closure scheme described herein may assist in rationally designing synthetic biological systems. PMID:25487098
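
    For context, the chemical master equation referred to above has the standard textbook form (generic notation, independent of the particular closure scheme presented in the chapter): P(x, t) is the probability of the molecular state x, a_j the propensity of reaction j, and nu_j its stoichiometric change vector.

        \frac{\partial P(\mathbf{x}, t)}{\partial t}
          = \sum_{j=1}^{M} \Big[ a_j(\mathbf{x} - \boldsymbol{\nu}_j)\, P(\mathbf{x} - \boldsymbol{\nu}_j, t)
            - a_j(\mathbf{x})\, P(\mathbf{x}, t) \Big]

    Closure approaches derive equations for the low-order moments of P and close the hierarchy with an assumption about higher-order moments; this is the general strategy to which the scheme described in the chapter belongs.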

  10. Division of Computer Research Summary of Awards. Fiscal Year 1984.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Directorate for Mathematical and Physical Sciences.

    Provided in this report are summaries of grants awarded by the National Science Foundation Division of Computer Research in fiscal year 1984. Similar areas of research are grouped (for the purposes of this report only) into these major categories: (1) computational mathematics; (2) computer systems design; (3) intelligent systems; (4) software…

  11. BIO2010 and beyond: What undergraduate physics does the next generation of molecular biology researchers need?

    NASA Astrophysics Data System (ADS)

    Howard, Jonathon

    2004-03-01

    What fundamental skills in mathematics, chemistry, physics, computer science and engineering are required at the undergraduate level to prepare the next generation of biology majors who will become research scientists? To address this question, Bruce Alberts, President of the National Academy of Sciences, established BIO2010, a committee of the National Research Council (USA), chaired by Lubert Stryer. The report of the committee was published in 2003 as BIO2010: Transforming Undergraduate Education for Future Research Biologists (National Academies Press, Washington DC, www.national-academies.com). I will summarize the recommendations of the Physics and Engineering Panel that was chaired by John Hopfield and give my own views of what physics is essential for researchers in cell and molecular biology.

  12. The 2014 Gordon Research Conference: Physics Research & Education: The Complex Intersection of Biology and Physics

    NASA Astrophysics Data System (ADS)

    Sabella, Mel; Lang, Matthew

    2013-03-01

    The field of biological physics and the physics education of biology and medically oriented students have experienced tremendous growth in recent years. New findings, applications, and technologies in biological and medical physics are having far-reaching consequences that affect and influence the science community, the education of future scientists and health-care workers, and the general population. As a result, leaders in Physics Education Research have begun to focus their attention on the specific needs of students in the biological sciences, the different ways physicists and biologists view the nature of science, and the interactions of scientists in these disciplines. In this poster we highlight some of these findings and pose questions for discussion. The Complex Intersection of Biology and Physics will be the topic of the next Gordon Research Conference on Physics Research and Education to be held in June 2014. The exact date and location are still to be determined.

  13. The opportunities for space biology research on the Space Station

    NASA Technical Reports Server (NTRS)

    Ballard, Rodney W.; Souza, Kenneth A.

    1987-01-01

    The life sciences research facilities for the Space Station are being designed to accommodate both animal and plant specimens for long-duration studies. This will enable research on how living systems adapt to microgravity, how gravity has shaped and affected life on earth, and further the understanding of basic biological phenomena. This would include multigeneration experiments on the effects of microgravity on the reproduction, development, growth, physiology, behavior, and aging of organisms. To achieve these research goals, a modular habitat system and on-board variable-gravity centrifuges, capable of holding various animals, plants, cells, and tissues, are proposed for the science laboratory.

  14. Current dichotomy between traditional molecular biological and omic research in cancer biology and pharmacology.

    PubMed

    Reinhold, William C

    2015-12-10

    There is currently a split within the cancer research community between traditional molecular biological hypothesis-driven and the more recent "omic" forms of research. While the molecular biological approach employs the tried and true single alteration-single response formulations of experimentation, the omic employs broad-based assay or sample collection approaches that generate large volumes of data. How to integrate the benefits of these two approaches in an efficient and productive fashion remains an outstanding issue. Ideally, one would merge the understandability, exactness, simplicity, and testability of the molecular biological approach, with the larger amounts of data, simultaneous consideration of multiple alterations, consideration of genes both of known interest along with the novel, cross-sample comparisons among cell lines and patient samples, and consideration of directed questions while simultaneously gaining exposure to the novel provided by the omic approach. While at the current time integration of the two disciplines remains problematic, attempts to do so are ongoing, and will be necessary for the understanding of the large cell line screens including the Developmental Therapeutics Program's NCI-60, the Broad Institute's Cancer Cell Line Encyclopedia, and the Wellcome Trust Sanger Institute's Cancer Genome Project, as well as The Cancer Genome Atlas clinical samples project. Going forward, there is significant benefit to be had from the integration of the molecular biological and the omic forms of research, with the desired goal being improved translational understanding and application. PMID:26677427

  15. pClone: Synthetic Biology Tool Makes Promoter Research Accessible to Beginning Biology Students

    PubMed Central

    Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason

    2014-01-01

    The Vision and Change report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area of science. We developed a laboratory module called pClone that empowers students to use advances in molecular cloning methods to discover new promoters for use by synthetic biologists. Our educational goals are consistent with Vision and Change and emphasize core concepts and competencies. pClone is a family of three plasmids that students use to clone a new transcriptional promoter or mutate a canonical promoter and measure promoter activity in Escherichia coli. We also developed the Registry of Functional Promoters, an open-access database of student promoter research results. Using pre- and posttests, we measured significant learning gains among students using pClone in introductory biology and genetics classes. Student posttest scores were significantly better than scores of students who did not use pClone. pClone is an easy and affordable mechanism for large-enrollment labs to meet the high standards of Vision and Change. PMID:26086659

  16. Algorithms in nature: the convergence of systems biology and computational thinking

    PubMed Central

    Navlakha, Saket; Bar-Joseph, Ziv

    2011-01-01

    Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future. PMID:22068329

  17. Cell stretching devices as research tools: engineering and biological considerations.

    PubMed

    Kamble, Harshad; Barton, Matthew J; Jun, Myeongjun; Park, Sungsu; Nguyen, Nam-Trung

    2016-08-16

    Cells within the human body are subjected to continuous, cyclic mechanical strain caused by various organ functions, movement, and growth. Cells are well known to have the ability to sense and respond to mechanical stimuli. This process is referred to as mechanotransduction. A better understanding of mechanotransduction is of great interest to clinicians and scientists alike to improve clinical diagnosis and understanding of medical pathology. However, the complexity involved in in vivo biological systems creates a need for better in vitro technologies, which can closely mimic the cells' microenvironment using induced mechanical strain. This technology gap motivates the development of cell stretching devices for better understanding of the cell response to mechanical stimuli. This review focuses on the engineering and biological considerations for the development of such cell stretching devices. The paper discusses different types of stretching concepts, major design consideration and biological aspects of cell stretching and provides a perspective for future development in this research area. PMID:27440436

  18. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  19. Research on Computational Fluid Dynamics and Turbulence

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Preconditioning matrices for Chebyshev derivative operators in several space dimensions; the Jacobi matrix technique in computational fluid dynamics; and Chebyshev techniques for periodic problems are discussed.

  20. Invited Review Article: Advanced light microscopy for biological space research

    NASA Astrophysics Data System (ADS)

    De Vos, Winnok H.; Beghuin, Didier; Schwarz, Christian J.; Jones, David B.; van Loon, Jack J. W. A.; Bereiter-Hahn, Juergen; Stelzer, Ernst H. K.

    2014-10-01

    As commercial space flights have become feasible and long-term extraterrestrial missions are planned, it is imperative that the impact of space travel and the space environment on human physiology be thoroughly characterized. Scrutinizing the effects of potentially detrimental factors such as ionizing radiation and microgravity at the cellular and tissue level demands adequate visualization technology. Advanced light microscopy (ALM) is the leading tool for non-destructive structural and functional investigation of static as well as dynamic biological systems. In recent years, technological developments and advances in photochemistry and genetic engineering have boosted all aspects of resolution, readout and throughput, rendering ALM ideally suited for biological space research. While various microscopy-based studies have addressed cellular response to space-related environmental stressors, biological endpoints have typically been determined only after the mission, leaving an experimental gap that is prone to bias results. An on-board, real-time microscopical monitoring device can bridge this gap. Breadboards and even fully operational microscope setups have been conceived, but they need to be rendered more compact and versatile. Most importantly, they must allow addressing the impact of gravity, or the lack thereof, on physiologically relevant biological systems in space and in ground-based simulations. In order to delineate the essential functionalities for such a system, we have reviewed the pending questions in space science, the relevant biological model systems, and the state-of-the art in ALM. Based on a rigorous trade-off, in which we recognize the relevance of multi-cellular systems and the cellular microenvironment, we propose a compact, but flexible concept for space-related cell biological research that is based on light sheet microscopy.

  3. Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  4. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  5. The Colorado Plateau: cultural, biological, and physical research

    USGS Publications Warehouse

    Cole, Kenneth L.

    2004-01-01

    Stretching from the four corners of Arizona, New Mexico, Colorado, and Utah, the Colorado Plateau is a natural laboratory for a wide range of studies. This volume presents 23 original articles drawn from more than 100 research projects presented at the Sixth Biennial Conference of Research on the Colorado Plateau. This scientific gathering revolved around research, inventory, and monitoring of lands in the region. The book's contents cover management techniques for cultural, biological, and physical resources, representing collaborative efforts among federal, university, and private sector scientists and land managers. Chapters on cultural concerns cover benchmarks of modern southwestern anthropological knowledge, models of past human activity and impact of modern visitation at newly established national monuments, challenges in implementing the 1964 Wilderness Act, and opportunities for increased federal research on Native American lands. The section on biological resources comprises sixteen chapters, with coverage that ranges from mammalian biogeography to responses of elk at the urban-wildland interface. Additional biological studies include the effects of fire and grazing on vegetation; research on bald eagles at Grand Canyon and tracking wild turkeys using radio collars; and management of paleontological resources. Two final chapters on physical resources consider a proposed rerouting of the Rio de Flag River in urban Flagstaff, Arizona, and an examination of past climate patterns over the Plateau, using stream flow records and tree ring data. In light of similarities in habitat and climate across the Colorado Plateau, techniques useful to particular management units have been found to be applicable in many locations. This volume highlights an abundance of research that will prove useful for all of those working in the region, as well as for others seeking comparative studies that integrate research into land management actions.

  6. Computing in the Home: A Research Paradigm.

    ERIC Educational Resources Information Center

    Dutton, William; And Others

    1985-01-01

    Suggests typology as initial framework for study of patterns of computer use in the home, along with four sets of independent factors--social status, technical features, sociocultural setting, and personal attributes. This approach integrates patterns of computer utilization with technological, social, and psychological factors to account for…

  7. Applications of computer modeling to fusion research

    SciTech Connect

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling.

  8. Current Trends and New Challenges of Databases and Web Applications for Systems Driven Biological Research

    PubMed Central

    Sreenivasaiah, Pradeep Kumar; Kim, Do Han

    2010-01-01

    Dynamic and rapidly evolving nature of systems driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure including database and Web application. Several solutions have been proposed to meet the expectations and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand different technologies and approaches. Having familiarized with the pros and cons of the existing technologies, researchers can exploit its capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases and Web applications. We will mention their roles in integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems driven biomedical research. PMID:21423387

  9. Computational mechanics and physics at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr.

    1987-01-01

    An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.

  10. Computational approaches to stochastic systems in physics and biology

    NASA Astrophysics Data System (ADS)

    Jeraldo Maldonado, Patricio Rodrigo

    calculation of the corresponding scaling laws. In Part II, I investigate the evolutionary dynamics of communities of microbes living in the gastrointestinal tracts of vertebrates, and ask to what degree their evolution is niche-driven, where organisms fitter to their environment become dominant, or if it is neutral, where the organisms evolve stochastically and are otherwise functionally equivalent within their communities. To that end, a series of computational tools were developed to pre-process, curate and reduce the data sets. In Chapter 4, I analyze the raw data for this research, namely short reads of 16S ribosomal RNA, and quantify how much of phylogenetic information is lost by using these short reads instead of full-length reads, and show that for lengths spanning 300 to 400 base pairs, we can recover some meaningful phylogenetic information. In Chapter 5, I introduce a pipeline for pre-processing, alignment and curation of libraries of short reads of rRNA. We show that this pipeline significantly reduces the artifacts usually associated with these sequences, resulting in better clustering of the sequences, and better phylogenetic trees representing their organismal relationships. In Chapter 6 I use the data processed with the above tools to analyze communities of microbes living in gastrointestinal tracts of vertebrates, and we ask to what extent the evolutionary dynamics of these communities is dominated by niche-based evolution, or if the communities behave neutrally, where evolution is random and all organisms are functionally equivalent. We conclude that there is evidence for strong niche-based dynamics, though we cannot fully discard some degree of neutral evolution. Finally, in Chapter 7 I propose a method to quantify the balance present in phylogenetic trees to compare a large-scale molecular phylogeny to full organismal taxonomies. I show that there is considerable structure in all of them, but direct comparison of both types of trees is difficult at the

  11. Facilities for Biological Research Aboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Souza, Kenneth A.; Yost, Bruce D.; Berry, William E.; Johnson, Catherine C.

    1996-01-01

    A centrifuge designed as part of an integrated biological facility for installation onboard the International Space Station is presented. The requirements for the 2.5 m diameter centrifuge, which is designed for the support of biological experiments are discussed. The scientific objectives of the facility are to: provide a means of conducting fundamental studies in which gravitational acceleration is a controllable variable; provide a 1g control; determine the threshold acceleration for physiological response, and determine the value of centrifugation as a potential countermeasure for the biomedical problems associated with space flight. The implementation of the facility is reported on, and the following aspects of the facility are described: the host resources systems supply requirements such as power and data control; the habitat holding rack; the life sciences glove box; the centrifuge; the different habitats for cell culture, aquatic studies, plant research and insect research; the egg incubator, and the laboratory support equipment.

  12. Research Applications of Proteolytic Enzymes in Molecular Biology

    PubMed Central

    Mótyán, János András; Tóth, Ferenc; Tőzsér, József

    2013-01-01

    Proteolytic enzymes (also termed peptidases, proteases and proteinases) are capable of hydrolyzing peptide bonds in proteins. They can be found in all living organisms, from viruses to animals and humans. Proteolytic enzymes have great medical and pharmaceutical importance due to their key role in biological processes and in the life-cycle of many pathogens. Proteases are extensively applied enzymes in several sectors of industry and biotechnology, furthermore, numerous research applications require their use, including production of Klenow fragments, peptide synthesis, digestion of unwanted proteins during nucleic acid purification, cell culturing and tissue dissociation, preparation of recombinant antibody fragments for research, diagnostics and therapy, exploration of the structure-function relationships by structural studies, removal of affinity tags from fusion proteins in recombinant protein techniques, peptide sequencing and proteolytic digestion of proteins in proteomics. The aim of this paper is to review the molecular biological aspects of proteolytic enzymes and summarize their applications in the life sciences. PMID:24970197

  13. An overview of computer viruses in a research environment

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1991-01-01

    The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as a form of malicious logic in a research and development environment. A relation is drawn between the viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems by viruses in particular and malicious logic in general are examined. Finally, a brief examination of the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit is undertaken.

  14. Quality of histone modification antibodies undermines chromatin biology research

    PubMed Central

    Kungulovski, Goran; Jeltsch, Albert

    2015-01-01

    Histone post-translational modification (PTM) antibodies are essential research reagents in chromatin biology. However, they suffer from variable properties and insufficient documentation of quality. Antibody manufacturers and vendors should provide detailed lot-specific documentation of quality, rendering further quality checks by end-customers unnecessary. A shift from polyclonal antibodies towards sustainable reagents like monoclonal or recombinant antibodies or histone binding domains would help to improve the reproducibility of experimental work in this field. PMID:26834995

  15. BRIC-60: Biological Research in Canisters (BRIC)-60

    NASA Technical Reports Server (NTRS)

    Richards, Stephanie E. (Compiler); Levine, Howard G.; Romero, Vergel

    2016-01-01

    The Biological Research in Canisters (BRIC) is an anodized-aluminum cylinder used to provide passive stowage for investigations evaluating the effects of space flight on small organisms. Specimens flown in the BRIC 60 mm petri dish (BRIC-60) hardware include Lycopersicon esculentum (tomato), Arabidopsis thaliana (thale cress), Glycine max (soybean) seedlings, Physarum polycephalum (slime mold) cells, Porthetria dispar (gypsy moth) eggs, and Ceratodon purpureus (moss).

  16. openBIS: a flexible framework for managing and analyzing complex data in biology research

    PubMed Central

    2011-01-01

    Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, publish data and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573

  17. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all the fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed by such elements. This strategy relies on the modularity of the used components or the prediction of their context-dependent behaviour, when parts functioning depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such bottom-up design process cannot be trivially adopted for biological systems engineering, since parts function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements and mathematical models supporting the prediction of parts behaviour are illustrated. PMID:25161694
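
    As a minimal sketch of the bottom-up composition idea summarized above (and not any specific model from the review), the following Python snippet chains two hypothetical Hill-type repression transfer functions to predict the output of a simple cascade; every parameter value is an arbitrary placeholder rather than a measured part characterization.

      # Illustrative only: predicting a cascade's output by composing characterized
      # part transfer functions. All parameter values are invented, not measured part data.
      def repressor_unit(x, k_max, K, n, basal):
          """Steady-state promoter output for a repressing input x (Hill-type curve)."""
          return basal + k_max / (1.0 + (x / K) ** n)

      def cascade(inducer):
          """Inducer -> repressor A -> repressor B (reporter): a two-stage composition."""
          a = repressor_unit(inducer, k_max=100.0, K=10.0, n=2.0, basal=1.0)
          return repressor_unit(a, k_max=80.0, K=20.0, n=2.5, basal=0.5)

      for level in (0.0, 1.0, 10.0, 100.0):
          print(f"inducer={level:6.1f} -> predicted reporter={cascade(level):6.2f}")

    In a genuine bottom-up workflow each transfer function would come from the quantitative characterization of the corresponding part, and the mismatch between such predictions and measurements is exactly the context-dependence issue the review discusses.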

  18. The Human Genome Project: Biology, Computers, and Privacy.

    ERIC Educational Resources Information Center

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  19. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  20. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals

  1. Experimental Data from the Proteomics Research Center for Integrative Biology

    DOE Data Explorer

    Smith, Richard D.

    The possible roles and importance of proteomics are rapidly growing across essentially all areas of biological research. The precise and comprehensive measurement of levels of expressed proteins and their modified forms can provide new insights into the molecular nature of cell-signaling pathways and networks, the cell cycle, cellular differentiation, and other processes relevant to understanding human health and the progression of various disease states. The ability to characterize protein complexes complements this capability, allowing hypotheses to be tested and the biological system operation to be defined. The Proteomics Research Center for Integrative Biology is a national user facility established and funded by the National Institute of General Medical Sciences component of the National Institutes of Health. This Center has been established to serve the biomedical research community by developing and integrating new proteomic technologies for collaborative and service studies, disseminating the new technologies, and training scientists in their use. The Center is housed in DOE’s William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at the Pacific Northwest National Laboratory.

  2. Biological underpinnings of psychogenic nonepileptic seizures: directions for future research.

    PubMed

    Asadi-Pooya, Ali A

    2016-07-01

    Psychogenic nonepileptic seizures (PNES) are relatively common occurrences in epilepsy centers, but their pathophysiology is still poorly understood. Research that elucidates the pathophysiology of PNES, including their neurobiological basis and biomarkers, may have important clinical implications. The literature provides some evidence that genetic factors, intrinsic factors, and environmental factors probably play a significant role as the biological underpinnings of PNES. Researchers may be able to learn more about the pathophysiology of PNES by investigating the effects of each of these factors on functional and structural brain connectivity. PMID:26956567

  3. Biological and chemical technologies research. FY 1995 annual summary report

    SciTech Connect

    1996-03-01

    The annual summary report presents the fiscal year (FY) 1995 research activities and accomplishments for the United States Department of Energy (DOE) Biological and Chemical Technologies Research (BCTR) Program. This BCTR program resides within the Office of Industrial Technologies (OIT) of the Office of Energy Efficiency and Renewable Energy (EE). The annual summary report for 1995 (ASR 95) contains the following: program description (including BCTR program mission statement, historical background, relevance, goals and objectives); program structure and organization, selected technical and programmatic highlights for 1995; detailed descriptions of individual projects; a listing of program output, including a bibliography of published work; patents; and awards arising from work supported by the BCTR.

  4. Biological Visualization, Imaging and Simulation (Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices

  5. Division of Biological and Medical Research annual technical report, 1981

    SciTech Connect

    Rosenthal, M.W.

    1982-06-01

    This report summarizes research during 1981 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Low Level Radiation include comparison of lifetime effects in mice of low level neutron and gamma irradiation, delineation of the responses of dogs to continuous low level gamma irradiation, elucidation of mechanisms of radiation damage and repair in mammalian cells, and study of the genetic effects of high LET radiations. Carcinogenesis research addresses mechanisms of tumor initiation and promotion in rat liver, chemical carcinogenesis in cultured mammalian cells, and molecular and genetic mechanisms of chemical and ultraviolet mutagenesis in bacteria. Research in Toxicology uses a variety of cellular, whole animal, and chronobiological end points, chemical separations, and statistical models to evaluate the hazards and mechanisms of actions of metals, coal gasification by products, and other energy-related pollutants. Human Protein Index studies develop two-dimensional electrophoresis systems for diagnosis and detection of cancer and other disease. Biophysics research includes fundamental structural and biophysical investigations of immunoglobulins and key biological molecules using NMR, crystallographic, and x-ray and neutron small-angle scattering techniques. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies.

  6. Research toward a heterogeneous networked computing cluster

    SciTech Connect

    Duke, D.W.; Green, T.P.

    1998-08-11

    Over the last year the Systems Development Group, SDG, has been involved in a number of projects. The primary projects include extending the UNIX version of DQS, a DCE version of DQS, a Java based queuing system, a Computer Aided Learning and Instruction model and working with the Florida Department of Law Enforcement in the formation of the Florida Computer Crime Center. Additionally SDG has assisted a number of state and local agencies. A synopsis of these projects is contained in this report.

  7. COMPUTER-ASSISTED STUDIES OF MOLECULAR STRUCTURE-BIOLOGICAL ACTIVITY RELATIONSHIPS

    EPA Science Inventory

    Computer-assisted methods can be used to investigate the relationships between the molecular structures of compounds and their biological activity. A number of approaches have been reported in the literature, including correlations of activity with substituent constants, conforma...
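
    As an illustration of the kind of structure-activity correlation mentioned above, the sketch below fits a Hansch-style linear model of activity against substituent constants using NumPy least squares; the descriptor values and activities are invented for demonstration and do not come from the cited work.

      # Hypothetical Hansch-type correlation: activity ~ hydrophobic (pi) + electronic (sigma) constants.
      import numpy as np

      pi = np.array([0.00, 0.56, 0.71, 1.02, 1.55])      # hydrophobic substituent constants (made up)
      sigma = np.array([0.00, -0.17, 0.23, 0.37, 0.50])  # electronic (Hammett) constants (made up)
      activity = np.array([2.1, 2.6, 2.9, 3.3, 3.8])     # e.g. log(1/C), invented values

      X = np.column_stack([pi, sigma, np.ones_like(pi)])  # design matrix with intercept
      coef, *_ = np.linalg.lstsq(X, activity, rcond=None)

      pred = X @ coef
      r2 = 1.0 - np.sum((activity - pred) ** 2) / np.sum((activity - activity.mean()) ** 2)
      print("coefficients [pi, sigma, intercept]:", np.round(coef, 3))
      print("R^2 =", round(r2, 3))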

  8. Division of Biological and Medical Research research summary 1984-1985

    SciTech Connect

    Barr, S.H.

    1985-08-01

    The Division of Biological and Medical Research at Argonne National Laboratory conducts multidisciplinary research aimed at defining the biological and medical hazards to man from energy technologies and new energy options. These technically oriented studies have a strong base in fundamental research in a variety of scientific disciplines, including molecular and cellular biology, biophysics, genetics, radiobiology, pharmacology, biochemistry, chemistry, environmental toxicology, and epidemiology. This research summary is organized into six parts. The first five parts reflect the Divisional structure and contain the scientific program chapters, which summarize the activities of the individual groups during the calendar year 1984 and the first half of 1985. To provide better continuity and perspective, previous work is sometimes briefly described. Although the summaries are short, efforts have been made to indicate the range of research activities for each group.

  9. Workshop in computational molecular biology, April 15, 1991--April 14, 1994

    SciTech Connect

    Tavare, S.

    1995-04-12

    Funds from this award were used to support the Workshop in Computational Molecular Biology at the '91 Symposium entitled Interface: Computing Science and Statistics, Seattle, Washington, April 21, 1991; the Workshop in Statistical Issues in Molecular Biology held at Stanford, California, August 8, 1993; and the Session on Population Genetics, part of the 56th Annual Meeting of the Institute of Mathematical Statistics, San Francisco, California, August 9, 1993.

  10. Virtual Cell: computational tools for modeling in cell biology

    PubMed Central

    Resasco, Diana C.; Gao, Fei; Morgan, Frank; Novak, Igor L.; Schaff, James C.; Slepchenko, Boris M.

    2011-01-01

    The Virtual Cell (VCell) is a general computational framework for modeling physico-chemical and electrophysiological processes in living cells. Developed by the National Resource for Cell Analysis and Modeling at the University of Connecticut Health Center, it provides automated tools for simulating a wide range of cellular phenomena in space and time, both deterministically and stochastically. These computational tools allow one to couple electrophysiology and reaction kinetics with transport mechanisms, such as diffusion and directed transport, and map them onto spatial domains of various shapes, including irregular three-dimensional geometries derived from experimental images. In this article, we review new robust computational tools recently deployed in VCell for treating spatially resolved models. PMID:22139996
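
    The abstract above describes coupling reaction kinetics with transport mechanisms such as diffusion; the toy script below integrates a one-dimensional reaction-diffusion equation (first-order decay plus Fickian diffusion, explicit Euler) purely to illustrate that coupling. It is not VCell code and does not use the VCell API; all values are arbitrary.

      # Generic 1-D reaction-diffusion toy model: du/dt = D * d2u/dx2 - k * u
      import numpy as np

      D, k = 1.0e-2, 0.5                 # diffusion coefficient and decay rate (arbitrary units)
      nx, dx, dt, steps = 100, 0.1, 0.01, 2000

      u = np.zeros(nx)
      u[nx // 2] = 1.0                   # initial pulse of the species at the domain centre

      for _ in range(steps):
          lap = np.zeros(nx)
          lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2   # discrete Laplacian (fixed ends)
          u = u + dt * (D * lap - k * u)                         # explicit Euler update

      print("amount remaining:", round(float(u.sum() * dx), 4))
      print("peak concentration:", round(float(u.max()), 4))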

  11. Computation and graphics in mathematical research

    SciTech Connect

    Hoffman, D.A.; Spruck, J.

    1992-08-13

    This report discusses: The description of the GANG Project and results for prior research; the center for geometry, analysis, numerics and graphics; description of GANG Laboratory; software development at GANG; and mathematical and scientific research activities.

  12. Computational molecular biology approaches to ligand-target interactions

    PubMed Central

    Lupieri, Paola; Nguyen, Chuong Ha Hung; Bafghi, Zhaleh Ghaemi; Giorgetti, Alejandro; Carloni, Paolo

    2009-01-01

    Binding of small molecules to their targets triggers complex pathways. Computational approaches are key for predicting the molecular events involved in such cascades. Here we review current efforts at characterizing the molecular determinants in the largest membrane-bound receptor family, the G-protein-coupled receptors (GPCRs). We focus on odorant receptors, which constitute more than half of all GPCRs. The work presented in this review uncovers structural and energetic aspects of components of the cellular cascade. Finally, a computational approach in the context of radioactive boron-based antitumoral therapies is briefly described. PMID:20119480

  13. A Contribution of the Computer to Biology Education at the University.

    ERIC Educational Resources Information Center

    Anxolabehere, D.; And Others

    1980-01-01

    Described is part of the O.P.E. laboratory computer-based biology program designed for undergraduate medical and biology students. Described is an embryology dialogue in which the student proceeds through three stages in the knowledge and understanding of the concept of competence. (Author/DS)

  14. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    ERIC Educational Resources Information Center

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  15. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as transition of the system functionality due to modification in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of dynamic core model provides a mathematical formalism to analyze migration of specific functions in biosystems which undergo structure transition induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight on processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement estimated constraints, will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment. PMID:19623488

  16. DEVELOPMENT OF COMPUTATIONAL TOOLS FOR OPTIMAL IDENTIFICATION OF BIOLOGICAL NETWORKS

    EPA Science Inventory

    Following the theoretical analysis and computer simulations, the next step for the development of SNIP will be a proof-of-principle laboratory application. Specifically, we have obtained a synthetic transcriptional cascade (harbored in Escherichia coli...

  17. An Introduction to Computer Assisted Analysis in the Biological Sciences.

    ERIC Educational Resources Information Center

    Banaugh, R. P.

    This set of notes is designed to introduce the student to the development and use of computer-based models, and to analyze quantitative phenomena in the life sciences. Only BASIC programming language is used. The ten chapter titles are: The Growth of a Single Species; The Association of Two Species; Parameter Determination; Automated Parameter…
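
    As a small example of the first topic listed (the growth of a single species), here is a discrete logistic-growth model written in Python rather than the BASIC used in the original notes; the growth rate, carrying capacity, and starting population are arbitrary illustrative values.

      # Discrete logistic growth of a single species (illustrative parameter values only).
      r, K = 0.4, 1000.0        # intrinsic growth rate and carrying capacity
      n = 10.0                  # initial population size

      for year in range(1, 21):
          n += r * n * (1.0 - n / K)
          if year % 5 == 0:
              print(f"year {year:2d}: population ~ {n:7.1f}")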

  18. Division of Biological and Medical Research annual technical report 1982

    SciTech Connect

    Rosenthal, M.W.

    1983-05-01

    This report summarizes research during 1982 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Carcinogenesis address mechanisms of chemical and radiation carcinogenesis including the processes of tumor initiation and promotion. The studies employ rat liver and mouse skin models as well as human rodent cell culture systems. The use of liposomes for metal mobilization is also explored. Low Level Radiation studies include delineation of the hematopoietic and other responses of dogs to continuous low level gamma irradiation, comparison of lifetime effects in mice of low level neutron and gamma irradiation, and study of the genetic effects of high LET radiation. Molecular Biology research develops two-dimensional electrophoresis systems for diagnosis and detection of cancer and other diseases. Fundamental structural and biophysical investigations of immunoglobulins and other key proteins are included, as are studies of cell growth, and of molecular and cellular effects of solar uv light. Research in Toxicology uses cellular, physiological, whole animal, and chronobiological end points and chemical separations to elucidate mechanisms and evaluate hazards of coal conversion by-products, actinides, and toxic metals. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies.

  19. Scalable Computational Methods for the Analysis of High-Throughput Biological Data

    SciTech Connect

    Langston, Michael A

    2012-09-06

    The primary focus of this research project is elucidating genetic regulatory mechanisms that control an organism's responses to low-dose ionizing radiation. Although low doses (at most ten centigrays) are not lethal to humans, they elicit a highly complex physiological response, with the ultimate outcome in terms of risk to human health unknown. The tools of molecular biology and computational science will be harnessed to study coordinated changes in gene expression that orchestrate the mechanisms a cell uses to manage the radiation stimulus. High performance implementations of novel algorithms that exploit the principles of fixed-parameter tractability will be used to extract gene sets suggestive of co-regulation. Genomic mining will be performed to scrutinize, winnow and highlight the most promising gene sets for more detailed investigation. The overall goal is to increase our understanding of the health risks associated with exposures to low levels of radiation.
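
    A common simplified version of the gene-set extraction workflow sketched above (and not necessarily the exact algorithms developed in the project) is to threshold a gene-gene correlation matrix and enumerate cliques in the resulting graph. The Python sketch below assumes NumPy and NetworkX are available and uses random numbers in place of real expression measurements.

      # Simplified co-expression clique analysis; random data stands in for expression profiles.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      expr = rng.normal(size=(30, 12))     # 30 genes x 12 conditions (synthetic)
      corr = np.corrcoef(expr)             # gene-gene Pearson correlations

      threshold = 0.7
      G = nx.Graph()
      G.add_nodes_from(range(expr.shape[0]))
      for i in range(expr.shape[0]):
          for j in range(i + 1, expr.shape[0]):
              if abs(corr[i, j]) >= threshold:
                  G.add_edge(i, j)

      # Maximal cliques serve as candidate co-regulated gene sets.
      cliques = [c for c in nx.find_cliques(G) if len(c) >= 3]
      print(f"{G.number_of_edges()} edges above |r| >= {threshold}; "
            f"{len(cliques)} candidate gene sets of size >= 3")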

  20. Physical and Computational Modeling for Chemical and Biological Weapons Airflow Applications

    SciTech Connect

    McEligot, Donald Marinus; Mc Creery, Glenn Ernest; Pink, Robert John; Barringer, C.; Knight, K. J.

    2002-11-01

    There is a need for information on dispersion and infiltration of chemical and biological agents in complex building environments. A recent collaborative study conducted at the Idaho National Engineering and Environmental Laboratory (INEEL) and Bechtel Corporation Research and Development had the objective of assessing computational fluid dynamics (CFD) models for simulation of flow around complicated buildings through a comparison of experimental and numerical results. The test facility used in the experiments was INEEL’s unique large Matched-Index-of-Refraction (MIR) flow system. The CFD code used for modeling was Fluent, a widely available commercial flow simulation package. For the experiment, a building plan was selected to approximately represent an existing facility. It was found that predicted velocity profiles from above the building and in front of the building were in good agreement with the measurements.

  1. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  2. Electrical and chemical sensors for biological cell research

    NASA Astrophysics Data System (ADS)

    Edell, D. J.; McNeil, V. M.; Curley, M. G.; Wolfe, J. H.

    Electrical and chemical microsensors for biological cell research allow for the continuous study of biological systems under normal physiological conditions. Two sensor technologies which take most advantage of microfabrication technology are discussed. One is being developed for monitoring the environment of cancer cells during radiotherapy, chemotherapy, and hyperthermia treatment. Of current interest is the measurement of temperature and interstitial free oxygen concentration distributions in cancer tissues prior to and during various treatments. The second technology discussed is being developed for monitoring the extracellular ionic currents from electrogenic cells in culture. The ability to build integrated circuits over large areas of a silicon wafer which can impedance transform the signals and multiplex a large array of contacts is being used.

  3. DOE research in utilization of high-performance computers

    SciTech Connect

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure.

  4. Re-Centering the Research Computing Enterprise

    ERIC Educational Resources Information Center

    McRobbie, Michael A.

    2006-01-01

    The insatiable institutional demands for computing cycles, network bandwidth, and storage clearly demonstrate that IT is a mission-critical function in nearly all areas of higher education. Not too long ago, the important issue for the central data center was physical size and floor space. As IT leaders struggle to meet relentlessly increasing…

  5. The Schematic Structure of Computer Science Research Articles.

    ERIC Educational Resources Information Center

    Posteguillo, Santiago

    1999-01-01

    Presents a linguistic description of the schematic organization of 40 journal articles from three academic journals in computing research. Results indicate that the introduction-methods-results-discussion research reporting pattern cannot be applied to computer science articles, with the central part (methods-results) departing most from the…

  6. Gold nanoparticles in model biological membranes: A computational perspective.

    PubMed

    Rossi, Giulia; Monticelli, Luca

    2016-10-01

    The electronic, optical, catalytic, and magnetic properties of metal nanoparticles (NPs) make them extremely interesting for biomedical applications. In this rapidly moving field, monolayer-protected gold nanoparticles emerge both as a reference system and as promising candidates for drug and gene delivery, photothermal treatment, and imaging applications. Despite the technological relevance, there is still poor understanding of the molecular processes driving the interactions of metal nanoparticles with cells, and with cell membranes in particular. In this paper we review molecular-level computational studies of the interaction between monolayer-protected gold NPs and model lipid membranes. Our review comprises a brief description of the most relevant experimental results in this field and of the questions they raised, followed by a description of the computational achievements reported so far. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg. PMID:27060434

  7. Computing the structural influence matrix for biological systems.

    PubMed

    Giordano, Giulia; Cuba Samaniego, Christian; Franco, Elisa; Blanchini, Franco

    2016-06-01

    We consider the problem of identifying structural influences of external inputs on steady-state outputs in a biological network model. We speak of a structural influence if, upon a perturbation due to a constant input, the ensuing variation of the steady-state output value has the same sign as the input (positive influence), the opposite sign (negative influence), or is zero (perfect adaptation), for any feasible choice of the model parameters. All these signs and zeros can constitute a structural influence matrix, whose (i, j) entry indicates the sign of steady-state influence of the jth system variable on the ith variable (the output caused by an external persistent input applied to the jth variable). Each entry is structurally determinate if the sign does not depend on the choice of the parameters, but is indeterminate otherwise. In principle, determining the influence matrix requires exhaustive testing of the system steady-state behaviour in the widest range of parameter values. Here we show that, in a broad class of biological networks, the influence matrix can be evaluated with an algorithm that tests the system steady-state behaviour only at a finite number of points. This algorithm also allows us to assess the structural effect of any perturbation, such as variations of relevant parameters. Our method is applied to nontrivial models of biochemical reaction networks and population dynamics drawn from the literature, providing a parameter-free insight into the system dynamics. PMID:26395779
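
    To make the idea of a steady-state influence sign concrete, the brute-force sketch below repeatedly samples parameters for a hypothetical two-species network, applies a small constant input, and records the sign of the resulting steady-state change in the output variable. Such sampling can only suggest (never prove) a structural sign, and it is not the finite-test algorithm described in the paper; the network and parameter ranges are invented.

      # Brute-force illustration of steady-state influence signs for a toy 2-species network.
      import numpy as np

      def steady_state(params, u, t_end=200.0, dt=0.01):
          a, b, c, d = params                      # positive rate constants (sampled below)
          x = np.array([1.0, 1.0])
          for _ in range(int(t_end / dt)):
              dx0 = u + a - b * x[0]               # x0: constant production, degradation, input u
              dx1 = c * x[0] - d * x[1]            # x1: activated by x0, degraded
              x = x + dt * np.array([dx0, dx1])
          return x

      rng = np.random.default_rng(1)
      signs = set()
      for _ in range(50):
          p = rng.uniform(0.5, 2.0, size=4)
          base = steady_state(p, u=0.0)
          pert = steady_state(p, u=0.1)            # persistent positive input applied to x0
          signs.add(float(np.sign(pert[1] - base[1])))   # influence of the input on x1

      print("observed influence signs on x1:", signs)    # {1.0} is consistent with a positive influence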

  8. Machine learning in cell biology - teaching computers to recognize phenotypes.

    PubMed

    Sommer, Christoph; Gerlich, Daniel W

    2013-12-15

    Recent advances in microscope automation provide new opportunities for high-throughput cell biology, such as image-based screening. Highly complex image analysis tasks often make the implementation of static and predefined processing rules a cumbersome effort. Machine-learning methods, instead, seek to use intrinsic data structure, as well as the expert annotations of biologists, to infer models that can be used to solve versatile data analysis tasks. Here, we explain how machine-learning methods work and what needs to be considered for their successful application in cell biology. We outline how microscopy images can be converted into a data representation suitable for machine learning, and then introduce various state-of-the-art machine-learning algorithms, highlighting recent applications in image-based screening. Our Commentary aims to provide the biologist with a guide to the application of machine learning to microscopy assays, and we therefore include extensive discussion on how to optimize experimental workflow as well as the data analysis pipeline. PMID:24259662
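
    As a minimal sketch of the workflow the Commentary describes (per-cell features, expert annotations, and a trained classifier), the example below uses scikit-learn on synthetic features; a real pipeline would first segment cells in the microscopy images and extract such features from them. The feature names and values here are hypothetical.

      # Toy phenotype classification on synthetic per-cell features (area, intensity, eccentricity).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(42)
      n_cells = 400
      interphase = rng.normal([200.0, 0.5, 0.3], [20.0, 0.05, 0.05], size=(n_cells, 3))
      mitotic = rng.normal([120.0, 0.8, 0.7], [15.0, 0.05, 0.05], size=(n_cells, 3))
      X = np.vstack([interphase, mitotic])
      y = np.array([0] * n_cells + [1] * n_cells)   # expert labels: 0 = interphase, 1 = mitotic

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
      print("held-out accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 3))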

  9. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    PubMed Central

    Caudill, Lester; Hill, April; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics. PMID:20810953

  10. Signal-to-Noise Ratio Measures Efficacy of Biological Computing Devices and Circuits

    PubMed Central

    Beal, Jacob

    2015-01-01

    Engineering biological cells to perform computations has a broad range of important potential applications, including precision medical therapies, biosynthesis process control, and environmental sensing. Implementing predictable and effective computation, however, has been extremely difficult to date, due to a combination of poor composability of available parts and of insufficient characterization of parts and their interactions with the complex environment in which they operate. In this paper, the author argues that this situation can be improved by quantitative signal-to-noise analysis of the relationship between computational abstractions and the variation and uncertainty endemic in biological organisms. This analysis takes the form of a ΔSNRdB function for each computational device, which can be computed from measurements of a device’s input/output curve and expression noise. These functions can then be combined to predict how well a circuit will implement an intended computation, as well as evaluating the general suitability of biological devices for engineering computational circuits. Applying signal-to-noise analysis to current repressor libraries shows that no library is currently sufficient for general circuit engineering, but also indicates key targets to remedy this situation and vastly improve the range of computations that can be used effectively in the implementation of biological applications. PMID:26177070
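
    The exact ΔSNRdB definition should be taken from the publication itself; purely to illustrate the flavor of such a signal-to-noise calculation, the sketch below compares the log-scale separation between a hypothetical device's low and high output states with its within-state expression noise and reports the ratio in decibels. All numbers are invented.

      # Generic signal-to-noise illustration for a two-state genetic device
      # (invented data; this is not the paper's dSNRdB formula).
      import numpy as np

      rng = np.random.default_rng(7)
      low = rng.lognormal(np.log(50.0), 0.3, 500)     # measured output at low input (arbitrary units)
      high = rng.lognormal(np.log(800.0), 0.3, 500)   # measured output at high input

      separation = np.log10(high.mean()) - np.log10(low.mean())          # between-state signal (decades)
      noise = np.mean([np.log10(low).std(), np.log10(high).std()])       # within-state expression noise

      snr_db = 20.0 * np.log10(separation / noise)
      print(f"separation = {separation:.2f} decades, noise = {noise:.2f}, SNR = {snr_db:.1f} dB")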

  12. A Computer-Aided Self-Testing System for Biological Psychology.

    ERIC Educational Resources Information Center

    Leiblum, M. D.; And Others

    1994-01-01

    Describes the production of a computer-aided, self-testing system for university students enrolled in a first-year course in biological psychology. Project aspects described include selection, acquisition and description of software; question banks and test structures; modes of use (computer or printed version); evaluation; and future plans. (11…

  13. Using Mathematics to Bridge the Gap between Biology and Computer Science

    ERIC Educational Resources Information Center

    Hammerman, Natalie; Tolvo, Anthony; Goldberg, Robert

    2004-01-01

    The rapid rate of expansion of the disciplines of biotechnology, genomics, and bioinformatics emphasizes the increased interdependency between computer science and biology, with mathematics serving as the bridge between these disciplines. This paper demonstrates this inter-relationship within the context of a computational model for a biological…

  14. Xenopus laevis: a success story of biological research in space

    NASA Astrophysics Data System (ADS)

    Horn, Eberhard R.

    2006-01-01

    The clawed toad Xenopus laevis is a common experimental animal used in many disciplines of the life sciences, such as integrative, developmental and molecular biology and experimental medicine. For 30 years, Xenopus has been used in biological research in space. Important milestones were 1975, when Xenopus embryos flew for the first time on the Russian space station Salyut-4, and 1994, when Xenopus eggs were successfully fertilized for the first time in space during the Japanese Spacelab mission STS-47 and developed in microgravity into viable tadpoles. Most Xenopus studies were related to embryogenesis and development. Observations during and after exposure to altered gravity revealed changes such as thickening of the blastocoel roof, dorsalization of the tail, and modifications of vestibular reflexes during both fictive and free swimming. Many changes were reversible even during microgravity exposure. Studies of the vestibuloocular reflex and of synapse formation revealed an age-related sensitivity to altered gravity. Xenopus offers useful tools for studying microgravity effects on living systems: its oocyte is a suitable model for studying ion channel function in space, and the dorsalization model can be used to analyse growth factor sensitivities. Hardware for the life support of adults, tadpoles and embryos (cf. the SUPPLY unit in combination with miniaquaria), as well as for controlled experiments in space, is a prerequisite for extending research with Xenopus. The application aspect rests on the fact that fundamental research per se brings benefit to man.

  15. Research in thermal biology: Burning questions for coldwater stream fishes

    USGS Publications Warehouse

    McCullough, D.A.; Bartholow, J.M.; Jager, H.I.; Beschta, R.L.; Cheslak, E.F.; Deas, M.L.; Ebersole, J.L.; Foott, J.S.; Johnson, S.L.; Marine, K.R.; Mesa, M.G.; Petersen, J.H.; Souchon, Y.; Tiffan, K.F.; Wurtsbaugh, W.A.

    2009-01-01

    With the increasing appreciation of global warming impacts on ecological systems, in addition to the myriad of land management effects on water quality, the number of literature citations dealing with the effects of water temperature on freshwater fish has escalated in the past decade. Given the many biological scales at which water temperature effects have been studied, and the growing need to integrate knowledge from multiple disciplines of thermal biology to fully protect beneficial uses, we held that a survey of the most promising recent developments and an expression of some of the remaining unanswered questions with significant management implications would best be approached collectively by a diverse research community. We have identified five specific topic areas of renewed research where new techniques and critical thought could benefit coldwater stream fishes (particularly salmonids): molecular, organism, population/species, community and ecosystem, and policy issues in water quality. Our hope is that information gained through examination of recent research fronts linking knowledge at various scales will prove useful in managing water quality at a basin level to protect fish populations and whole ecosystems. Standards of the past were based largely on incipient lethal and optimum growth rate temperatures for fish species, while future standards should consider all integrated thermal impacts to the organism and ecosystem. © Taylor and Francis Group, LLC.

  16. Research in thermal biology: Burning questions for coldwater stream fishes

    SciTech Connect

    McCullough, Dr. Dale; Bartholow, Dr. John; Jager, Yetta; al., et.

    2009-01-01

    With the increasing appreciation of global warming impacts on ecological systems in addition to the myriad of land management effects on water quality, the number of literature citations dealing with the effects of water temperature on freshwater fish has escalated in the past decade. Given the many biological scales at which water temperature effects have been studied and the growing need to integrate knowledge from multiple disciplines of thermal biology to fully protect beneficial uses, we held that a survey of the most promising recent developments and an expression of some of the remaining unanswered questions with significant management implications would best be approached collectively by a diverse research community. We have identified five specific topic areas of renewed research where new techniques and critical thought could benefit coldwater stream fishes (particularly salmonids): molecular, organism, population/species, community and ecosystem, and policy issues in water quality. Our hope is that information gained through examination of recent research fronts linking knowledge at various scales will prove useful in managing water quality at a basin level to protect fish populations and whole ecosystems. Standards of the past were based largely on incipient lethal and optimum growth rate temperatures for fish species, while future standards should consider all integrated thermal impacts to the organism and ecosystem.

  17. Explorations: A Research-Based Program Introducing Undergraduates to Diverse Biology Research Topics Taught by Grad Students and Postdocs

    ERIC Educational Resources Information Center

    Brownell, Sara E.; Khalfan, Waheeda; Bergmann, Dominique; Simoni, Robert

    2013-01-01

    Undergraduate biology majors are often overwhelmed by and underinformed about the diversity and complexity of biological research that is conducted on research-intensive campuses. We present a program that introduces undergraduates to the diversity and scope of biological research and also provides unique teaching opportunities for graduate…

  18. Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric

    2014-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.

  19. [Research under reduced gravity. Part I: bases of gravitational biology].

    PubMed

    Volkmann, D; Sievers, A

    1992-02-01

    The orientation of organisms in space and their morphogenesis in relation to the gravitational field of the Earth are the main topics of research in the field of gravitational biology. For more than 100 years, clinostats provided the only possibility to simulate physiological weightlessness. In contrast to animals, plants are characterized by intracellular gravireceptors. Nevertheless, there are some indications of similar mechanisms of gravity perception, e.g., the minimal energy of approximately 10⁻¹⁸ J required to trigger a gravity-dependent response. Stretch-activated ion channels might be the common structural basis. PMID:11536493

  20. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  1. Activation of PPARδ: from computer modelling to biological effects.

    PubMed

    Kahremany, Shirin; Livne, Ariela; Gruzman, Arie; Senderowitz, Hanoch; Sasson, Shlomo

    2015-02-01

    PPARδ is a ligand-activated receptor that dimerizes with another nuclear receptor of the retinoic acid receptor family. The dimers interact with other co-activator proteins and form active complexes that bind to PPAR response elements and promote transcription of genes involved in lipid metabolism. It appears that various natural fatty acids and their metabolites serve as endogenous activators of PPARδ; however, there is no consensus in the literature on the nature of the prime activators of the receptor. In vitro and cell-based assays of PPARδ activation by fatty acids and their derivatives often produce conflicting results. The search for synthetic and selective PPARδ agonists, which may be pharmacologically useful, is intense. Current rational modelling used to obtain such compounds relies mostly on crystal structures of synthetic PPARδ ligands with the recombinant ligand binding domain (LBD) of the receptor. Here, we introduce an original computational prediction model for ligand binding to PPARδ LBD. The model was built based on EC50 data of 16 ligands with available crystal structures and validated by calculating binding probabilities of 82 different natural and synthetic compounds from the literature. These compounds were independently tested in cell-free and cell-based assays for their capacity to bind or activate PPARδ, leading to prediction accuracy of between 70% and 93% (depending on ligand type). This new computational tool could therefore be used in the search for natural and synthetic agonists of the receptor. PMID:25255770
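
    For readers unfamiliar with the EC50 values used to train the model above: an EC50 is the ligand concentration producing a half-maximal response on a dose-response curve. The sketch below evaluates a generic Hill-type dose-response curve with assumed parameters; it is background illustration only, not the authors' binding-prediction model.

      import numpy as np

      def hill_response(conc, ec50, hill_n=1.0, top=1.0, bottom=0.0):
          # Standard Hill-type dose-response curve; at conc == ec50 the
          # response is halfway between bottom and top.
          return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill_n)

      # Assumed EC50 of 50 nM and Hill coefficient of 1.2 (illustrative only).
      for conc_nm in np.logspace(0, 4, 9):            # 1 nM up to 10 uM
          resp = hill_response(conc_nm, ec50=50.0, hill_n=1.2)
          print(f"{conc_nm:10.1f} nM -> response {resp:.2f}")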

  2. Caenorhabditis elegans, a Biological Model for Research in Toxicology.

    PubMed

    Tejeda-Benitez, Lesly; Olivero-Verbel, Jesus

    2016-01-01

    Caenorhabditis elegans is a nematode of microscopic size which, due to its biological characteristics, has been used since the 1970s as a model for research in molecular biology, medicine, pharmacology, and toxicology. It was the first animal whose genome was completely sequenced and has played a key role in the understanding of apoptosis and RNA interference. The transparency of its body, short lifespan, ability to self-fertilize and ease of culture are advantages that make it ideal as a model in toxicology. Due to the fact that some of its biochemical pathways are similar to those of humans, it has been employed in research in several fields. C. elegans' use as a biological model in environmental toxicological assessments allows the determination of multiple endpoints. Some of these utilize the effects on the biological functions of the nematode and others use molecular markers. Endpoints such as lethality, growth, reproduction, and locomotion are the most studied, and usually employ the wild type Bristol N2 strain. Other endpoints use reporter genes, such as green fluorescence protein, driven by regulatory sequences from other genes related to different mechanisms of toxicity, such as heat shock, oxidative stress, CYP system, and metallothioneins among others, allowing the study of gene expression in a manner both rapid and easy. These transgenic strains of C. elegans represent a powerful tool to assess toxicity pathways for mixtures and environmental samples, and their numbers are growing in diversity and selectivity. However, other molecular biology techniques, including DNA microarrays and MicroRNAs have been explored to assess the effects of different toxicants and samples. C. elegans has allowed the assessment of neurotoxic effects for heavy metals and pesticides, among those more frequently studied, as the nematode has a very well defined nervous system. More recently, nanoparticles are emergent pollutants whose toxicity can be explored using this nematode

  3. Computer-Assisted Microscopy in Science Teaching and Research.

    ERIC Educational Resources Information Center

    Radice, Gary P.

    1997-01-01

    Describes a technological approach to teaching the relationships between biological form and function. Computer-assisted image analysis was integrated into a microanatomy course. Students spend less time memorizing and more time observing, measuring, and interpreting, building technical and analytical skills. Appendices list hardware and software…

  4. Fundamental research on scalable DNA molecular computation

    NASA Astrophysics Data System (ADS)

    Wang, Sixue

    Beginning with the ground-breaking work on DNA computation by Adleman in 1994 [2], the idea of using DNA molecules to perform computations has been explored extensively. In this thesis, a computation based on a scalable DNA neural network was discussed and a neuron model was partially implemented using DNA molecules. In order to understand the behavior of short DNA strands in a polyacrylamide gel, we measured the mobilities of various single-stranded DNA (ssDNA) and double-stranded DNA (dsDNA) molecules shorter than 100 bases. We found that sufficiently short lengths of ssDNA had a higher mobility than the same lengths of dsDNA, with a crossover length Lx at which the mobilities are equal. The crossover length decreases approximately linearly with the acrylamide concentration of the gel. At the same time, the influence of DNA structure on mobility was studied and the effect of single-stranded overhangs on dsDNA was discussed. The idea of making a scalable DNA neural network was then discussed. To prepare our basis vector DNA oligomers, a 90 base DNA template with a 50 base random strand in the middle and two 20 base primers on the ends was designed and purchased. By a series of dilutions, we obtained several aliquots containing only 30 random-sequence molecules each. These were amplified to roughly 5 picomole quantities by 38 cycles of PCR with hot-start DNA polymerase. We then used asymmetric PCR followed by polyacrylamide gel purification to obtain the necessary single-stranded basis vectors (ssDNA) and their complements. We tested the suitability of this scheme by adding two vectors formed from different linear combinations of the basis vectors. The full scheme for DNA neural network computation was tested using two determinate ssDNA strands. We successfully transformed an input DNA oligomer into a different output oligomer using the polymerase reaction required by the proposed DNA neural network algorithm. Isothermal linear amplification was used to obtain a sufficient quantity of

  5. Computational simulation of a new system modelling ions electromigration through biological membranes

    PubMed Central

    2013-01-01

    Background: Interest in cell membranes has grown considerably because of their important role as controllers of biological functions in health and illness. In fact, most important physiological processes are intimately related to the transport ability of the membrane, such as cell adhesion, cell signaling and immune defense. Furthermore, ion migration is connected with life-threatening pathologies such as metastases and atherosclerosis. Consequently, a large amount of research is devoted to this topic. To better understand cell membranes, more accurate models of ionic flux are required, together with their computational simulation. Results: This paper presents the numerical simulation of a more general system modelling ion migration through biological membranes. The model includes the effects of biochemical reactions between ions and fixed charges and takes the form of a nonlinear coupled system. First, we describe the mathematical model. To carry out the numerical simulation, we use a finite element discretisation and choose an appropriate resolution algorithm for the nonlinearities. Conclusions: We present numerical simulations obtained for several popular models of enzymatic reaction and compare them with results obtained in the literature for systems of ordinary differential equations. The results show complete agreement between the two modelling approaches. Furthermore, various numerical experiments are presented to confirm the accuracy, efficiency and stability of the proposed method. In particular, we show that the scheme is unconditionally stable and second-order accurate in space. PMID:24010551
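
    The record above highlights an implicit finite element treatment of a nonlinear ion-transport system and stresses unconditional stability. As a rough illustration of why implicit time stepping is attractive for membrane-transport models, the sketch below solves a single-species 1-D diffusion-reaction equation with a backward-Euler finite-difference scheme; the equation, geometry, and parameter values are assumptions chosen for illustration and are not the authors' model.

      import numpy as np

      # Minimal 1-D implicit (backward Euler) sketch of a diffusion-reaction
      # equation, dc/dt = D d2c/dx2 - k c, with zero-flux boundaries.
      D, k = 1e-9, 0.5          # diffusivity (m^2/s) and reaction rate (1/s); assumed values
      L, n = 1e-6, 101          # domain length (m) and number of grid points
      dx = L / (n - 1)
      dt = 1e-3                 # time step (s)
      r = D * dt / dx ** 2      # here r ~ 1e4; an explicit scheme would need r <= 0.5

      # Assemble the tridiagonal system (1 + dt*k + 2r) c_i - r c_{i-1} - r c_{i+1} = c_old_i
      A = np.zeros((n, n))
      for i in range(n):
          A[i, i] = 1.0 + dt * k + 2.0 * r
          if i > 0:
              A[i, i - 1] = -r
          if i < n - 1:
              A[i, i + 1] = -r
      A[0, 1] = -2.0 * r        # zero-flux (Neumann) boundaries: mirror the missing neighbour
      A[-1, -2] = -2.0 * r

      c = np.zeros(n)
      c[n // 2] = 1.0           # initial concentration pulse in the middle
      for _ in range(100):
          c = np.linalg.solve(A, c)

      print(round(float(c.sum()), 4))   # remaining total concentration after 100 stable steps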

  6. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  7. Chaste: an open source C++ library for computational physiology and biology.

    PubMed

    Mirams, Gary R; Arthurs, Christopher J; Bernabeu, Miguel O; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G; Harvey, Daniel G; Marsh, Megan E; Osborne, James M; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J

    2013-01-01

    Chaste - Cancer, Heart And Soft Tissue Environment - is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to 're-invent the wheel' with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  8. The opportunities for space biology research on the Space Station

    NASA Technical Reports Server (NTRS)

    Ballard, Rodney W.; Souza, Kenneth A.

    1987-01-01

    The goals of space biology research to be conducted aboard the Space Station in the 1990s include long-term studies of reproduction, development, growth, physiology, behavior, and aging in both animals and plants. They also include studies of the mechanisms by which gravitational stimuli are sensed, processed, and transmitted to a responsive site, and of the effect of microgravity on each component. The Space Station configuration will include a life sciences research facility, where experiment cycles will be on a 90-day basis (since the Space Station missions planned for the 1990s call for 90-day intervals). A modular approach is taken to accommodate animal habitats, plant growth chambers, and other specimen holding facilities; the modular habitats would be transportable between the launch systems and the facility's habitat racks, workbench, and variable-gravity centrifuge (included to provide artificial gravity and accurately controlled acceleration levels aboard the Space Station).

  9. BCTR: Biological and Chemical Technologies Research 1994 annual summary report

    SciTech Connect

    Petersen, G.

    1995-02-01

    The annual summary report presents the fiscal year (FY) 1994 research activities and accomplishments for the United States Department of Energy (DOE) Biological and Chemical Technologies Research (BCTR) Program of the Advanced Industrial Concepts Division (AICD). This AICD program resides within the Office of Industrial Technologies (OIT) of the Office of Energy Efficiency and Renewable Energy (EE). Although the OIT was reorganized in 1991 and AICD no longer exists, this document reports on efforts conducted under the former structure. The annual summary report for 1994 (ASR 94) contains the following: program description (including BCTR program mission statement, historical background, relevance, goals and objectives); program structure and organization, selected technical and programmatic highlights for 1994; detailed descriptions of individual projects; a listing of program output, including a bibliography of published work; patents, and awards arising from work supported by BCTR.

  10. Space Station Freedom: a unique laboratory for gravitational biology research.

    PubMed

    Phillips, R W; Cowing, K L

    1993-04-01

    The advent of Space Station Freedom (SSF) will provide a permanent laboratory in space with unparalleled opportunities to perform biological research. As with any spacecraft there will also be limitations. It is our intent to describe this space laboratory and present a picture of how scientists will conduct research in this unique environment we call space. SSF is an international venture which will continue to serve as a model for other peaceful international efforts. It is hoped that as the human race moves out from this planet back to the moon and then on to Mars that SSF can serve as a successful example of how things can and should be done. PMID:11537716

  11. A framework for integrating thermal biology into fragmentation research.

    PubMed

    Tuff, K T; Tuff, T; Davies, K F

    2016-04-01

    Habitat fragmentation changes thermal conditions in remnant patches, and thermal conditions strongly influence organism morphology, distribution, and activity patterns. However, few studies explore temperature as a mechanism driving ecological responses to fragmentation. Here we offer a conceptual framework that integrates thermal biology into fragmentation research to better understand individual, species, community, and ecosystem-level responses to fragmentation. Specifically, the framework addresses how fragmentation changes temperature and how the effects of those temperature changes spread through the ecosystem, from organism response via thermal sensitivity, to changes in species distribution and activity patterns, to shifts in community structure following species' responses, and ultimately to changes in ecosystem functions. We place a strong emphasis on future research directions by outlining "Critical gaps" for each step of the framework. Empirical efforts to apply and test this framework promise new understanding of fragmentation's ecological consequences and new strategies for conservation in an increasingly fragmented and warmer world. PMID:26892491

  12. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  13. Open-Source Software in Computational Research: A Case Study

    SciTech Connect

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  14. Global biology - An interdisciplinary scientific research program at NASA, Ames Research Center

    NASA Technical Reports Server (NTRS)

    Lawless, J. G.; Colin, L.

    1983-01-01

    NASA has initiated a new effort in Global Biology, the primary focus of which is to understand biogeochemical cycles. As part of this effort, an interdisciplinary team of scientists has formed at Ames Research Center to investigate the cycling of sulfur in the marine coastal zone and to study the cycling of nitrogen in terrestrial ecosystems. Both studies will use remotely sensed data, coupled with ground-based research, to identify and measure the transfer of major and minor biologically produced gases between these ecosystems and global reservoirs.

  15. Global Biology: An Interdisciplinary Scientific Research Program at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Lawless, James G.; Colin, Lawrence

    1984-01-01

    NASA has initiated a new effort in Global Biology, the primary focus of which is to understand biogeochemical cycles. As part of this effort, an interdisciplinary team of scientists has formed at Ames Research Center to investigate the cycling of sulfur in the marine coastal zone and to study the cycling of nitrogen in terrestrial ecosystems. Both studies will use remotely sensed data, coupled with ground-based research, to identify and measure the transfer of major and minor biologically produced gases between these ecosystems and global reservoirs.

  16. Computational biology in anti-tuberculosis drug discovery.

    PubMed

    Murphy, Dennis J; Brown, James R

    2009-06-01

    The resurgence of drug-resistant tuberculosis (TB) is a major global healthcare problem. Mycobacterium tuberculosis (MTB), TB's causative agent, evades the host immune system and drug regimens by entering prolonged periods of nonproliferation or dormancy. The identification of genes essential to the bacterium during dormancy-phase infections is a key strategy in the development of new anti-TB therapeutics. The rapid expansion of TB-related genomic data sources, including DNA sequences, transcriptomic and proteomic profiles, and genome-wide essentiality data, presents considerable opportunities to apply advanced computational analyses to predict potential drug targets. However, the translation of in silico predictions to effective clinical therapies remains a significant challenge. PMID:19519485

  17. Computing distribution of scale independent motifs in biological sequences

    PubMed Central

    Almeida, Jonas S; Vinga, Susana

    2006-01-01

    The use of Chaos Game Representation (CGR) or its generalization, Universal Sequence Maps (USM), to describe the distribution of biological sequences has been found objectionable because of the fractal structure of that coordinate system. Consequently, the investigation of the distribution of symbolic motifs at multiple scales is hampered by an inexact association between distance and sequence dissimilarity. A solution to this problem could unleash the use of iterative maps as a phase-state representation of sequences, in which their statistical properties can be conveniently investigated. In this study a family of kernel density functions is described that accommodates the fractal nature of iterative function representations of symbolic sequences and, consequently, enables the exact investigation of sequence motifs of arbitrary lengths in that scale-independent representation. Furthermore, the proposed kernel density includes both Markovian succession and currently used alignment-free sequence dissimilarity metrics as special solutions. Therefore, the fractal kernel described is in fact a generalization that provides a common framework for a diverse suite of sequence analysis techniques. PMID:17049089
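
    The iterative coordinate system the record refers to is straightforward to compute. The sketch below implements the usual Chaos Game Representation rule for a DNA sequence, in which each base pulls the current point halfway toward its assigned corner of the unit square; the particular corner assignment is the common convention and an assumption here, not something specified by the record.

      import numpy as np

      # Chaos Game Representation: every prefix of the sequence maps to a
      # unique point in the unit square.
      CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0),
                 "G": (1.0, 1.0), "T": (1.0, 0.0)}   # conventional assignment (assumed)

      def cgr_coords(sequence, start=(0.5, 0.5)):
          x, y = start
          coords = []
          for base in sequence.upper():
              cx, cy = CORNERS[base]
              x, y = (x + cx) / 2.0, (y + cy) / 2.0   # move halfway toward the base's corner
              coords.append((x, y))
          return np.array(coords)

      points = cgr_coords("ACGTACGGTT")
      print(points[-1])   # coordinate encoding the whole sequence read so far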

  18. 2010 Tetrapyrroles, Chemistry & Biology of Gordon Research Conference

    SciTech Connect

    Angela Wilks

    2010-07-30

    The objective of the Chemistry & Biology of Tetrapyrroles Gordon Conference is to bring together researchers from diverse disciplines who otherwise would not interact. By bringing together biologists, chemists, engineers and clinicians with a common interest in tetrapyrroles, the conference provides a forum for cross-disciplinary ideas and collaboration. The perspectives provided by biologists, chemists, and clinicians working in fields such as newly discovered defects in human porphyrin metabolism, the myriad of strategies for light harvesting in photosynthetic organisms, novel tetrapyrroles that serve as auxiliary chromophores or enzyme cofactors, synthetic strategies in the design of novel tetrapyrrole scaffolds, and tetrapyrrole-based cell signaling and regulatory systems make this conference unique in the field. Over the years, the growing evidence for the role of tetrapyrroles and their reactive intermediates in cell signaling and regulation has become increasingly important at this conference. The 2010 conference on Chemistry & Biology of Tetrapyrroles will focus on many of these new frontiers, as outlined in the preliminary program. Speakers will emphasize unpublished results and new findings in the field. The oral sessions will be followed by highly interactive afternoon poster sessions. The poster sessions provide all conferees with the opportunity to present their latest research and to exchange ideas in a more informal setting. As in the past, this opportunity will continue during the nightly social gathering that takes place in the poster hall following the evening lectures. All conferees are encouraged to submit and present posters. At the conference, the best posters in the areas of biology, chemistry and medicine will be selected by a panel of previous conference chairs.

  19. NASA Sponsored Research Involving Crystallization of Biological Materials

    NASA Technical Reports Server (NTRS)

    Downey, James Patton

    2000-01-01

    An overview of NASA's plans for performing experiments involving the crystallization of biological materials on the International Space Station (ISS) is presented. In addition, a brief overview of past work is provided as background. Descriptions of flight hardware currently available for use on the ISS are given and projections of future developments are discussed. In addition, experiment selection and funding are described. As of the flight of STS-95, these crystallization projects have proven to be some of the most successful in the history of microgravity research. The NASA Microgravity Research Division alone has flown 185 different proteins, nucleic acids, viruses, and complexes on 43 different missions. 37 of the 185 have resulted in diffraction patterns with higher resolution than was obtained in all previous ground-based experiments. This occurred despite the fact that an average of only 41 samples per protein were flown. A number of other samples have shown improved signal-to-noise characteristics, i.e. relative Wilson plots, when compared to the best ground experiments. In addition, a number of experiments investigating the effects of microgravity conditions on the crystallization of biological material have been conducted.

  20. [Activities of Research Institute for Advanced Computer Science]

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  1. Why and how to expand the role of systems biology in pharmaceutical research and development.

    PubMed

    Phair, Robert D

    2012-01-01

    Seen from the perspective of funding organizations, investors, and the general public, the productivity of our world-wide biomedical research enterprise is declining despite increased investment. This opinion piece suggests a cause and a solution. The cause is the enormous complexity of human biology and pathophysiology. The unsolved human diseases involve so many interacting variables that single research laboratories headed by skilled principal investigators doing innovative experimental work cannot be expected to assemble the reductionist pieces into an integrated working model. Systems biology offers a solution, but it will require teamwork. Co-equal teams of experimental and computational biologists can construct multiscale differential equation models and test them against experimental data. A successful model provides actionable evidence-based guidance to the entire research and development team. These integrative biology teams may, for historical and cultural reasons, be unsustainable in academia, but they seem naturally suited to modern pharmaceutical research and development. One way to organize such teams and their workflow is described in detail. PMID:22161350

  2. Computing & Interpreting Effect Sizes in Educational Research

    ERIC Educational Resources Information Center

    Thompson, Bruce

    2009-01-01

    The present article provides a primer on using effect sizes in research. A small heuristic data set is used in order to make the discussion concrete. Additionally, various admonitions for best practice in reporting and interpreting effect sizes are presented. Among these is the admonition to not use Cohen's benchmarks for "small," "medium," and…
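
    As a reminder of what the statistic behind those admonitions actually is, the sketch below computes Cohen's d as a standardized mean difference with a pooled standard deviation. The data are made-up numbers for illustration only, not the article's heuristic data set.

      import numpy as np

      def cohens_d(group1, group2):
          # Standardized mean difference with a pooled standard deviation.
          g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
          n1, n2 = len(g1), len(g2)
          pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
          return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

      # Made-up scores for two classes; not data from the article.
      print(round(cohens_d([78, 82, 85, 90, 88], [70, 75, 80, 77, 73]), 2))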

  3. A Computational Lens on Design Research

    ERIC Educational Resources Information Center

    Hoyles, Celia; Noss, Richard

    2015-01-01

    In this commentary, we briefly review the collective effort of design researchers to weave theory with empirical results, in order to gain a better understanding of the processes of learning. We seek to respond to this challenging agenda by centring on the evolution of one sub-field: namely that which involves investigations within a…

  4. Library Online! A Guide to Computer Research.

    ERIC Educational Resources Information Center

    Turrell, Linda

    The world of electronic technology is opening up vast new opportunities for learning, gathering, and sharing information. This guide is for teachers and students in grades 4-8 to learn how to use electronic tools to conduct research to find information at school or around the world. The guide includes introductory pages for each topic, student…

  5. TARGET: Research in Computer Aids for Translators.

    ERIC Educational Resources Information Center

    McCracken, Donald; Strazds, Andris E.

    1980-01-01

    Reviews the background of the "TARGET Project for Aids to Translation," its current facilities, and its goals. Describes the system's central feature as an interactive, multilingual terminology database intended to eliminate time wasted in researching unknown terms and to facilitate final document production, study of person-machine interface…

  6. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    ERIC Educational Resources Information Center

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was…

  7. Computer vision research with new imaging technology

    NASA Astrophysics Data System (ADS)

    Hou, Guangqi; Liu, Fei; Sun, Zhenan

    2015-12-01

    Light field imaging is capable of capturing dense multi-view 2D images in one snapshot, recording both the intensity values and the directions of rays simultaneously. As an emerging 3D device, the light field camera has been widely used in digital refocusing, depth estimation, stereoscopic display, etc. Traditional multi-view stereo (MVS) methods perform well only on strongly textured surfaces, and their depth maps contain numerous holes and large ambiguities in textureless or low-textured regions. In this paper, we apply light field imaging technology to 3D face modeling in computer vision. Based on a 3D morphable model, we estimate the pose parameters from facial feature points. The depth map is then estimated through the epipolar plane image (EPI) method. Finally, a high-quality 3D face model is recovered via a fusion strategy. We evaluate the effectiveness and robustness of the approach on face images captured by a light field camera at different poses.
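
    In the usual light-field geometry, the slope of a feature's trace in an epipolar plane image is its disparity between adjacent sub-aperture views, and depth then follows from ordinary triangulation. The sketch below assumes that standard pinhole relation and made-up camera parameters; it illustrates the geometric idea only and is not the paper's algorithm.

      def depth_from_epi_slope(slope_px_per_view, focal_px, baseline_m):
          # slope_px_per_view: horizontal shift of the feature per adjacent
          #     sub-aperture view, in pixels (the EPI line slope).
          # focal_px: focal length expressed in pixels.
          # baseline_m: spacing between adjacent sub-aperture viewpoints, in metres.
          # Standard triangulation: Z = f * b / d.
          if slope_px_per_view == 0:
              return float("inf")   # zero disparity corresponds to a point at infinity
          return focal_px * baseline_m / slope_px_per_view

      # Illustrative numbers only (assumed camera parameters).
      print(depth_from_epi_slope(slope_px_per_view=2.0, focal_px=500.0, baseline_m=0.001))  # 0.25 m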

  8. Research in computer vision for autonomous systems

    NASA Astrophysics Data System (ADS)

    Kak, Avi; Yoder, Mark; Andress, Keith; Blask, Steve; Underwood, Tom

    1988-09-01

    This report addresses FLIR processing, LADAR processing, and electronic terrain board modeling. In the discussion of FLIR processing, the issues analyzed include the classifiability of FLIR features, computationally efficient algorithms for target segmentation, metrics, etc. The discussion of LADAR includes a comparison of a number of different approaches to the segmentation of target surfaces from range images, the extraction of silhouettes at different ranges, and reasoning strategies for the recognition of targets and estimation of their aspects. Regarding electronic terrain board modeling, it was shown how the readily available wire-frame data for strategic targets can be converted into volumetric models using the concepts of constructive solid geometry; it was then shown how, from the resulting volumetric models, it is possible to generate synthetic range images that are very similar to real LADAR images. Also shown is how sensor noise can be added to these synthetic images to make them even more realistic.

  9. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  10. Computational full electron structure study of biological activity in Cyclophilin A.

    PubMed

    Zhou, Wenjin; Rossetto, Allison M; Pang, Xiaodong; Zhou, Linxiang

    2016-01-01

    Cyclosporine (CsA) is widely used in organ transplant patients to help prevent the patient's body from rejecting the organ. CsA has been shown to be a safe and highly effective immunosuppressive drug that binds with the protein Cyclophilin A (CypA) at active sites. However, the exact mechanism of this binding at the molecular level remains unknown. In this project, we elucidate the binding of CsA to CypA at the molecular level by computing their electron structures and revealing their interactions. We employ a novel technique called electron Computer-Aided Drug Design (eCADD) on the protein's full electron structure along with its hydrophobic pocket and the perturbation theory of the interaction between two wave functions. We have identified the wave function of CypA, the biological active residues and active atoms of CypA and CsA, the interaction site between CypA and CsA, and the hydrogen bonds in the ligand CsA binding site. All these calculated active residues, active atoms, and hydrogen bonds are in good agreement with recorded laboratory experiments and provide guidelines for designing new ligands of CypA. We believe that our eCADD framework can provide researchers with a cost-efficient new method of drug design based on the full electron structure of proteins. PMID:26264861

  11. Multiscale computational models in physical systems biology of intracellular trafficking

    PubMed Central

    Tourdot, Richard W.; Bradley, Ryan P.; Ramakrishnan, Natesan

    2015-01-01

    In intracellular trafficking, a definitive understanding of the interplay between protein binding and membrane morphology is still lacking. The authors describe a computational approach integrating coarse-grained molecular dynamics (CGMD) simulations with continuum Monte Carlo (CM) simulations of the membrane to study protein–membrane interactions and the ensuing membrane curvature. They relate the curvature field strength discerned at the molecular level to its effect at the cellular length-scale. They perform thermodynamic integration on the CM model to describe the free energy landscape of vesiculation in clathrin-mediated endocytosis. The method presented here delineates membrane morphologies and maps out the free energy changes associated with membrane remodeling due to varying coat sizes, coat curvature strengths, membrane bending rigidities, and tensions; furthermore, several constraints on mechanisms underlying clathrin-mediated endocytosis have been identified. Their CGMD simulations have revealed the importance of PIP2 for stable binding of proteins essential for curvature induction in the bilayer and have provided a molecular basis for the positive curvature induction by the epsin N-terminal homology (ENTH) domain. Calculation of the free energy landscape for vesicle budding has identified the critical size and curvature strength of a clathrin coat required for nucleation and stabilisation of a mature vesicle. PMID:25257021
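
    The thermodynamic integration mentioned above rests on the standard estimator ΔF = ∫₀¹ ⟨∂U/∂λ⟩ dλ, with the ensemble average evaluated at a series of coupling-parameter windows and the integral approximated by quadrature. The sketch below applies trapezoidal quadrature to placeholder ⟨∂U/∂λ⟩ values; the numbers are assumptions for illustration, not data from the paper.

      import numpy as np

      # Thermodynamic integration: Delta F = integral over lambda of <dU/dlambda>.
      lambdas = np.linspace(0.0, 1.0, 6)                              # coupling-parameter windows
      mean_dU_dlambda = np.array([0.0, 3.1, 5.8, 7.9, 9.2, 9.8])      # assumed averages (kJ/mol)

      # Trapezoidal quadrature over the lambda windows.
      delta_F = float(np.sum(0.5 * (mean_dU_dlambda[1:] + mean_dU_dlambda[:-1])
                             * np.diff(lambdas)))
      print(f"estimated free-energy change: {delta_F:.2f} kJ/mol")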

  12. VirtualPlant: A Software Platform to Support Systems Biology Research

    PubMed Central

    Katari, Manpreet S.; Nowicki, Steve D.; Aceituno, Felipe F.; Nero, Damion; Kelfer, Jonathan; Thompson, Lee Parnell; Cabello, Juan M.; Davidson, Rebecca S.; Goldberg, Arthur P.; Shasha, Dennis E.; Coruzzi, Gloria M.; Gutiérrez, Rodrigo A.

    2010-01-01

    Data generation is no longer the limiting factor in advancing biological research. In addition, data integration, analysis, and interpretation have become key bottlenecks and challenges that biologists conducting genomic research face daily. To enable biologists to derive testable hypotheses from the increasing amount of genomic data, we have developed the VirtualPlant software platform. VirtualPlant enables scientists to visualize, integrate, and analyze genomic data from a systems biology perspective. VirtualPlant integrates genome-wide data concerning the known and predicted relationships among genes, proteins, and molecules, as well as genome-scale experimental measurements. VirtualPlant also provides visualization techniques that render multivariate information in visual formats that facilitate the extraction of biological concepts. Importantly, VirtualPlant helps biologists who are not trained in computer science to mine lists of genes, microarray experiments, and gene networks to address questions in plant biology, such as: What are the molecular mechanisms by which internal or external perturbations affect processes controlling growth and development? We illustrate the use of VirtualPlant with three case studies, ranging from querying a gene of interest to the identification of gene networks and regulatory hubs that control seed development. Whereas the VirtualPlant software was developed to mine Arabidopsis (Arabidopsis thaliana) genomic data, its data structures, algorithms, and visualization tools are designed in a species-independent way. VirtualPlant is freely available at www.virtualplant.org. PMID:20007449

  13. Review of Computer Mediated Communication Research for Education

    ERIC Educational Resources Information Center

    Luppicini, Rocci

    2007-01-01

    This research review examines recent developments in computer-mediated communication (CMC) research for educational applications. The review draws on 170 recent research articles selected from 78 journals representing a wide range of disciplines. The review focuses on peer-reviewed empirical studies, but is open to a variety of methodologies. The…

  14. A Methodological Review of Computer Science Education Research

    ERIC Educational Resources Information Center

    Randolph, Justus; Julnes, George; Sutinen, Erkki; Lehman, Steve

    2008-01-01

    Methodological reviews have been used successfully to identify research trends and improve research practice in a variety of academic fields. Although there have been three methodological reviews of the emerging field of computer science education research, they lacked reliability or generalizability. Therefore, because of the capacity for a…

  15. Fiction as an Introduction to Computer Science Research

    ERIC Educational Resources Information Center

    Goldsmith, Judy; Mattei, Nicholas

    2014-01-01

    The undergraduate computer science curriculum is generally focused on skills and tools; most students are not exposed to much research in the field, and do not learn how to navigate the research literature. We describe how fiction reviews (and specifically science fiction) are used as a gateway to research reviews. Students learn a little about…

  16. An African Research Agenda for Computers in Education

    ERIC Educational Resources Information Center

    Cronje, Johannes

    2014-01-01

    This article presents an overview of research into computers and education undertaken at a the University of Pretoria since 1995. It seeks to explore the patterns that have emerged and to indicate potential directions for future research. In response to a call for research in the field to be taken seriously the article identifies the main themes…

  17. Educational Technology Research Journals: Computers & Education, 2002-2011

    ERIC Educational Resources Information Center

    Rackham, David D.; Hyatt, Frederick R.; Macfarlane, David C.; Nisse, Tony; Woodfield, Wendy; West, Richard E.

    2013-01-01

    In this study, the authors examined the journal "Computers & Education" to discover research trends in the articles published during 2002-2011. Research articles were analyzed to determine trends in the research methods and types of articles published, as well as the key topics published, top authors, and some of the most-cited…

  18. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1995-04-01

    Advanced mathematical techniques and computer simulation play a major role in providing enhanced understanding of conventional and advanced materials processing operations. Development and application of mathematical models and computer simulation techniques can provide a quantitative understanding of materials processes and will minimize the need for expensive and time-consuming, trial-and-error-based product development. As computer simulations and materials databases grow in complexity, high performance computing and simulation are expected to play a key role in supporting the improvements required in advanced material syntheses and processing by lessening the dependence on expensive prototyping and re-tooling. Many of these numerical models are highly compute-intensive. It is not unusual for an analysis to require several hours of computational time on current supercomputers despite the simplicity of the models being studied. For example, to accurately simulate the heat transfer in a 1-m{sup 3} block using a simple computational method requires 10{sup 12} arithmetic operations per second of simulated time. For a computer to do the simulation in real time would require a sustained computation rate 1000 times faster than that achievable by current supercomputers. Massively parallel computer systems, which combine several thousand processors able to operate concurrently on a problem, are expected to provide orders-of-magnitude increases in performance. This paper briefly describes advanced computational research in materials processing at ORNL. Continued development of computational techniques and algorithms utilizing the massively parallel computers will allow the simulation of conventional and advanced materials processes in sufficient generality.
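
    One way to see where an operation count of roughly 10{sup 12} per simulated second can come from is a back-of-the-envelope product of grid cells, operations per cell, and time steps per simulated second. The grid resolution, per-cell cost, and time step in the sketch below are assumptions chosen only to show that such inputs reproduce a figure of this order; they are not numbers from the report.

      # Back-of-the-envelope operation count for explicit heat conduction in a
      # 1 m^3 block.  All three inputs are assumptions chosen for illustration.
      cell_size_m = 1e-3                      # 1 mm grid resolution
      cells = int((1.0 / cell_size_m) ** 3)   # 1e9 cells in a 1 m^3 block
      ops_per_cell_per_step = 10              # a simple 7-point stencil update
      steps_per_simulated_second = 100        # time step of 10 ms

      ops_per_simulated_second = cells * ops_per_cell_per_step * steps_per_simulated_second
      print(f"{ops_per_simulated_second:.1e} operations per simulated second")  # 1.0e+12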

  19. Confero: an integrated contrast data and gene set platform for computational analysis and biological interpretation of omics data

    PubMed Central

    2013-01-01

    Background: High-throughput omics technologies such as microarrays and next-generation sequencing (NGS) have become indispensable tools in biological research. Computational analysis and biological interpretation of omics data can pose significant challenges due to a number of factors, in particular the systems integration required to fully exploit and compare data from different studies and/or technology platforms. In transcriptomics, the identification of differentially expressed genes when studying effect(s) or contrast(s) of interest constitutes the starting point for further downstream computational analysis (e.g. gene over-representation/enrichment analysis, reverse engineering) leading to mechanistic insights. Therefore, it is important to systematically store the full list of genes with their associated statistical analysis results (differential expression, t-statistics, p-value) corresponding to one or more effect(s) or contrast(s) of interest (termed "contrast data") in a comparable manner and to extract gene sets in order to efficiently support downstream analyses and further leverage data on a long-term basis. Filling this gap would open new research perspectives for biologists to discover disease-related biomarkers and to support the understanding of molecular mechanisms underlying specific biological perturbation effects (e.g. disease, genetic, environmental, etc.). Results: To address these challenges, we developed Confero, a contrast data and gene set platform for downstream analysis and biological interpretation of omics data. The Confero software platform provides storage of contrast data in a simple and standard format, data transformation to enable cross-study and platform data comparison, and automatic extraction and storage of gene sets to build new a priori knowledge which is leveraged by integrated and extensible downstream computational analysis tools. Gene Set Enrichment Analysis (GSEA) and Over-Representation Analysis (ORA) are
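
    Over-representation analysis, mentioned at the end of the record, is commonly implemented as a one-sided hypergeometric (Fisher) test asking whether a gene set is enriched among the differentially expressed genes of a contrast. The sketch below uses SciPy's hypergeometric distribution and made-up counts; it illustrates the generic test, not Confero's own implementation.

      from scipy.stats import hypergeom

      # Over-representation analysis as a one-sided hypergeometric test.
      # Counts below are made up for illustration.
      M = 20000   # genes in the background (universe)
      n = 150     # genes in the gene set of interest
      N = 500     # differentially expressed genes in the contrast
      k = 12      # overlap between the gene set and the DE genes

      # P(overlap >= k) under random sampling without replacement.
      p_value = hypergeom.sf(k - 1, M, n, N)
      print(f"enrichment p-value: {p_value:.3g}")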

  20. Applied Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  1. Persistence and Availability of Web Services in Computational Biology

    PubMed Central

    Schultheiss, Sebastian J.; Münch, Marc-Christian; Andreeva, Gergana D.; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses, while only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository. PMID:21966383
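
    A minimal availability probe of the kind used in such a survey can be scripted in a few lines. The sketch below is an illustration only: the second URL is hypothetical, and this is not the authors' actual methodology or scoring system.

        # Illustrative URL-availability check (standard library only).
        import urllib.request

        services = ["http://bioweb.me", "http://example.org/blast"]  # example URLs only
        for url in services:
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    print(url, "reachable, HTTP status", response.status)
            except Exception as err:
                print(url, "unavailable:", err)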

  2. Recent Developments in the Application of Biologically Inspired Computation to Chemical Sensing

    NASA Astrophysics Data System (ADS)

    Marco, S.; Gutierrez-Gálvez, A.

    2009-05-01

    Biological olfaction outperforms chemical instrumentation in specificity, response time, detection limit, coding capacity, time stability, robustness, size, power consumption, and portability. This biological function provides outstanding performance due, to a large extent, to the unique architecture of the olfactory pathway, which combines a high degree of redundancy and efficient combinatorial coding with unmatched chemical information processing mechanisms. The last decade has witnessed important advances in the understanding of the computational primitives underlying the functioning of the olfactory system. In this work, the state of the art concerning biologically inspired computation for chemical sensing will be reviewed. Instead of reviewing the whole body of computational neuroscience of olfaction, we restrict this review to the application of models to the processing of real chemical sensor data.

  3. National Energy Research Scientific Computing Center 2007 Annual Report

    SciTech Connect

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  4. Computational Science Guides and Accelerates Hydrogen Research (Fact Sheet)

    SciTech Connect

    Not Available

    2010-12-01

    This fact sheet describes NREL's accomplishments in using computational science to enhance hydrogen-related research and development in areas such as storage and photobiology. Work was performed by NREL's Chemical and Materials Science Center and Biosciences Center.

  5. Research on the Use of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  6. A Research and Development Strategy for High Performance Computing.

    ERIC Educational Resources Information Center

    Office of Science and Technology Policy, Washington, DC.

    This report is the result of a systematic review of the status and directions of high performance computing and its relationship to federal research and development. Conducted by the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), the review involved a series of workshops attended by numerous computer scientists and…

  7. Developing a Research Agenda for Ubiquitous Computing in Schools

    ERIC Educational Resources Information Center

    Zucker, Andrew

    2004-01-01

    Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…

  8. Educational Outcomes and Research from 1:1 Computing Settings

    ERIC Educational Resources Information Center

    Bebell, Damian; O'Dwyer, Laura M.

    2010-01-01

    Despite the growing interest in 1:1 computing initiatives, relatively little empirical research has focused on the outcomes of these investments. The current special edition of the Journal of Technology and Assessment presents four empirical studies of K-12 1:1 computing programs and one review of key themes in the conversation about 1:1 computing…

  9. Nuclear physics detector technology applied to plant biology research

    SciTech Connect

    Weisenberger, Andrew G.; Kross, Brian J.; Lee, Seung Joo; McKisson, John E.; Xi, Wenze; Zorn, Carl J.; Howell, Calvin; Crowell, A.S.; Reid, C.D.; Smith, Mark

    2013-08-01

    The ability to detect the emissions of radioactive isotopes through radioactive decay (e.g. beta particles, x-rays and gamma-rays) has been used for over 80 years as a tracer method for studying natural phenomena. More recently a positron emitting radioisotope of carbon: {sup 11}C has been utilized as a {sup 11}CO{sub 2} tracer for plant ecophysiology research. Because of its ease of incorporation into the plant via photosynthesis, the {sup 11}CO{sub 2} radiotracer is a powerful tool for use in plant biology research. Positron emission tomography (PET) imaging has been used to study carbon transport in live plants using {sup 11}CO{sub 2}. Presently there are several groups developing and using new PET instrumentation for plant based studies. Thomas Jefferson National Accelerator Facility (Jefferson Lab) in collaboration with the Duke University Phytotron and the Triangle Universities Nuclear Laboratory (TUNL) is involved in PET detector development for plant imaging utilizing technologies developed for nuclear physics research. The latest developments of the use of a LYSO scintillator based PET detector system for {sup 11}CO{sub 2} tracer studies in plants will be briefly outlined.

  10. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  11. Optical scattering by biological aerosols: experimental and computational results on spore simulants

    NASA Astrophysics Data System (ADS)

    Sindoni, Orazio I.; Saija, Rosalba; Iatì, Maria Antonia; Borghese, Ferdinando; Denti, Paolo; Fernandes, Gustavo E.; Pan, Yong-Le; Chang, Richard K.

    2006-07-01

    We present both a computational and an experimental approach to the problem of biological aerosol characterization, combining expertise in theoretical optical scattering by complex, arbitrarily shaped particles (multipole expansion of the electromagnetic fields and the Transition Matrix) with a novel experimental technique based on two-dimensional angular optical scattering (TAOS). The good agreement between experimental and computational results, together with the possibility for laboratory single-particle angle-resolved investigation, opens a new scenario in biological particle modelling and might have major implications for rapid discrimination of airborne particles.

  12. Low-gravity Orbiting Research Laboratory Environment Potential Impact on Space Biology Research

    NASA Technical Reports Server (NTRS)

    Jules, Kenol

    2006-01-01

    One of the major objectives of any orbital space research platform is to provide a quiescent low-gravity, preferably zero-gravity, environment in which to perform fundamental as well as applied research. However, small disturbances exist onboard any low earth orbital research platform. The impact of these disturbances must be taken into account by space research scientists during their research planning, design and data analysis in order to avoid confounding factors in their science results. The reduced gravity environment of an orbiting research platform in low earth orbit is a complex phenomenon. Many factors, such as experiment operations, equipment operation, life support systems and crew activity (if it is a crewed platform), aerodynamic drag, gravity gradient and rotational effects, as well as the vehicle structural resonance frequencies (structural modes), contribute to the overall reduced gravity environment in which space research is performed. The contribution of these small disturbances or accelerations is precisely why the environment is NOT a zero gravity environment, but a reduced acceleration environment. This paper does not discuss other factors that also affect the space research environment, such as radiation, electromagnetic interference, thermal and pressure gradient changes, acoustics, and CO2 build-up; it focuses solely on the magnitude of the acceleration levels found on orbiting research laboratories used by research scientists to conduct space research. For ease of analysis, this paper divides the frequency spectrum relevant to most of the space research disciplines into three regimes: a) quasi-steady, b) vibratory and c) transient. The International Space Station is used as an example to illustrate the point. The paper discusses the impact of these three regimes on space biology research, and results from space-flown experiments are used to illustrate the potential negative impact of these disturbances (accelerations

  13. Grand Challenges for Biological and Environmental Research: A Long-Term Vision

    SciTech Connect

    Arkin, A.; Baliga, N.; Braam, J.; Church, G.; Collins, J; Cottingham, R.; Ecker, J.; Gerstein, M.; Gilna, P.; Greenberg, J.; Handelsman, J.; Hubbard, S.; Joachimiak, A.; Liao, J.; Looger, L.; Meyerowitz, E.; Mjolness, E.; Petsko, G.; Sayler, G.; Simpson, M.; Stacey, G.; Sussman, M.; Tiedje, J.; Bader, D.; Cessi, P.; Collins, W.; Denning, S.; Dickinson, R.; Easterling, D.; Edmonds, J.; Feddema, J.; Field, C.; Fridlind, A.; Fung, I.; Held, I.; Jackson, R.; Janetos, A.; Large, W.; Leinen, M.; Leung, R.; Long, S.; Mace, G.; Masiello, C.; Meehl, G.; Ort, D.; Otto-Bliesner, B.; Penner, J.; Prather, M.; Randall, D.; Rasch, P.; Schneider, E.; Shugart, H.; Thornton, P.; Washington, W.; Wildung, R.; Wiscombe, W.; Zak, D.; Zhang, M.; Bielicki, J.; Buford, M.; Cleland, E.; Dale, V.; Duke, C.; Ehleringer, J.; Hecht, A.; Kammen, D.; Marland, G.; Pataki, D.; Riley, M. Robertson, P.; Hubbard, S.

    2010-12-01

    The interactions and feedbacks among plants, animals, microbes, humans, and the environment ultimately form the world in which we live. This world is now facing challenges from a growing and increasingly affluent human population whose numbers and lifestyles are driving ever greater energy demand and impacting climate. These and other contributing factors will make energy and climate sustainability extremely difficult to achieve over the 20-year time horizon that is the focus of this report. Despite these severe challenges, there is optimism that deeper understanding of our environment will enable us to mitigate detrimental effects, while also harnessing biological and climate systems to ensure a sustainable energy future. This effort is advanced by scientific inquiries in the fields of atmospheric chemistry and physics, biology, ecology, and subsurface science - all made possible by computing. The Office of Biological and Environmental Research (BER) within the Department of Energy's (DOE) Office of Science has a long history of bringing together researchers from different disciplines to address critical national needs in determining the biological and environmental impacts of energy production and use, characterizing the interplay of climate and energy, and collaborating with other agencies and DOE programs to improve the world's most powerful climate models. BER science focuses on three distinct areas: (1) What are the roles of Earth system components (atmosphere, land, oceans, sea ice, and the biosphere) in determining climate? (2) How is the information stored in a genome translated into microbial, plant, and ecosystem processes that influence biofuel production, climate feedbacks, and the natural cycling of carbon? (3) What are the biological, geochemical, and physical forces that govern the behavior of Earth's subsurface environment? Ultimately, the goal of BER science is to support experimentation and modeling that can reliably predict the outcomes and

  14. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    PubMed

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today. PMID:17052114

  15. Microgravity research in plant biological systems: Realizing the potential of molecular biology

    NASA Technical Reports Server (NTRS)

    Lewis, Norman G.; Ryan, Clarence A.

    1993-01-01

    The sole all-pervasive feature of the environment that has helped shape, through evolution, all life on Earth is gravity. The near weightlessness of the Space Station Freedom space environment allows gravitational effects to be essentially uncoupled, thus providing an unprecedented opportunity to manipulate, systematically dissect, study, and exploit the role of gravity in the growth and development of all life forms. New and exciting opportunities are now available to utilize molecular biological and biochemical approaches to study the effects of microgravity on living organisms. By careful experimentation, we can determine how gravity perception occurs, how the resulting signals are produced and transduced, and how or if tissue-specific differences in gene expression occur. Microgravity research can provide unique new approaches to further our basic understanding of development and metabolic processes of cells and organisms, and to further the application of this new knowledge for the betterment of humankind.

  16. STRUCTURAL BIOLOGY AND MOLECULAR MEDICINE RESEARCH PROGRAM (LSBMM)

    SciTech Connect

    Eisenberg, David S.

    2008-07-15

    The UCLA-DOE Institute of Genomics and Proteomics is an organized research unit of the University of California, sponsored by the Department of Energy through the mechanism of a Cooperative Agreement. Today the Institute consists of 10 Principal Investigators and 7 Associate Members, developing and applying technologies to promote the biological and environmental missions of the Department of Energy, and 5 Core Technology Centers to sustain this work. The focus is on understanding genomes, pathways and molecular machines in organisms of interest to DOE, with special emphasis on developing enabling technologies. Since it was founded in 1947, the UCLA-DOE Institute has adapted its mission to the research needs of DOE and its progenitor agencies as these research needs have changed. The Institute started as the AEC Laboratory of Nuclear Medicine, directed by Stafford Warren, who later became the founding Dean of the UCLA School of Medicine. In this sense, the entire UCLA medical center grew out of the precursor of our Institute. In 1963, the mission of the Institute was expanded into environmental studies by Director Ray Lunt. I became the third director in 1993, and in close consultation with David Galas and John Wooley of DOE, shifted the mission of the Institute towards genomics and proteomics. Since 1993, the Principal Investigators and Core Technology Centers are entirely new, and the Institute has separated from its former division concerned with PET imaging. The UCLA-DOE Institute shares the space of Boyer Hall with the Molecular Biology Institute, and assumes responsibility for the operation of the main core facilities. Fig. 1 gives the organizational chart of the Institute. Some of the benefits to the public of research carried out at the UCLA-DOE Institute include the following: The development of publicly accessible, web-based databases, including the Database of Protein Interactions, and the ProLinks database of genomically inferred protein function linkages

  17. Collaborative Classroom Management. Video to Accompany "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." [Videotape].

    ERIC Educational Resources Information Center

    2001

    This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…

  18. Using biological control research in the classroom to promote scientific inquiry and literacy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many scientists who research biological control also teach at universities or more informally through cooperative outreach. The purpose of this paper is to review biological control activities for the classroom in four refereed journals, The American Biology Teacher, Journal of Biological Education...

  19. Cell Science and Cell Biology Research at MSFC: Summary

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The common theme of these research programs is that they investigate regulation of gene expression in cells, and ultimately gene expression is controlled by the macromolecular interactions between regulatory proteins and DNA. The NASA Critical Path Roadmap identifies Muscle Alterations and Atrophy and Radiation Effects as Very Serious Risks and Severe Risks, respectively, in long-term space flights. The specific problem addressed by Dr. Young's research ("Skeletal Muscle Atrophy and Muscle Cell Signaling") is that skeletal muscle loss in space cannot be prevented by vigorous exercise. Aerobic skeletal muscles (i.e., red muscles) undergo the most extensive atrophy during long-term space flight. Of the many different potential avenues for preventing muscle atrophy, Dr. Young has chosen to study the beta-adrenergic receptor (betaAR) pathway. The reason for this choice is that a family of compounds called betaAR agonists will preferentially cause an increase in muscle mass of aerobic muscles (i.e., red muscle) in animals, potentially providing a specific pharmacological solution to muscle loss in microgravity. In addition, muscle atrophy is a widespread medical problem in neuromuscular diseases, spinal cord injury, lack of exercise, aging, and any disease requiring prolonged bedridden status. Skeletal muscle cells in cell culture are utilized as a model system to study this problem. Dr. Richmond's research ("Radiation & Cancer Biology of Mammary Cells in Culture") is directed toward developing a laboratory model for use in risk assessment of cancer caused by space radiation. This research is unique because a human model will be developed utilizing human mammary cells that are highly susceptible to tumor development. This approach is preferable to using animal cells because of problems in comparing radiation-induced cancers between humans and animals.

  20. Institute for Scientific Computing Research Annual Report: Fiscal Year 2004

    SciTech Connect

    Keyes, D E

    2005-02-07

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and

  1. Research on Objectives for High-School Biology

    ERIC Educational Resources Information Center

    Korgan, John J., Jr.; Wilson, John T.

    1973-01-01

    Describes procedures to develop instructional objectives for high school biology. Two kinds of objectives are identified as pre-objectives and performance objectives. Models to classify these in branches of biology and to ensure quality control are provided. (PS)

  2. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  3. Teaching Molecular Biological Techniques in a Research Content

    ERIC Educational Resources Information Center

    Stiller, John W.; Coggins, T. Chad

    2006-01-01

    Molecular biological methods, such as the polymerase chain reaction (PCR) and gel electrophoresis, are now commonly taught to students in introductory biology courses at the college and even high school levels. This often includes hands-on experience with one or more molecular techniques as part of a general biology laboratory. To assure that most…

  4. Computational Fluid Dynamics Program at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1989-01-01

    The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.

  5. Training Research Utilizing Man-Computer Interactions, Promise and Reality.

    ERIC Educational Resources Information Center

    McClelland, William A.

    The paper was presented as part of the Avionics Panel program on natural and artificial logic processors, sponsored by the Advisory Group for Aeronautical Research and Development, NATO. Several conceptual propositions in regard to man and the computer are offered. The nature of training research is examined. There is also a brief categorization…

  6. The Importance of Computer Programming Skills to Educational Researchers.

    ERIC Educational Resources Information Center

    Lawson, Stephen

    The use of the modern computer has revolutionized the field of educational research. Software packages are currently available that allow almost anyone to analyze data efficiently and rapidly. Yet, caution must temper the widespread acceptance and use of these programs. It is recommended that the researcher not rely solely on the use of "canned"…

  7. Computer Applications in Health Care. NCHSR Research Report Series.

    ERIC Educational Resources Information Center

    Medical Information Systems Cluster, Rockville, MD.

    This NCHSR research program in the application of computers in health care--conducted over the ten-year span 1968-1978--identified two areas of application research: an inpatient care support system and an outpatient care support system. Both of these systems were conceived as conceptual frameworks for a related network of projects and ideas that…

  8. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  9. Using Biological-Control Research in the Classroom to Promote Scientific Inquiry & Literacy

    ERIC Educational Resources Information Center

    Richardson, Matthew L.; Richardson, Scott L.; Hall, David G.

    2012-01-01

    Scientists researching biological control should engage in education because translating research programs into classroom activities is a pathway to increase scientific literacy among students. Classroom activities focused on biological control target all levels of biological organization and can be cross-disciplinary by drawing from subject areas…

  10. Research in mathematics and computer science at Argonne

    SciTech Connect

    Pieper, G.W.

    1989-08-01

    This report reviews the research activities in the Mathematics and Computer Science Division at Argonne National Laboratory for the period January 1988 - August 1989. The body of the report gives a brief look at the MCS staff and the research facilities, and discusses various projects carried out in two major areas of research: analytical and numerical methods and advanced computing concepts. Projects funded by non-DOE sources are also discussed, and new technology transfer activities are described. Further information on division staff, visitors, workshops, and seminars is found in the appendices.

  11. Computer Labs as Techno-Pedagogical Tools for Learning Biology--Exploring ICT Practices in India

    ERIC Educational Resources Information Center

    Nayark, Ajitha K.; Barker, Miles

    2014-01-01

    In Indian secondary schools, as in many countries, Information and Communication Technologies, ICT, are changing the image of learning places, the roles of teachers and students, and often the entire classroom learning ambience. This study investigates current practices for learning biology in school computer labs in India in the light of the…

  12. Supporting Representational Competence in High School Biology with Computer-Based Biomolecular Visualizations

    ERIC Educational Resources Information Center

    Wilder, Anna; Brinkerhoff, Jonathan

    2007-01-01

    This study assessed the effectiveness of computer-based biomolecular visualization activities on the development of high school biology students' representational competence as a means of understanding and visualizing protein structure/function relationships. Also assessed were students' attitudes toward these activities. Sixty-nine students…

  13. Effects of Constructivist and Computer-Facilitated Strategies on Achievement in Heterogeneous Secondary Biology.

    ERIC Educational Resources Information Center

    Duffy, Maryellen; Barowy, William

    This paper describes the effects of the implementation of constructivist techniques with interactive computer simulations on conceptual understanding of plant nutrition and critical thinking skills in heterogeneously grouped secondary biology classrooms. The study focused on three strategies for teaching plant nutrition: (1) traditional; (2)…

  14. Effects of Computer-Assisted Instruction on Performance of Senior High School Biology Students in Ghana

    ERIC Educational Resources Information Center

    Owusu, K. A.; Monney, K. A.; Appiah, J. Y.; Wilmot, E. M.

    2010-01-01

    This study investigated the comparative efficiency of computer-assisted instruction (CAI) and the conventional teaching method in biology on senior high school students. A science class was selected in each of two randomly selected schools. The pretest-posttest non-equivalent quasi-experimental design was used. The students in the experimental group…

  15. Using a Computer Simulation To Teach Science Process Skills to College Biology and Elementary Education Majors.

    ERIC Educational Resources Information Center

    Lee, Aimee T.; Hairston, Rosalina V.; Thames, Rachel; Lawrence, Tonya; Herron, Sherry S.

    2002-01-01

    Describes the Lateblight computer simulation implemented in the general biology laboratory and science methods course for elementary teachers to reinforce the processes of science and allow students to engage, explore, explain, elaborate, and evaluate the methods of building concepts in science. (Author/KHR)

  16. Applications of the Aurora parallel Prolog system to computational molecular biology

    SciTech Connect

    Lusk, E.L.; Overbeek, R.; Mudambi, S.; Szeredi, P.

    1993-09-01

    We describe an investigation into the use of the Aurora parallel Prolog system in two applications within the area of computational molecular biology. The computational requirements were large, due to the nature of the applications, and the computations were carried out on a scalable parallel computer, the BBN "Butterfly" TC-2000. Results include both a demonstration that logic programming can be effective in the context of demanding applications on large-scale parallel machines, and some insights into parallel programming in Prolog.

  17. Harnessing Polypharmacology with Computer-Aided Drug Design and Systems Biology.

    PubMed

    Wathieu, Henri; Issa, Naiem T; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2016-01-01

    The ascent of polypharmacology in drug development has many implications for disease therapy, most notably in the efforts of drug discovery, drug repositioning, precision medicine and combination therapy. The single- target approach to drug development has encountered difficulties in predicting drugs that are both clinically efficacious and avoid toxicity. By contrast, polypharmacology offers the possibility of a controlled distribution of effects on a biological system. This review addresses possibilities and bottlenecks in the efficient computational application of polypharmacology. The two major areas we address are the discovery and prediction of multiple protein targets using the tools of computer-aided drug design, and the use of these protein targets in predicting therapeutic potential in the context of biological networks. The successful application of polypharmacology to systems biology and pharmacology has the potential to markedly accelerate the pace of development of novel therapies for multiple diseases, and has implications for the intellectual property landscape, likely requiring targeted changes in patent law. PMID:26907947

  18. The ACI-REF Program: Empowering Prospective Computational Researchers

    NASA Astrophysics Data System (ADS)

    Cuma, M.; Cardoen, W.; Collier, G.; Freeman, R. M., Jr.; Kitzmiller, A.; Michael, L.; Nomura, K. I.; Orendt, A.; Tanner, L.

    2014-12-01

    The ACI-REF program, Advanced Cyberinfrastructure - Research and Education Facilitation, represents a consortium of academic institutions seeking to further advance the capabilities of their respective campus research communities through an extension of the personal connections and educational activities that underlie the unique and often specialized cyberinfrastructure at each institution. This consortium currently includes Clemson University, Harvard University, University of Hawai'i, University of Southern California, University of Utah, and University of Wisconsin. Working together in a coordinated effort, the consortium is dedicated to the adoption of models and strategies which leverage the expertise and experience of its members with a goal of maximizing the impact of each institution's investment in research computing. The ACI-REFs (facilitators) are tasked with making connections and building bridges between the local campus researchers and the many different providers of campus, commercial, and national computing resources. Through these bridges, ACI-REFs assist researchers from all disciplines in understanding their computing and data needs and in mapping these needs to existing capabilities or providing assistance with development of these capabilities. From the Earth sciences perspective, we will give examples of how this assistance improved methods and workflows in geophysics, geography and atmospheric sciences. We anticipate that this effort will expand the number of researchers who become self-sufficient users of advanced computing resources, allowing them to focus on making research discoveries in a more timely and efficient manner.

  19. United States Department of Agriculture-Agricultural Research Service research on biological control of arthropods.

    PubMed

    Hopper, Keith R

    2003-01-01

    During 1999-2001, ARS scientists published over 100 papers on more than 30 species of insect pest and 60 species of predator and parasitoid. These papers address issues crucial to the three strategies of biological control: conservation, augmentation and introduction. Conservation biological control includes both conserving extant populations of natural enemies by using relatively non-toxic pesticides and increasing the abundance of natural enemies in crops by providing or improving refuges for population growth and dispersal into crops. ARS scientists have been very active in determining the effects of pesticides on beneficial arthropods and in studying movement of natural enemies from refuges into crops. Augmentation involves repeated releases of natural enemies in the field, which can be inoculative or inundative. Inoculative releases are used to initiate self-propagating populations at times or in places where they would be slow to colonize. ARS scientists have studied augmentative biological control of a variety of pest insects. The targets are mostly pests in annual crops or other ephemeral habitats, where self-reproducing populations of natural enemies are not sufficiently abundant early enough to keep pest populations in check. ARS research in augmentative biological control centers on methods for rearing large numbers of healthy, effective natural enemies and for releasing them where and when they are needed at a cost less than the value of the reduction in damage to the crop. ARS scientists have researched various aspects of introductions of exotic biological control agents against a diversity of pest insects. The major issues in biological control introductions are accurate identification and adequate systematics of both natural enemies and target pests, exploration for natural enemies, predicting the success of candidates for introduction and the likelihood of non-target impacts, quarantine and rearing methods, and post-introduction evaluation of

  20. From Lévy to Brownian: A Computational Model Based on Biological Fluctuation

    PubMed Central

    Nurzaman, Surya G.; Matsumoto, Yoshio; Nakamura, Yutaka; Shirai, Kazumichi; Koizumi, Satoshi; Ishiguro, Hiroshi

    2011-01-01

    Background Theoretical studies predict that Lévy walks maximize the chance of encountering randomly distributed targets at low density, whereas Brownian walks are favorable inside a patch of high-density targets. Recently reported experimental data indicate that some animals indeed show Lévy and Brownian walk movement patterns when foraging for food in areas of low and high density, respectively. This paper presents a simple, Gaussian-noise-utilizing computational model that can realize such behavior. Methodology/Principal Findings We extend the Lévy walk model of one of the simplest creatures, Escherichia coli, based on the biological fluctuation framework. We build a simulation of a simple, generic animal to observe whether Lévy or Brownian walks are performed appropriately depending on the target density, and investigate the emergent behavior in a commonly faced patchy environment where the density alternates. Conclusions/Significance Based on the model, the animal behavior of choosing Lévy or Brownian walk movement patterns according to the target density can be generated, without changing the essence of the stochastic property of the Escherichia coli physiological mechanism as explained by related research. The emergent behavior and its benefits in a patchy environment are also discussed. The model provides a framework for further investigation of the role of internal noise in realizing adaptive and efficient foraging behavior. PMID:21304911
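
    The contrast between the two movement regimes can be reproduced with a toy random walk in which only the step-length distribution changes. The sketch below is a generic illustration in Python/NumPy, not the authors' fluctuation-based model; the power-law exponent, noise parameters, and step counts are arbitrary choices.

        # Toy comparison of Brownian-like and Lévy-like 2-D random walks.
        import numpy as np

        rng = np.random.default_rng(0)

        def walk(n_steps, levy=False, mu=2.0):
            if levy:
                # heavy-tailed (power-law) step lengths, shifted so the minimum step is 1
                steps = rng.pareto(mu - 1.0, n_steps) + 1.0
            else:
                # Brownian-like: step lengths narrowly distributed around 1
                steps = np.abs(rng.normal(1.0, 0.1, n_steps))
            angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
            return np.cumsum(np.column_stack((steps * np.cos(angles),
                                              steps * np.sin(angles))), axis=0)

        brownian_path = walk(1000, levy=False)
        levy_path = walk(1000, levy=True)
        print("net displacement (Brownian):", np.linalg.norm(brownian_path[-1]))
        print("net displacement (Levy):    ", np.linalg.norm(levy_path[-1]))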

  1. Managing biological networks by using text mining and computer-aided curation

    NASA Astrophysics Data System (ADS)

    Yu, Seok Jong; Cho, Yongseong; Lee, Min-Ho; Lim, Jongtae; Yoo, Jaesoo

    2015-11-01

    In order to understand a biological mechanism in a cell, a researcher must collect a huge number of protein interactions, drawing on experimental data from experiments and the literature. Text mining systems that extract biological interactions from papers have been used to construct biological networks for a few decades. Even though text mining of the literature is necessary to construct a biological network, few systems with a text mining tool are available for biologists who want to construct their own biological networks. We have developed a biological network construction system called BioKnowledge Viewer that can generate a biological interaction network by using a text mining tool and biological taggers. It also includes Boolean simulation software that provides a biological modeling system for simulating the models built with the text mining tool. A user can download PubMed articles and construct a biological network by using the Multi-level Knowledge Emergence Model (KMEM), MetaMap, and A Biomedical Named Entity Recognizer (ABNER) as text mining tools. To evaluate the system, we constructed an aging-related biological network that consists of 9,415 nodes (genes) by using manual curation. Through network analysis, we found that several genes, including JNK, AP-1, and BCL-2, were highly related in the aging biological network. We provide a semi-automatic curation environment so that users can obtain a graph database for managing text mining results generated in the server system and can navigate the network with BioKnowledge Viewer, which is freely available at http://bioknowledgeviewer.kisti.re.kr.
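
    To indicate what a Boolean simulation layer of this kind computes, the sketch below runs a synchronous update on a tiny three-gene network. The gene names are those mentioned above, but the regulatory rules and initial state are invented for illustration and are not taken from the curated aging network.

        # Minimal synchronous Boolean-network update (invented example rules).
        def step(state):
            # state maps gene name -> 0/1
            return {
                "JNK":   state["AP-1"],          # assumed activation of JNK by AP-1
                "AP-1":  state["JNK"],           # assumed mutual activation
                "BCL-2": 1 - state["JNK"],       # assumed inhibition of BCL-2 by JNK
            }

        state = {"JNK": 1, "AP-1": 0, "BCL-2": 1}
        for t in range(5):
            print(t, state)
            state = step(state)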

  2. Biologically relevant molecular transducer with increased computing power and iterative abilities.

    PubMed

    Ratner, Tamar; Piran, Ron; Jonoska, Natasha; Keinan, Ehud

    2013-05-23

    As computing devices, which process data and interconvert information, transducers can encode new information and use their output for subsequent computing, offering high computational power that may be equivalent to a universal Turing machine. We report on an experimental DNA-based molecular transducer that computes iteratively and produces biologically relevant outputs. As a proof of concept, the transducer accomplished division of numbers by 3. The iterative power was demonstrated by a recursive application on an obtained output. This device reads plasmids as input and processes the information according to a predetermined algorithm, which is represented by molecular software. The device writes new information on the plasmid using hardware that comprises DNA-manipulating enzymes. The computation produces dual output: a quotient, represented by newly encoded DNA, and a remainder, represented by E. coli phenotypes. This device algorithmically manipulates genetic codes. PMID:23706637
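
    Division by 3 has a classical finite-state-transducer formulation: read the dividend's bits most-significant-bit first, keep the running remainder as the machine state, and emit one quotient bit per input bit. The sketch below illustrates that automaton in software; it is a conceptual analogue only, not a model of the DNA/enzyme hardware described in the paper.

        # Finite-state transducer that divides a binary number by 3.
        def divide_by_three(bits):
            remainder = 0          # machine state: current remainder (0, 1, or 2)
            quotient = []
            for b in bits:         # bits is a list of 0/1, most significant bit first
                value = 2 * remainder + b
                quotient.append(value // 3)   # emitted output bit
                remainder = value % 3         # next state
            return quotient, remainder

        q, r = divide_by_three([1, 0, 1, 1])   # 1011 binary = 11 decimal
        print(q, r)                            # [0, 0, 1, 1] (= 3), remainder 2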

  3. Institute for Scientific Computing Research Fiscal Year 2002 Annual Report

    SciTech Connect

    Keyes, D E; McGraw, J R; Bodtker, L K

    2003-03-11

    The Institute for Scientific Computing Research (ISCR) at Lawrence Livermore National Laboratory is jointly administered by the Computing Applications and Research Department (CAR) and the University Relations Program (URP), and this joint relationship expresses its mission. An extensively externally networked ISCR cost-effectively expands the level and scope of national computational science expertise available to the Laboratory through CAR. The URP, with its infrastructure for managing six institutes and numerous educational programs at LLNL, assumes much of the logistical burden that is unavoidable in bridging the Laboratory's internal computational research environment with that of the academic community. As large-scale simulations on the parallel platforms of DOE's Advanced Simulation and Computing (ASCI) become increasingly important to the overall mission of LLNL, the role of the ISCR expands in importance accordingly. Relying primarily on non-permanent staffing, the ISCR complements Laboratory research in areas of the computer and information sciences that are needed at the frontier of Laboratory missions. The ISCR strives to be the "eyes and ears" of the Laboratory in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be its "feet and hands," carrying those advances into the Laboratory and incorporating them into practice. In addition to conducting research, the ISCR provides continuing education opportunities to Laboratory personnel, in the form of on-site workshops taught by experts on novel software or hardware technologies. The ISCR also seeks to influence the research community external to the Laboratory to pursue Laboratory-related interests and to train the workforce that will be required by the Laboratory. Part of the performance of this function is interpreting to the external community appropriate (unclassified) aspects of the Laboratory's own contributions

  4. Community-driven development for computational biology at Sprints, Hackathons and Codefests

    PubMed Central

    2014-01-01

    Background Computational biology comprises a wide range of technologies and approaches. Multiple technologies can be combined to create more powerful workflows if the individuals contributing the data or providing tools for its interpretation can find mutual understanding and consensus. Much conversation and joint investigation are required in order to identify and implement the best approaches. Traditionally, scientific conferences feature talks presenting novel technologies or insights, followed up by informal discussions during coffee breaks. In multi-institution collaborations, in order to reach agreement on implementation details or to transfer deeper insights in a technology and practical skills, a representative of one group typically visits the other. However, this does not scale well when the number of technologies or research groups is large. Conferences have responded to this issue by introducing Birds-of-a-Feather (BoF) sessions, which offer an opportunity for individuals with common interests to intensify their interaction. However, parallel BoF sessions often make it hard for participants to join multiple BoFs and find common ground between the different technologies, and BoFs are generally too short to allow time for participants to program together. Results This report summarises our experience with computational biology Codefests, Hackathons and Sprints, which are interactive developer meetings. They are structured to reduce the limitations of traditional scientific meetings described above by strengthening the interaction among peers and letting the participants determine the schedule and topics. These meetings are commonly run as loosely scheduled "unconferences" (self-organized identification of participants and topics for meetings) over at least two days, with early introductory talks to welcome and organize contributors, followed by intensive collaborative coding sessions. We summarise some prominent achievements of those meetings and describe

  5. BRIC-100VC Biological Research in Canisters (BRIC)-100VC

    NASA Technical Reports Server (NTRS)

    Richards, Stephanie E.; Levine, Howard G. (Compiler); Romero, Vergel

    2016-01-01

    The Biological Research in Canisters (BRIC) is an anodized-aluminum cylinder used to provide passive stowage for investigations of the effects of space flight on small specimens. The BRIC 100 mm petri dish vacuum containment unit (BRIC-100VC) has supported Dugesia japonica (flatworm) in spring water under normal atmospheric conditions for 29 days in space and Hemerocallis lilioasphodelus L. (daylily) somatic embryo development within a 5% CO2 gaseous environment for 4.5 months in space. BRIC-100VC is a completely sealed, anodized-aluminum cylinder (Fig. 1) providing containment and structural support of the experimental specimens. The top and bottom lids of the canister include rapid disconnect valves for filling the canister with selected gases. These specialized valves allow for specific atmospheric containment within the canister, providing a gaseous environment defined by the investigator. Additionally, the top lid has been designed with a toggle latch and O-ring assembly allowing for prompt sealing and removal of the lid. The outside dimensions of the BRIC-100VC canisters are 16.0 cm (height) x 11.4 cm (outside diameter). The lower portion of the canister has been equipped with sufficient storage space for passive temperature and relative humidity data loggers. The BRIC-100VC canister has been optimized to accommodate standard 100 mm laboratory petri dishes or 50 mL conical tubes. Depending on storage orientation, up to 6 or 9 canisters have been flown within an International Space Station (ISS) stowage locker.

  6. Gross's anatomy: textual politics in science/biology education research

    NASA Astrophysics Data System (ADS)

    Reis, Giuliano

    2009-12-01

    In approaching how the grotesque is—or should be—situated within contemporary science (biology) education practices, Weinstein and Broda undertake a passionate reclaim of an education that is at the same time scientific, critical, and liberatory. However legitimate, their work offers more than they probably could have anticipated: It exemplifies how the textual structure of a research article can be such as to "tip-off" readers about how it is supposed to be understood. In this way, what one learns from reading the manuscript is grounded on the way the authors examine the data presented. That is, the findings are not intrinsic to the materials collected, but constructed within the analyses that precede/follow the account of each one of the four "specimens" reported. Therefore, the present commentary seeks to re-consider the original study from an alternative perspective, one that challenges its seemingly objective (re)construction of facts by placing emphasis on how the text contains instructions for its own interpretation and validation. Ultimately, the purpose here is to describe and discuss the interpretive and validation work that is done by this discursive mechanism of self-appraisal rather than discredit the two authors' initiative.

  7. A Computer Data Base for Clinicians, Managers and Researchers

    PubMed Central

    Gottfredson, Douglas K.

    1981-01-01

    Since 1972 the Salt Lake VA Medical Center has designed, developed and upgraded a computer system to improve the quality of health care for veterans. The computer system has greatly increased the ease and accuracy with which information is gathered, stored, retrieved and analysed. Though it has not been possible to anticipate every question which might be asked, we have attempted to recognize the special interests of various groups and individuals and to tailor the computer data base to meet their needs. The SL VAMC computer system facilitates meeting accountability requirements established by different agencies to assure quality of care. Computer techniques provide clinicians with information for assessment, planning, providing treatment, following progress and establishing discharge and after-care plans. Managers are provided information vital for decisions and to complete required reports. Researchers can readily study the effectiveness of assessment, diagnoses and treatment and recommend program improvements.

  8. BrisSynBio: a BBSRC/EPSRC-funded Synthetic Biology Research Centre

    PubMed Central

    Sedgley, Kathleen R.; Race, Paul R.; Woolfson, Derek N.

    2016-01-01

    BrisSynBio is the Bristol-based Biotechnology and Biological Sciences Research Council (BBSRC)/Engineering and Physical Sciences Research Council (EPSRC)-funded Synthetic Biology Research Centre. It is one of six such Centres in the U.K. BrisSynBio's emphasis is on rational and predictive biomolecular modelling, design and engineering in the context of synthetic biology. It trains the next generation of synthetic biologists in these approaches, to facilitate translation of fundamental synthetic biology research to industry and the clinic, and to do this within an innovative and responsible research framework. PMID:27284028

  9. Ethical Guidelines for Computer Security Researchers: "Be Reasonable"

    NASA Astrophysics Data System (ADS)

    Sassaman, Len

    For most of its existence, the field of computer science has been lucky enough to avoid ethical dilemmas by virtue of its relatively benign nature. The subdisciplines of programming methodology research, microprocessor design, and so forth have little room for the greater questions of human harm. Other, more recently developed sub-disciplines, such as data mining, social network analysis, behavioral profiling, and general computer security, however, open the door to abuse of users by practitioners and researchers. It is therefore the duty of the men and women who chart the course of these fields to set rules for themselves regarding what sorts of actions on their part are to be considered acceptable and what should be avoided or handled with caution out of ethical concerns. This paper deals solely with the issues faced by computer security researchers, be they vulnerability analysts, privacy system designers, malware experts, or reverse engineers.

  10. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically with the push to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  11. Computational Science Research in Support of Petascale Electromagnetic Modeling

    SciTech Connect

    Lee, L.-Q.; Akcelik, V; Ge, L; Chen, S; Schussman, G; Candel, A; Li, Z; Xiao, L; Kabel, A; Uplenchwar, R; Ng, C; Ko, K; /SLAC

    2008-06-20

    Computational science research components were vital parts of the SciDAC-1 accelerator project and are continuing to play a critical role in newly-funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in the area of computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented, which include shape determination of superconducting RF cavities, mesh-based multilevel preconditioner in solving highly-indefinite linear systems, moving window using h- or p- refinement for time-domain short-range wakefield calculations, and improved scalable application I/O.

  12. Research on Bacteria in the Mainstream of Biology.

    ERIC Educational Resources Information Center

    Magasanik, Boris

    1988-01-01

    Stresses the importance of investigating bacterial mechanisms to discover clues for a greater understanding of cells. Cites examples of study areas of biological significance which may reveal information about the evolution of prokaryotes and eukaryotes and lead to a comprehensive theory of cell biology. (RT)

  13. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  14. 9 CFR 112.9 - Biological products imported for research and evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... research and evaluation. 112.9 Section 112.9 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION... PACKAGING AND LABELING § 112.9 Biological products imported for research and evaluation. A biological product imported for research and evaluation under a permit issued in accordance with § 104.4, with...

  15. 9 CFR 112.9 - Biological products imported for research and evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... research and evaluation. 112.9 Section 112.9 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION... PACKAGING AND LABELING § 112.9 Biological products imported for research and evaluation. A biological product imported for research and evaluation under a permit issued in accordance with § 104.4, with...

  16. Preparing the "New" Biologist of the Future: Student Research at the Interface of Mathematics and Biology

    ERIC Educational Resources Information Center

    Duncan, Sarah I.; Bishop, Pamela; Lenhart, Suzanne

    2010-01-01

    We describe a unique Research Experience for Undergraduates and Research Experience for Veterinary students summer program at the National Institute for Mathematical and Biological Synthesis on the campus of the University of Tennessee, Knoxville. The program focused on interdisciplinary research at the interface of biology and mathematics.…

  17. Management of Biological Materials in Wastewater from Research & Development Facilities

    SciTech Connect

    Raney, Elizabeth A.; Moon, Thomas W.; Ballinger, Marcel Y.

    2011-04-01

    PNNL has developed and instituted a systematic approach to managing work with biological material that begins in the project planning phase and carries through implementation to waste disposal. This paper describes two major processes used at PNNL to analyze and mitigate the hazards associated with working with biological materials and evaluate them for disposal to the sewer, ground, or surface water in a manner that protects human health and the environment. The first of these processes is the Biological Work Permit which is used to identify requirements for handling, storing, and working with biological materials and the second is the Sewer Approval process which is used to evaluate discharges of wastewaters containing biological materials to assure they meet industrial wastewater permits and other environmental regulations and requirements.

  18. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    NASA Astrophysics Data System (ADS)

    Riviere, Jim E.; Scoglio, Caterina; Sahneh, Faryad D.; Monteiro-Riviere, Nancy A.

    2013-01-01

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data and a lack of computational approaches to integrating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans since allometric principles may not be applicable. The dynamics of corona formation, which modulates biological interactions including cellular uptake and biodistribution, is thereby a crucial process involved in the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. These types of integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished.
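
    As a toy illustration of the kind of compartmental pharmacokinetic model the authors argue must account for corona formation, the sketch below couples a plasma and a tissue compartment and scales the uptake rate by an assumed corona-coverage factor. The model structure, rate constants, and the corona term are illustrative assumptions, not a model from the paper.

```python
# Toy two-compartment pharmacokinetic sketch with an assumed biocorona effect:
# the tissue-uptake rate is reduced as corona coverage increases.  All
# equations and parameters are illustrative placeholders.

def simulate(corona_coverage, t_end=24.0, dt=0.01,
             k_uptake=0.4, k_return=0.05, k_clear=0.1):
    plasma, tissue = 1.0, 0.0                        # normalized dose starts in plasma
    k_up = k_uptake * (1.0 - 0.8 * corona_coverage)  # assumed corona-modulated uptake
    for _ in range(int(t_end / dt)):
        d_plasma = -k_up * plasma + k_return * tissue - k_clear * plasma
        d_tissue = k_up * plasma - k_return * tissue
        plasma += dt * d_plasma
        tissue += dt * d_tissue
    return plasma, tissue

if __name__ == "__main__":
    for coverage in (0.0, 0.5, 1.0):
        p, t = simulate(coverage)
        print(f"corona coverage {coverage:.1f}: plasma {p:.3f}, tissue {t:.3f} at 24 h")
```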

  19. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  20. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    SciTech Connect

    Schuller, Ivan K.; Stevens, Rick; Pino, Robinson; Pechan, Michael

    2015-10-29

    Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS-based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the following issues were considered: (1) the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; (2) new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; (3) device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and (4) comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  1. Seeing Is Believing: Quantifying Is Convincing: Computational Image Analysis in Biology.

    PubMed

    Sbalzarini, Ivo F

    2016-01-01

    Imaging is center stage in biology. Advances in microscopy and labeling techniques have enabled unprecedented observations and continue to inspire new developments. Efficient and accurate quantification and computational analysis of the acquired images, however, are becoming the bottleneck. We review different paradigms of computational image analysis for intracellular, single-cell, and tissue-level imaging, providing pointers to the specialized literature and listing available software tools. We place particular emphasis on clear categorization of image-analysis frameworks and on identifying current trends and challenges in the field. We further outline some of the methodological advances that are required in order to use images as quantitative scientific measurements. PMID:27207361
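
    As a minimal, hedged illustration of turning images into quantitative measurements, the sketch below thresholds a synthetic image and counts connected objects with scipy.ndimage; the test image and the global threshold are made up, and real pipelines would use the segmentation frameworks surveyed in the review.

```python
import numpy as np
from scipy import ndimage

# Synthetic image: two bright "cells" on a noisy background (illustrative only).
rng = np.random.default_rng(1)
image = rng.normal(0.0, 0.1, (64, 64))
image[10:20, 10:20] += 1.0
image[40:50, 30:42] += 1.0

mask = image > 0.5                              # simple global threshold
labels, n_objects = ndimage.label(mask)         # connected-component labelling
areas = ndimage.sum(mask, labels, index=np.arange(1, n_objects + 1))

print(f"objects detected: {n_objects}, areas (pixels): {areas.astype(int)}")
```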

  2. A biologically consistent hierarchical framework for self-referencing survivalist computation

    NASA Astrophysics Data System (ADS)

    Cottam, Ron; Ranson, Willy; Vounckx, Roger

    2000-05-01

    Extensively scaled formally rational hardware and software are indirectly fallible, at the very least through temporal restrictions on the evaluation of their correctness. In addition, the apparent inability of formal rationality to successfully describe living systems as anything other than inanimate structures suggests that the development of self-referencing computational machines will require a different approach. There is currently a strong movement towards the adoption of semiotics as a descriptive medium in theoretical biology. We present a related computational semiosic construction (1, 2) consistent with evolutionary hierarchical emergence (3), which may serve as a framework for implementing anticipatory-oriented survivalist processing in real environments.

  3. PNNL's Data Intensive Computing research battles Homeland Security threats

    SciTech Connect

    David Thurman; Joe Kielman; Katherine Wolf; David Atkinson

    2009-12-01

    The Pacific Northwest National Laboratory's (PNNL) approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architecture, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  4. Three Decades of Research on Computer Applications in Health Care

    PubMed Central

    Michael Fitzmaurice, J.; Adams, Karen; Eisenberg, John M.

    2002-01-01

    The Agency for Healthcare Research and Quality and its predecessor organizations—collectively referred to here as AHRQ—have a productive history of funding research and development in the field of medical informatics, with grant investments since 1968 totaling $107 million. Many computerized interventions that are commonplace today, such as drug interaction alerts, had their genesis in early AHRQ initiatives. This review provides a historical perspective on AHRQ investment in medical informatics research. It shows that grants provided by AHRQ resulted in achievements that include advancing automation in the clinical laboratory and radiology, assisting in technology development (computer languages, software, and hardware), evaluating the effectiveness of computer-based medical information systems, facilitating the evolution of computer-aided decision making, promoting computer-initiated quality assurance programs, backing the formation and application of comprehensive data banks, enhancing the management of specific conditions such as HIV infection, and supporting health data coding and standards initiatives. Other federal agencies and private organizations have also supported research in medical informatics, some earlier and to a greater degree than AHRQ. The results and relative roles of these related efforts are beyond the scope of this review. PMID:11861630

  5. Results of a Research Evaluating Quality of Computer Science Education

    ERIC Educational Resources Information Center

    Záhorec, Ján; Hašková, Alena; Munk, Michal

    2012-01-01

    The paper presents the results of an international research on a comparative assessment of the current status of computer science education at the secondary level (ISCED 3A) in Slovakia, the Czech Republic, and Belgium. Evaluation was carried out based on 14 specific factors gauging the students' point of view. The authors present qualitative…

  6. Expanding HPC and Research Computing--The Sustainable Way

    ERIC Educational Resources Information Center

    Grush, Mary

    2009-01-01

    Increased demands for research and high-performance computing (HPC)--along with growing expectations for cost and environmental savings--are putting new strains on the campus data center. More and more, CIOs like the University of Notre Dame's (Indiana) Gordon Wishon are seeking creative ways to build more sustainable models for data center and…

  7. Computer-Based Instruction in Qualitative Research Practices.

    ERIC Educational Resources Information Center

    Busby, J. S.; Payne, K.

    1998-01-01

    Discusses problems in qualitative-research-practice instruction and describes a computer-based instructional system based on linking domain problems to particular pedagogic mechanisms, and then linking these mechanisms to various implementation decisions. Topics include skill transfer and relational-database management systems. (Author/LRW)

  8. A FRAMEWORK FOR A COMPUTATIONAL TOXICOLOGY RESEARCH PROGRAM IN ORD

    EPA Science Inventory

    "A Framework for a Computational Toxicology Research Program in ORD" was drafted by a Technical Writing Team having representatives from all of ORD's Laboratories and Centers. The document describes a framework for the development of an program within ORD to utilize approaches d...

  9. PNNL's Data Intensive Computing research battles Homeland Security threats

    ScienceCinema

    David Thurman; Joe Kielman; Katherine Wolf; David Atkinson

    2012-12-31

    The Pacific Northwest National Laboratory's (PNNL) approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architecture, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  10. Computer-assisted design in perceptual-motor skills research

    NASA Technical Reports Server (NTRS)

    Rogers, C. A., Jr.

    1974-01-01

    A categorization was made of independent variables previously found to be potent in simple perceptual-motor tasks. A computer was then used to generate hypothetical factorial designs. These were evaluated in terms of literature trends and pragmatic criteria. Potential side-effects of machine-assisted research strategy were discussed.

  11. Using Research to Develop Computational Fluency in Young Mathematicians

    ERIC Educational Resources Information Center

    O'Loughlin, Tricia Ann

    2007-01-01

    This article describes one teacher's journey to improve her teaching of mathematics by conducting classroom-based inquiry to meet the specific mathematical needs in her second grade classroom. Her process to develop and improve computational fluency through research-based methods is detailed. (Contains 9 figures.)

  12. Meeting report from the first meetings of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Le Novère, Nicolas; Hucka, Michael; Anwar, Nadia; Bader, Gary D; Demir, Emek; Moodie, Stuart; Sorokin, Anatoly

    2011-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of the various community standards and formats in computational systems biology and related fields. This report summarizes the activities pursued at the first annual COMBINE meeting, held in Edinburgh on October 6-9, 2010, and the first HARMONY hackathon, held in New York on April 18-22, 2011. The first of those meetings hosted 81 attendees. Discussions covered the official COMBINE standards (BioPAX, SBGN and SBML) as well as emerging efforts and interoperability between different formats. The second meeting, oriented towards software developers, welcomed 59 participants and witnessed many technical discussions, development of improved standards support in community software systems, and conversion between the standards. Both meetings were resounding successes and showed that the field is now mature enough to develop representation formats and related standards in a coordinated manner. PMID:22180826

  13. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    PubMed Central

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components. The services of bioinformatics software are published to remote users. Finally, the basic prototype system of the biological cloud is achieved. PMID:24078906
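
    The sketch below illustrates, in a generic way, the job-submission and status-query functions mentioned in the abstract by wrapping a trivial sequence analysis in a small Flask web service with an in-memory job table. Flask, the route names, and the inline GC-content computation are illustrative assumptions, not the GRAM/Globus-based mechanism the paper describes.

```python
from uuid import uuid4
from flask import Flask, jsonify, request

# Toy service: POST a sequence to /jobs, then GET /jobs/<id> for its status.
app = Flask(__name__)
jobs = {}  # in-memory job table, for illustration only

def _gc(seq: str) -> float:
    """GC content of a nucleotide sequence (stand-in for a real analysis)."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq) if seq else 0.0

@app.route("/jobs", methods=["POST"])
def submit_job():
    """Register a (mock) analysis job; in this toy it runs inline."""
    seq = request.get_json(force=True).get("sequence", "")
    job_id = str(uuid4())
    jobs[job_id] = {"status": "finished", "gc_content": _gc(seq)}
    return jsonify({"job_id": job_id}), 202

@app.route("/jobs/<job_id>", methods=["GET"])
def job_status(job_id):
    """Return the status/result of a previously submitted job."""
    return jsonify(jobs.get(job_id, {"status": "unknown"}))

if __name__ == "__main__":
    app.run(port=5000)   # e.g. POST {"sequence": "ACGT"} to /jobs, then GET /jobs/<id>
```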

  14. Research in Community-Based Biological Education. 4 Case-Studies.

    ERIC Educational Resources Information Center

    Atchia, Michael, Ed.

    Several case studies of research into the biological needs of communities in developing countries were conducted and two strategies for relating biological education (in both formal and nonformal contexts) to community development were identified. Four of these case studies are presented. They are: (1) "From Biological Knowledge to Community…

  15. Interactions of dendrimers with biological drug targets: reality or mystery - a gap in drug delivery and development research.

    PubMed

    Ahmed, Shaimaa; Vepuri, Suresh B; Kalhapure, Rahul S; Govender, Thirumala

    2016-07-21

    Dendrimers have emerged as novel and efficient materials that can be used as therapeutic agents/drugs or as drug delivery carriers to enhance therapeutic outcomes. Molecular dendrimer interactions are central to their applications and realising their potential. The molecular interactions of dendrimers with drugs or other materials in drug delivery systems or drug conjugates have been extensively reported in the literature. However, despite the growing application of dendrimers as biologically active materials, research focusing on the mechanistic analysis of dendrimer interactions with therapeutic biological targets is currently lacking in the literature. This comprehensive review on dendrimers over the last 15 years therefore attempts to identify the reasons behind the apparent lack of dendrimer-receptor research and proposes approaches to address this issue. The structure, hierarchy and applications of dendrimers are briefly highlighted, followed by a review of their various applications, specifically as biologically active materials, with a focus on their interactions at the target site. It concludes with a technical guide to assist researchers on how to employ various molecular modelling and computational approaches for research on dendrimer interactions with biological targets at a molecular level. This review highlights the impact of a mechanistic analysis of dendrimer interactions on a molecular level, serves to guide and optimise their discovery as medicinal agents, and hopes to stimulate multidisciplinary research between scientific, experimental and molecular modelling research teams. PMID:27100841

  16. High performance computing in biology: multimillion atom simulations of nanoscale systems.

    PubMed

    Sanbonmatsu, K Y; Tung, C-S

    2007-03-01

    Computational methods have been used in biology for sequence analysis (bioinformatics), all-atom simulation (molecular dynamics and quantum calculations), and more recently for modeling biological networks (systems biology). Of these three techniques, all-atom simulation is currently the most computationally demanding, in terms of compute load, communication speed, and memory load. Breakthroughs in electrostatic force calculation and dynamic load balancing have enabled molecular dynamics simulations of large biomolecular complexes. Here, we report simulation results for the ribosome, using approximately 2.64 million atoms, the largest all-atom biomolecular simulation published to date. Several other nano-scale systems with different numbers of atoms were studied to measure the performance of the NAMD molecular dynamics simulation program on the Los Alamos National Laboratory Q Machine. We demonstrate that multimillion atom systems represent a 'sweet spot' for the NAMD code on large supercomputers. NAMD displays an unprecedented 85% parallel scaling efficiency for the ribosome system on 1024 CPUs. We also review recent targeted molecular dynamics simulations of the ribosome that prove useful for studying conformational changes of this large biomolecular complex in atomic detail. PMID:17187988
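
    For readers unfamiliar with the 85% figure quoted above, strong-scaling parallel efficiency is commonly computed as the observed speedup divided by the ideal speedup relative to a reference core count. The sketch below shows that arithmetic with made-up timings; the numbers are illustrative, not data from the paper.

```python
# Strong-scaling efficiency: E(N) = (T_ref / T_N) / (N / N_ref).
# Timing values below are illustrative placeholders, not benchmark data.

def scaling_efficiency(t_ref: float, n_ref: int, t_n: float, n: int) -> float:
    """Parallel efficiency of a run on n cores relative to a reference run."""
    speedup = t_ref / t_n          # observed speedup over the reference run
    ideal = n / n_ref              # ideal speedup if scaling were perfect
    return speedup / ideal

if __name__ == "__main__":
    runs = {128: 800.0, 256: 410.0, 512: 215.0, 1024: 118.0}  # hypothetical wall-clock times
    t_ref, n_ref = runs[128], 128
    for n, t in runs.items():
        print(f"{n:5d} cores: efficiency = {scaling_efficiency(t_ref, n_ref, t, n):.2f}")
```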

  17. Computational adaptive optics for broadband optical interferometric tomography of biological tissue

    NASA Astrophysics Data System (ADS)

    Boppart, Stephen A.

    2015-03-01

    High-resolution real-time tomography of biological tissues is important for many areas of biological investigations and medical applications. Cellular-level optical tomography, however, has been challenging because of the compromise between transverse imaging resolution and depth-of-field, the system and sample aberrations that may be present, and the low imaging sensitivity deep in scattering tissues. The use of computed optical imaging techniques has the potential to address several of these long-standing limitations and challenges. Two related techniques are interferometric synthetic aperture microscopy (ISAM) and computational adaptive optics (CAO). Through three-dimensional Fourier-domain resampling, in combination with high-speed OCT, ISAM can be used to achieve high-resolution in vivo tomography with enhanced depth sensitivity over a depth-of-field extended by more than an order of magnitude, in real time. Subsequently, aberration correction with CAO can be performed on a tomogram, rather than on the optical beam of a broadband optical interferometry system. Based on principles of Fourier optics, aberration correction with CAO is performed on a virtual pupil using Zernike polynomials, offering the potential to augment or even replace the more complicated and expensive adaptive optics hardware with algorithms implemented on a standard desktop computer. Interferometric tomographic reconstructions are characterized with tissue phantoms containing sub-resolution scattering particles, and in both ex vivo and in vivo biological tissue. This review will collectively establish the foundation for high-speed volumetric cellular-level optical interferometric tomography in living tissues.
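
    As a hedged, toy illustration of pupil-plane aberration correction, the sketch below multiplies an image spectrum (a stand-in for the virtual pupil) by the conjugate of an assumed Zernike defocus phase and transforms back. The phase term, its coefficient, and the random test image are assumptions for illustration; this is not the actual ISAM/CAO processing chain described above.

```python
import numpy as np

def defocus_phase(shape, coeff):
    """Zernike defocus-like phase, coeff * (2*rho^2 - 1), on a unit pupil grid."""
    ny, nx = shape
    y, x = np.mgrid[-1:1:1j * ny, -1:1:1j * nx]
    rho2 = x**2 + y**2
    return coeff * (2.0 * rho2 - 1.0)

def correct_defocus(image, coeff):
    """Apply the conjugate of the assumed defocus phase in the Fourier domain."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    corrected = spectrum * np.exp(-1j * defocus_phase(image.shape, coeff))
    return np.abs(np.fft.ifft2(np.fft.ifftshift(corrected)))

if __name__ == "__main__":
    img = np.random.rand(128, 128)      # stand-in for an OCT en face plane
    out = correct_defocus(img, coeff=3.0)
    print(out.shape, out.dtype)
```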

  18. The role of ontologies in biological and biomedical research: a functional perspective.

    PubMed

    Hoehndorf, Robert; Schofield, Paul N; Gkoutos, Georgios V

    2015-11-01

    Ontologies are widely used in biological and biomedical research. Their success lies in their combination of four main features present in almost all ontologies: provision of standard identifiers for classes and relations that represent the phenomena within a domain; provision of a vocabulary for a domain; provision of metadata that describes the intended meaning of the classes and relations in ontologies; and the provision of machine-readable axioms and definitions that enable computational access to some aspects of the meaning of classes and relations. While each of these features enables applications that facilitate data integration, data access and analysis, a great potential lies in the possibility of combining these four features to support integrative analysis and interpretation of multimodal data. Here, we provide a functional perspective on ontologies in biology and biomedicine, focusing on what ontologies can do and describing how they can be used in support of integrative research. We also outline perspectives for using ontologies in data-driven science, in particular their application in structured data mining and machine learning applications. PMID:25863278
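
    One concrete consequence of machine-readable class relations is that data annotated with a specific class can be retrieved through any of its superclasses. The sketch below shows this subsumption idea on a toy is-a hierarchy; the terms and the dictionary representation are illustrative assumptions, not an excerpt from any real ontology.

```python
# Toy is-a hierarchy: class -> list of parent classes (illustrative terms only).
TOY_ONTOLOGY = {
    "lymphocyte": ["leukocyte"],
    "leukocyte": ["blood cell"],
    "blood cell": ["cell"],
    "cell": [],
}

def ancestors(term, hierarchy):
    """Return the set of all superclasses reachable from a term."""
    seen = set()
    stack = list(hierarchy.get(term, []))
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.add(parent)
            stack.extend(hierarchy.get(parent, []))
    return seen

if __name__ == "__main__":
    # A query for "blood cell" data would also match items annotated "lymphocyte".
    print(ancestors("lymphocyte", TOY_ONTOLOGY))  # {'leukocyte', 'blood cell', 'cell'}
```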

  19. The role of ontologies in biological and biomedical research: a functional perspective

    PubMed Central

    Schofield, Paul N.; Gkoutos, Georgios V.

    2015-01-01

    Ontologies are widely used in biological and biomedical research. Their success lies in their combination of four main features present in almost all ontologies: provision of standard identifiers for classes and relations that represent the phenomena within a domain; provision of a vocabulary for a domain; provision of metadata that describes the intended meaning of the classes and relations in ontologies; and the provision of machine-readable axioms and definitions that enable computational access to some aspects of the meaning of classes and relations. While each of these features enables applications that facilitate data integration, data access and analysis, a great potential lies in the possibility of combining these four features to support integrative analysis and interpretation of multimodal data. Here, we provide a functional perspective on ontologies in biology and biomedicine, focusing on what ontologies can do and describing how they can be used in support of integrative research. We also outline perspectives for using ontologies in data-driven science, in particular their application in structured data mining and machine learning applications. PMID:25863278

  20. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
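
    As a minimal illustration of the response-surface idea named above, the sketch below fits a second-order model in two coded factors by ordinary least squares. The design points, noise level, and coefficients are fabricated for illustration; a real study would use a formal design such as a central composite design.

```python
import numpy as np

# Second-order response surface: y = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2.

def quadratic_design_matrix(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
x1 = np.repeat([-1.0, 0.0, 1.0], 3)        # coded factor levels (3x3 grid, illustrative)
x2 = np.tile([-1.0, 0.0, 1.0], 3)
y = 5.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 - 1.0 * x1**2 + rng.normal(0, 0.1, x1.size)

X = quadratic_design_matrix(x1, x2)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares fit
print("fitted coefficients:", np.round(coef, 2))
```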

  1. Computational modeling of chemo-bio-mechanical coupling: a systems-biology approach toward wound healing.

    PubMed

    Buganza Tepole, A; Kuhl, E

    2016-01-01

    Wound healing is a synchronized cascade of chemical, biological, and mechanical phenomena, which act in concert to restore the damaged tissue. An imbalance between these events can induce painful scarring. Despite intense efforts to decipher the mechanisms of wound healing, the role of mechanics remains poorly understood. Here, we establish a computational systems biology model to identify the chemical, biological, and mechanical mechanisms of scar formation. First, we introduce the generic problem of coupled chemo-bio-mechanics. Then, we introduce the model problem of wound healing in terms of a particular chemical signal, inflammation, a particular biological cell type, fibroblasts, and a particular mechanical model, isotropic hyperelasticity. We explore the cross-talk between chemical, biological, and mechanical signals and show that all three fields have a significant impact on scar formation. Our model is the first step toward rigorous multiscale, multifield modeling in wound healing. Our formulation has the potential to improve effective wound management and optimize treatment on an individualized patient-specific basis. PMID:25421487
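
    To make the chemo-bio-mechanical coupling concrete, the sketch below integrates a toy system in which an inflammatory signal recruits fibroblasts, which in turn deposit a scar-like stiffness variable. The equations, parameters, and explicit-Euler integration are illustrative assumptions, not the authors' continuum model.

```python
import numpy as np

# Toy coupling: inflammation c decays, recruits fibroblasts f, which build scar s.

def simulate(t_end=30.0, dt=0.01, k_c=0.3, k_f=0.5, d_f=0.2, k_s=0.1):
    n = int(t_end / dt)
    c, f, s = 1.0, 0.0, 0.0               # initial inflammation, fibroblasts, scar
    history = np.zeros((n, 3))
    for i in range(n):
        dc = -k_c * c                      # inflammation resolves
        df = k_f * c - d_f * f             # chemically driven fibroblast recruitment
        ds = k_s * f                       # fibroblasts stiffen the tissue
        c, f, s = c + dt * dc, f + dt * df, s + dt * ds
        history[i] = (c, f, s)
    return history

if __name__ == "__main__":
    h = simulate()
    print("final inflammation, fibroblasts, scar:", np.round(h[-1], 3))
```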

  2. NASA Space Biology Research Associate Program for the 21st Century

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, Gerald

    2000-01-01

    The Space Biology Research Associate Program for the 21st Century provided a unique opportunity to train individuals to conduct biological research in hypo- and hyper-gravity, and to conduct ground-based research. This grant was developed to maximize the potential for Space Biology as an emerging discipline and to train a cadre of space biologists. The field of gravitational and space biology is rapidly growing, and the future of the field is reflected in the quality and education of its personnel. Our chief objective was to train and develop these scientists rapidly and cost-effectively.

  3. COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach

    PubMed Central

    Kapetanovic, I.M.

    2008-01-01

    It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification and hit-to-lead selection, to optimize the absorption, distribution, metabolism, excretion and toxicity profile, and to avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415
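
    As a hedged illustration of the quantitative structure-activity relationship (QSAR) approach named above, the sketch below fits activity to a few molecular descriptors by ordinary least squares. The descriptors, their values, and the activities are fabricated for illustration only; a real study would use curated data and validated descriptors.

```python
import numpy as np

# Toy QSAR: activity = w0 + w1*logP + w2*MW(scaled) + w3*HBD, fit by least squares.
descriptors = np.array([   # columns: logP, scaled molecular weight, H-bond donors
    [1.2, 0.30, 2],
    [2.5, 0.42, 1],
    [0.8, 0.25, 3],
    [3.1, 0.51, 0],
    [1.9, 0.38, 2],
])
activity = np.array([5.1, 6.3, 4.7, 6.9, 5.6])   # pIC50-like values (illustrative)

X = np.column_stack([np.ones(len(descriptors)), descriptors])
w, *_ = np.linalg.lstsq(X, activity, rcond=None)
pred = X @ w
print("weights:  ", np.round(w, 2))
print("residuals:", np.round(activity - pred, 2))
```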

  4. The NASA Specialized Center of Research and Training (NSCORT) in Gravitational Biology

    NASA Technical Reports Server (NTRS)

    Spooner, B. S.; Guikema, J. A.

    1992-01-01

    The Life Sciences Division of NASA has initiated a NASA Specialized Centers of Research and Training (NSCORT) program. Three Centers were designated in late 1990, as the culmination of an in-depth peer review analysis of proposals from universities across the nation and around the world. Kansas State University was selected as the NSCORT in Gravitational Biology. This Center is headquartered in the KSU Division of Biology and has a research, training, and outreach function that focuses on cellular and developmental biology.

  5. Path-Integration Computation of the Transport Properties of Polymers Nanoparticles and Complex Biological Structures

    NASA Astrophysics Data System (ADS)

    Douglas, Jack

    2014-03-01

    finite cross-section, DNA, nanoparticles with grafted chain layers and knotted polymers. The path-integration method, which grew up from research in Karl Freed's group, is evidently a powerful tool for computing basic transport properties of complex-shaped objects and should find increasing application in polymer science, nanotechnological applications and biology.

  6. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    PubMed Central

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts; however, there is still a large gap between the number of enzyme-based practical applications and the number of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties that meet practical needs. Meanwhile, an increasing number of computational methods have been developed over the past few decades to assist in the modification of enzymes. With the development of bioinformatic algorithms, computational approaches are now able to provide more precise guidance for enzyme engineering and make it more efficient and less laborious. In this review, we summarize recent advances in method development with significant biological outcomes to provide important insights into successful computational protein designs. We also discuss the limitations and challenges of existing methods and the future directions that should improve them. PMID:24688648

  7. 9 CFR 112.9 - Biological products imported for research and evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Biological products imported for research and evaluation. 112.9 Section 112.9 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION... PACKAGING AND LABELING § 112.9 Biological products imported for research and evaluation. A...

  8. 9 CFR 112.9 - Biological products imported for research and evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Biological products imported for research and evaluation. 112.9 Section 112.9 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION... PACKAGING AND LABELING § 112.9 Biological products imported for research and evaluation. A...

  9. 9 CFR 112.9 - Biological products imported for research and evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Biological products imported for research and evaluation. 112.9 Section 112.9 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION... PACKAGING AND LABELING § 112.9 Biological products imported for research and evaluation. A...

  10. Recent research trends in the use of predators for biological control

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We focus on recent interesting research trends in biological control using predators by selecting four areas of current research: 1) Intraguild predation (IGP): defined as the “killing and eating of species that use similar resources and are thus potential competitors”. In biological control, the si...

  11. Interdisciplinary Biomathematics: Engaging Undergraduates in Research on the Fringe of Mathematical Biology

    ERIC Educational Resources Information Center

    Fowler, Kathleen; Luttman, Aaron; Mondal, Sumona

    2013-01-01

    The US National Science Foundation's (NSF's) Undergraduate Biology and Mathematics (UBM) program significantly increased undergraduate research in the biomathematical sciences. We discuss three UBM-funded student research projects at Clarkson University that lie at the intersection of not just mathematics and biology, but also other…

  12. Research Programs Constituting U.S. Participation in the International Biological Program.

    ERIC Educational Resources Information Center

    National Academy of Sciences--National Research Council, Washington, DC. Div. of Biology and Agriculture.

    The United States contribution to the International Biological Program, which aims to understand more clearly the interrelationships within ecosystems, is centered on multidisciplinary research programs investigating the biological basis of ecological productivity and human welfare. Integrated research programs have been established for the…

  13. Structural and Computational Biology in the Design of Immunogenic Vaccine Antigens

    PubMed Central

    Liljeroos, Lassi; Malito, Enrico; Ferlenghi, Ilaria; Bottomley, Matthew James

    2015-01-01

    Vaccination is historically one of the most important medical interventions for the prevention of infectious disease. Previously, vaccines were typically made of rather crude mixtures of inactivated or attenuated causative agents. However, over the last 10–20 years, several important technological and computational advances have enabled major progress in the discovery and design of potently immunogenic recombinant protein vaccine antigens. Here we discuss three key breakthrough approaches that have potentiated structural and computational vaccine design. Firstly, genomic sciences gave birth to the field of reverse vaccinology, which has enabled the rapid computational identification of potential vaccine antigens. Secondly, major advances in structural biology, experimental epitope mapping, and computational epitope prediction have yielded molecular insights into the immunogenic determinants defining protective antigens, enabling their rational optimization. Thirdly, and most recently, computational approaches have been used to convert this wealth of structural and immunological information into the design of improved vaccine antigens. This review aims to illustrate the growing power of combining sequencing, structural and computational approaches, and we discuss how this may drive the design of novel immunogens suitable for future vaccines urgently needed to increase the global prevention of infectious disease. PMID:26526043
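
    As a toy illustration of sequence-based epitope scanning of the kind used in computational antigen design, the sketch below averages a per-residue propensity over a sliding window and reports the top-scoring stretch. The propensity values and the sequence are placeholders, not a published scale or a real antigen.

```python
# Window-based epitope scanning sketch; propensity values are illustrative only.
PROPENSITY = {aa: score for aa, score in zip(
    "ACDEFGHIKLMNPQRSTVWY",
    [0.2, 0.1, 0.8, 0.9, -0.5, 0.3, 0.4, -0.6, 0.9, -0.7,
     -0.3, 0.6, 0.5, 0.7, 0.9, 0.4, 0.3, -0.4, -0.8, -0.2])}

def scan(sequence: str, window: int = 7):
    """Return (start_index, mean_score) of the highest-scoring window."""
    best = (0, float("-inf"))
    for i in range(len(sequence) - window + 1):
        score = sum(PROPENSITY[aa] for aa in sequence[i:i + window]) / window
        if score > best[1]:
            best = (i, score)
    return best

if __name__ == "__main__":
    seq = "MKTEEDRRKNLLERQSIDEVAKHLPGGWTEYFK"   # made-up sequence
    start, score = scan(seq)
    print(f"top window starts at {start}: {seq[start:start + 7]} (score {score:.2f})")
```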

  14. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics.

    PubMed

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T; Becich, Michael J

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  15. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics

    PubMed Central

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T.; Becich, Michael J.

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  16. Joint computational and experimental aerodynamics research on a hypersonic vehicle

    SciTech Connect

    Oberkampf, W.L.; Aeschliman, D.P.; Walker, M.M.

    1992-01-01

    A closely coupled computational and experimental aerodynamics research program was conducted on a hypersonic vehicle configuration at Mach 8. Aerodynamic force and moment measurements and flow visualization results were obtained in the Sandia National Laboratories hypersonic wind tunnel for laminar boundary layer conditions. Parabolized and iterative Navier-Stokes simulations were used to predict flow fields and forces and moments on the hypersonic configuration. The basic vehicle configuration is a spherically blunted 10° cone with a slice parallel with the axis of the vehicle. On the slice portion of the vehicle, a flap can be attached so that deflection angles of 10°, 20°, and 30° can be obtained. Comparisons are made between experimental and computational results to evaluate the quality of each and to identify areas where improvements are needed. This extensive set of high-quality experimental force and moment measurements is recommended for use in the calibration and validation of computational aerodynamics codes. 22 refs.

  17. Gross's Anatomy: Textual Politics in Science/Biology Education Research

    ERIC Educational Resources Information Center

    Reis, Giuliano

    2009-01-01

    In approaching how the grotesque is--or should be--situated within contemporary science (biology) education practices, Weinstein and Broda undertake a passionate reclaim of an education that is at the same time scientific, critical, and liberatory. However legitimate, their work offers more than they probably could have anticipated: It exemplifies…

  18. Emerging Technologies: An Opportunity for Weed Biology Research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The main objective of the Emerging Technologies Symposium at the 2007 WSSA Annual Meeting was to provide the weed science community with the principles behind emerging technologies and how they can be used to study weed biology. Specifically, aspects and applications related to genomic database deve...

  19. USDA-ARS RESEARCH ON BIOLOGICAL CONTROL OF ARTHROPODS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    During 1999-2001, ARS scientists published over 100 papers on biocontrol of 30 insect pests. These papers address issues crucial to the three strategies of biological control: conservation, augmentation, and introduction. ARS scientists have been very active in determining the effects of pesticides...

  20. Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1994-01-01

    Computational fluid dynamics (CFD) is beginning to play a major role in the aircraft industry of the United States because of the realization that CFD can be a new and effective design tool and thus could provide a company with a competitive advantage. It is also playing a significant role in research institutions, both governmental and academic, as a tool for researching new fluid physics, as well as supplementing and complementing experimental testing. In this presentation, some of the progress made to date in CFD at NASA Ames will be reviewed. The presentation addresses the status of CFD in terms of methods, examples of CFD solutions, and computer technology. In addition, the role CFD will play in supporting the revolutionary goals set forth by the Aeronautical Policy Review Committee established by the Office of Science and Technology Policy is noted. The need for validated CFD tools is also briefly discussed.