Sample records for the query "computer scientists recognized"

  1. 2017 ISCB Accomplishment by a Senior Scientist Award: Pavel Pevzner

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) recognizes an established scientist each year with the Accomplishment by a Senior Scientist Award for significant contributions he or she has made to the field. This award honors scientists who have contributed to the advancement of computational biology and bioinformatics through their research, service, and education work. Pavel Pevzner, PhD, Ronald R. Taylor Professor of Computer Science and Director of the NIH Center for Computational Mass Spectrometry at University of California, San Diego, has been selected as the winner of the 2017 Accomplishment by a Senior Scientist Award. The ISCB awards committee, chaired by Dr. Bonnie Berger of the Massachusetts Institute of Technology, selected Pevzner as the 2017 winner. Pevzner will receive his award and deliver a keynote address at the 2017 Intelligent Systems for Molecular Biology-European Conference on Computational Biology joint meeting (ISMB/ECCB 2017) held in Prague, Czech Republic, July 21-25, 2017. ISMB/ECCB is a biennial joint meeting that brings together leading scientists in computational biology and bioinformatics from around the globe. PMID:28713548

  2. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    ERIC Educational Resources Information Center

    Scogin, Stephen C.

    2016-01-01

    "PlantingScience" is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific…

  3. Message from the ISCB: 2015 ISCB Accomplishment by a Senior Scientist Award: Cyrus Chothia.

    PubMed

    Fogg, Christiana N; Kovats, Diane E

    2015-07-01

    The International Society for Computational Biology (ISCB; http://www.iscb.org) honors a senior scientist annually for his or her outstanding achievements with the ISCB Accomplishment by a Senior Scientist Award. This award recognizes a leader in the field of computational biology for his or her significant contributions to the community through research, service and education. Cyrus Chothia, an emeritus scientist at the Medical Research Council Laboratory of Molecular Biology and emeritus fellow of Wolfson College at Cambridge University, England, is the 2015 ISCB Accomplishment by a Senior Scientist Award winner. Chothia was selected by the Awards Committee, which is chaired by Dr Bonnie Berger of the Massachusetts Institute of Technology. He will receive his award and deliver a keynote presentation at the 2015 Intelligent Systems for Molecular Biology/European Conference on Computational Biology in Dublin, Ireland, in July 2015.

  4. Achieving Operational Adaptability: Capacity Building Needs to Become a Warfighting Function

    DTIC Science & Technology

    2010-04-26

    platypus effect as described by David Green in The Serendipity Machine: A Voyage of Discovery Through the Unexpected World of Computers. Early in...the 18th century, the discovery of the platypus challenged the categories of animal life recognized and utilized by scientists in Europe. Scientists...resisted changing their categories for years. At first, they believed the platypus was a fabrication. Later, they resisted change since they were

  5. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos to Teach Challenging Climate Change and Nature of Science Concepts

    ERIC Educational Resources Information Center

    Cohen, Edward Charles

    2013-01-01

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…

  6. Supercomputing Sheds Light on the Dark Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Heitmann, Katrin

    2012-11-15

    At Argonne National Laboratory, scientists are using supercomputers to shed light on one of the great mysteries in science today, the Dark Universe. With Mira, a petascale supercomputer at the Argonne Leadership Computing Facility, a team led by physicists Salman Habib and Katrin Heitmann will run the largest, most complex simulation of the universe ever attempted. By contrasting the results from Mira with state-of-the-art telescope surveys, the scientists hope to gain new insights into the distribution of matter in the universe, advancing future investigations of dark energy and dark matter into a new realm. The team's research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.

  7. 2017 ISCB Overton Prize: Christoph Bock

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) each year recognizes the achievements of an early to mid-career scientist with the Overton Prize. This prize honors the untimely death of Dr. G. Christian Overton, an admired computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators who are in the early to middle phases of their careers and are selected because of their significant contributions to computational biology through research, teaching, and service. ISCB is pleased to recognize Dr. Christoph Bock, Principal Investigator at the CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences in Vienna, Austria, as the 2017 winner of the Overton Prize. Bock will deliver a keynote presentation at the 2017 International Conference on Intelligent Systems for Molecular Biology/European Conference on Computational Biology (ISMB/ECCB) in Prague, Czech Republic, held July 21-25, 2017. PMID:28713546

  8. 2017 ISCB Overton Prize: Christoph Bock.

    PubMed

    Fogg, Christiana N; Kovats, Diane E; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) each year recognizes the achievements of an early to mid-career scientist with the Overton Prize. This prize honors the untimely death of Dr. G. Christian Overton, an admired computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators who are in the early to middle phases of their careers and are selected because of their significant contributions to computational biology through research, teaching, and service. ISCB is pleased to recognize Dr. Christoph Bock, Principal Investigator at the CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences in Vienna, Austria, as the 2017 winner of the Overton Prize. Bock will deliver a keynote presentation at the 2017 International Conference on Intelligent Systems for Molecular Biology/European Conference on Computational Biology (ISMB/ECCB) in Prague, Czech Republic, held July 21-25, 2017.

  9. 76 FR 2373 - Science Advisory Board Staff Office; Request for Nominations of Experts to Augment the SAB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-13

    ... Office is requesting public nominations for scientists and engineers to augment the SAB Scientific and... STAA Program was established in 1980 to recognize Agency scientists and engineers who published their... seeking nominations of nationally and internationally recognized scientists and engineers having...

  10. The making of the Women in Biology forum (WiB) at Bioclues.

    PubMed

    Singhania, Reeta Rani; Madduru, Dhatri; Pappu, Pranathi; Panchangam, Sameera; Suravajhala, Renuka; Chandrasekharan, Mohanalatha

    2014-01-01

    The Women in Biology forum (WiB) of Bioclues (India) began in 2009 to promote and support women pursuing careers in bioinformatics and computational biology. WiB was formed in order to help women scientists deprived of basic research, boost the prominence of women scientists particularly from developing countries, and bridge the gender gap to innovation. WiB has also served as a platform to highlight the work of established female scientists in these fields. Several award-winning women researchers have shared their experiences and provided valuable suggestions to WiB. Headed by Mohanalatha Chandrasekharan and supported by Dr. Reeta Rani Singhania and Renuka Suravajhala, WiB has seen major progress in the last couple of years, particularly in Mentoring and Research, two of the four avenues in Bioclues: Mentoring, Outreach, Research and Entrepreneurship (MORE). In line with the Bioclues vision for bioinformatics in India, the WiB Journal Club (JoC) recognizes women scientists working on functional genomics and bioinformatics, and provides scientific mentorship and support for project design and hypothesis formulation. As a part of Bioclues, WiB members practice the group's open-desk policy and its belief that all members are free to express their own thoughts and opinions. The WiB forum appreciates suggestions and welcomes scientists from around the world to be a part of their mission to encourage women to pursue computational biology and bioinformatics.

  11. NREL Scientist Selected for Major Award by the American Chemical Society

    Science.gov Websites

    The award recognizes his many research, teaching, writing and administrative accomplishments. A recognized surface scientist, Czanderna has educated thousands through his teaching and writing.

  12. 2016 ISCB Overton Prize awarded to Debora Marks

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.

    2016-01-01

    The International Society for Computational Biology (ISCB) recognizes the achievements of an early- to mid-career scientist with the Overton Prize each year. The Overton Prize was established to honor the untimely loss of Dr. G. Christian Overton, a respected computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators in the early to middle phases of their careers who are selected because of their significant contributions to computational biology through research, teaching, and service. 2016 will mark the fifteenth bestowment of the ISCB Overton Prize. ISCB is pleased to confer this award to Debora Marks, Assistant Professor of Systems Biology and director of the Raymond and Beverly Sackler Laboratory for Computational Biology at Harvard Medical School. PMID:27429747

  13. 2016 ISCB Overton Prize awarded to Debora Marks.

    PubMed

    Fogg, Christiana N; Kovats, Diane E

    2016-01-01

    The International Society for Computational Biology (ISCB) recognizes the achievements of an early- to mid-career scientist with the Overton Prize each year. The Overton Prize was established to honor the untimely loss of Dr. G. Christian Overton, a respected computational biologist and founding ISCB Board member. Winners of the Overton Prize are independent investigators in the early to middle phases of their careers who are selected because of their significant contributions to computational biology through research, teaching, and service. 2016 will mark the fifteenth bestowment of the ISCB Overton Prize. ISCB is pleased to confer this award to Debora Marks, Assistant Professor of Systems Biology and director of the Raymond and Beverly Sackler Laboratory for Computational Biology at Harvard Medical School.

  14. MatLab Script and Functional Programming

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab Script and Functional Programming: MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who have no formal programming training and little time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) Recognize MatLab commands, scripts and functions. b) Create and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.
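
    The script-versus-function distinction the seminar covers is not MatLab-specific. The sketch below illustrates the same idea in Python (chosen only for illustration; none of this code comes from the seminar): a script runs statements top to bottom in a shared workspace, while a function has its own local scope with explicit inputs and outputs.

```python
# Illustrative sketch (not from the seminar): a "function" has local scope
# and explicit inputs/outputs, while the "script" part below runs
# statements in order in the module's shared workspace.

def column_means(matrix):
    """Function: explicit argument, local variables, explicit return."""
    n_rows = len(matrix)
    return [sum(col) / n_rows for col in zip(*matrix)]

# Script part: statements executed top to bottom.
data = [[1.0, 2.0],
        [3.0, 4.0]]
means = column_means(data)

# A decision inside a loop, analogous to the seminar's
# "decisions, loops and matrix operators" topic.
flags = []
for m in means:
    flags.append("high" if m > 2.5 else "low")
```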

  15. Enabling the Discovery of Gravitational Radiation

    NASA Astrophysics Data System (ADS)

    Isaacson, Richard

    2017-01-01

    The discovery of gravitational radiation was announced with the publication of the results of a physics experiment involving over a thousand participants. This was preceded by a century of theoretical work, involving a similarly large group of physicists, mathematicians, and computer scientists. This huge effort was enabled by a substantial commitment of resources, both public and private, to develop the different strands of this complex research enterprise, and to build a community of scientists to carry it out. In the excitement following the discovery, the role of key enablers of this success has not always been adequately recognized in popular accounts. In this talk, I will try to call attention to a few of the key ingredients that proved crucial to enabling the successful discovery of gravitational waves, and the opening of a new field of science.

  16. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  17. 2017 ISCB Innovator Award: Aviv Regev

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane; Berger, Bonnie

    2017-01-01

    2017 marks the second year of the International Society for Computational Biology (ISCB) Innovator Award, which recognizes an ISCB scientist who is within two decades of having completed his or her graduate degree and has consistently made outstanding contributions to the field. The 2017 winner is Dr. Aviv Regev, Professor of Biology at the Massachusetts Institute of Technology (MIT), a Core Member and Chair of the Faculty of the Broad Institute of MIT and Harvard, and an HHMI Investigator. Regev will receive her award and deliver a keynote address during International Conference on Intelligent Systems for Molecular Biology/European Conference on Computational Biology (ISMB/ECCB) 2017 in Prague, Czech Republic (July 21 - 25, 2017). PMID:28713547

  18. A program of correlated observations using the EGRET instrument on GRO and the IMB neutrino detector

    NASA Technical Reports Server (NTRS)

    Svoboda, Robert C.

    1992-01-01

    A reliable, real-time supernova monitoring system was devised using the IMB neutrino detector to serve as an 'early-warning' system for EGRET and other instruments on GRO. New methods and software were developed to allow the IMB monitoring computer in Cleveland to: recognize that a trigger burst had occurred; make a judgement on whether the burst was spurrious or an actual supernova; prepare brief summary files and 'quick-look' data so that a final disposition could be made by a trained scientist; and contact the 'watch' scientist via personal beeper in Baton Rouge. This system ran from Dec. 1990 to Apr. 1991, when the neutrino detector failed for unrelated reasons. In addition to the supernova system, high-energy neutrino data was prepared and formatted for comparison with EGRET gamma-ray data.
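
    The monitoring logic described above can be pictured as a small classification step. The sketch below is a hypothetical Python rendering of that kind of trigger logic; the thresholds, field names, and function names are illustrative assumptions, not taken from the IMB system.

```python
from dataclasses import dataclass

# Hypothetical sketch of the kind of trigger logic the record describes:
# decide whether a burst of detector events looks spurious or like a real
# supernova candidate, then build a brief summary for the "watch" scientist.
# All thresholds and names here are illustrative, not from IMB.

SUPERNOVA_MIN_EVENTS = 8   # assumed minimum event count in the window
WINDOW_SECONDS = 10.0      # assumed coincidence window

@dataclass
class BurstSummary:
    n_events: int
    duration_s: float
    disposition: str       # "candidate" or "spurious"

def classify_burst(event_times_s):
    """Classify a list of event timestamps (in seconds) as a burst."""
    times = sorted(event_times_s)
    duration = times[-1] - times[0] if times else 0.0
    is_candidate = (len(times) >= SUPERNOVA_MIN_EVENTS
                    and duration <= WINDOW_SECONDS)
    return BurstSummary(
        n_events=len(times),
        duration_s=duration,
        disposition="candidate" if is_candidate else "spurious",
    )

# A tight cluster of 10 events within one second passes the cut...
burst = classify_burst([0.1 * i for i in range(10)])
# ...while 3 scattered events are judged spurious.
noise = classify_burst([0.0, 30.0, 95.0])
```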

  19. Implicit Theories of Creativity in Computer Science in the United States and China

    ERIC Educational Resources Information Center

    Tang, Chaoying; Baer, John; Kaufman, James C.

    2015-01-01

    To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…

  20. Designing and Implementing a Computational Methods Course for Upper-level Undergraduates and Postgraduates in Atmospheric and Oceanic Sciences

    NASA Astrophysics Data System (ADS)

    Nelson, E.; L'Ecuyer, T. S.; Douglas, A.; Hansen, Z.

    2017-12-01

    In the modern computing age, scientists must utilize a wide variety of skills to carry out scientific research. Programming, including a focus on collaborative development, has become more prevalent in both academic and professional career paths. Faculty in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin—Madison recognized this need and recently approved a new course offering for undergraduates and postgraduates in computational methods that was first held in Spring 2017. Three programming languages were covered in the inaugural course semester and development themes such as modularization, data wrangling, and conceptual code models were woven into all of the sections. In this presentation, we will share successes and challenges in developing a research project-focused computational course that leverages hands-on computer laboratory learning and open-sourced course content. Improvements and changes in future iterations of the course based on the first offering will also be discussed.

  1. Environmental Problems and the Scientist

    ERIC Educational Resources Information Center

    Batisse, Michel

    1973-01-01

    Suggests that any environmental problem can be traced at biosphere, technosphere, sociosphere, and noosphere level. Scientists have generally ignored the latter two spheres in making scientific discoveries. New social ethics need to be recognized that are based on progress, and scientists must consider how these ethics are influenced by their…

  2. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  3. Mathematical challenges from theoretical/computational chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  4. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  5. Recruitment of Foreigners in the Market for Computer Scientists in the United States

    PubMed Central

    Bound, John; Braga, Breno; Golden, Joseph M.

    2016-01-01

    We present and calibrate a dynamic model that characterizes the labor market for computer scientists. In our model, firms can recruit computer scientists from recently graduated college students, from STEM workers working in other occupations or from a pool of foreign talent. Counterfactual simulations suggest that wages for computer scientists would have been 2.8–3.8% higher, and the number of Americans employed as computer scientists would have been 7.0–13.6% higher in 2004 if firms could not hire more foreigners than they could in 1994. In contrast, total CS employment would have been 3.8–9.0% lower, and consequently output smaller. PMID:27170827

  6. Immune Cells in Blood Recognize Tumors

    Cancer.gov

    NCI scientists have developed a novel strategy for identifying immune cells circulating in the blood that recognize specific proteins on tumor cells, a finding they believe may have potential implications for immune-based therapies.

  7. How to succeed in science: a concise guide for young biomedical scientists. Part II: making discoveries

    PubMed Central

    Yewdell, Jonathan W.

    2009-01-01

    Making discoveries is the most important part of being a scientist, and also the most fun. Young scientists need to develop the experimental and mental skill sets that enable them to make discoveries, including how to recognize and exploit serendipity when it strikes. Here, I provide practical advice to young scientists on choosing a research topic, designing, performing and interpreting experiments and, last but not least, on maintaining your sanity in the process. PMID:18401347

  8. How to succeed in science: a concise guide for young biomedical scientists. Part II: making discoveries.

    PubMed

    Yewdell, Jonathan W

    2008-06-01

    Making discoveries is the most important part of being a scientist, and also the most fun. Young scientists need to develop the experimental and mental skill sets that enable them to make discoveries, including how to recognize and exploit serendipity when it strikes. Here, I provide practical advice to young scientists on choosing a research topic, designing, performing and interpreting experiments and, last but not least, on maintaining your sanity in the process.

  9. Embodiment and Human Development.

    PubMed

    Marshall, Peter J

    2016-12-01

    We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget's theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science.

  10. Embodiment and Human Development

    PubMed Central

    Marshall, Peter J.

    2016-01-01

    We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget’s theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science. PMID:27833651

  11. Global Climate Models for the Classroom: The Educational Impact of Student Work with a Key Tool of Climate Scientists

    NASA Astrophysics Data System (ADS)

    Bush, D. F.; Sieber, R.; Seiler, G.; Chandler, M. A.; Chmura, G. L.

    2017-12-01

    Efforts to address climate change require public understanding of Earth and climate science. To meet this need, educators require instructional approaches and scientific technologies that overcome cultural barriers to impart conceptual understanding of the work of climate scientists. We compared student inquiry learning with now ubiquitous climate education toy models, data and tools against that which took place using a computational global climate model (GCM) from the National Aeronautics and Space Administration (NASA). Our study at McGill University and John Abbott College in Montreal, QC sheds light on how best to teach the research processes important to Earth and climate scientists studying atmospheric and Earth system processes but ill-understood by those outside the scientific community. We followed a pre/post, control/treatment experimental design that enabled detailed analysis and statistically significant results. Our research found more students succeed at understanding climate change when exposed to actual climate research processes and instruments. Inquiry-based education with a GCM resulted in significantly higher scores pre to post on diagnostic exams (quantitatively) and more complete conceptual understandings (qualitatively). We recognize the difficulty in planning and teaching inquiry with complex technology and we also found evidence that lectures support learning geared toward assessment exams.

  12. The discovery of structural form

    PubMed Central

    Kemp, Charles; Tenenbaum, Joshua B.

    2008-01-01

    Algorithms for finding structure in data have become increasingly important both as tools for scientific data analysis and as models of human learning, yet they suffer from a critical limitation. Scientists discover qualitatively new forms of structure in observed data: For instance, Linnaeus recognized the hierarchical organization of biological species, and Mendeleev recognized the periodic structure of the chemical elements. Analogous insights play a pivotal role in cognitive development: Children discover that object category labels can be organized into hierarchies, friendship networks are organized into cliques, and comparative relations (e.g., “bigger than” or “better than”) respect a transitive order. Standard algorithms, however, can only learn structures of a single form that must be specified in advance: For instance, algorithms for hierarchical clustering create tree structures, whereas algorithms for dimensionality-reduction create low-dimensional spaces. Here, we present a computational model that learns structures of many different forms and that discovers which form is best for a given dataset. The model makes probabilistic inferences over a space of graph grammars representing trees, linear orders, multidimensional spaces, rings, dominance hierarchies, cliques, and other forms and successfully discovers the underlying structure of a variety of physical, biological, and social domains. Our approach brings structure learning methods closer to human abilities and may lead to a deeper computational understanding of cognitive development. PMID:18669663
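
    The abstract's central point — that a standard algorithm commits to one structural form in advance — can be made concrete with a toy example. The sketch below is an illustrative pure-Python single-linkage agglomerative clustering on 1-D points (not the authors' graph-grammar model): whatever the data look like, the output is always a binary tree.

```python
# Toy illustration of a single-form structure learner: single-linkage
# agglomerative clustering always returns a binary tree, regardless of
# the structure actually present in the data. Names are illustrative.

def single_linkage_tree(points):
    """Greedily merge the two closest clusters until one tree remains."""
    # Each cluster is (tree, members); a leaf tree is the point itself.
    clusters = [(p, [p]) for p in points]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between the closest members.
                d = min(abs(a - b)
                        for a in clusters[i][1] for b in clusters[j][1])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        merged = ((clusters[i][0], clusters[j][0]),
                  clusters[i][1] + clusters[j][1])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters[0][0]

# Two well-separated groups come out as two subtrees under the root...
tree = single_linkage_tree([1.0, 1.1, 9.0, 9.2])
# ...but ring-like or linear data would be forced into a tree just the same,
# which is the limitation the authors' multi-form model addresses.
```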

  13. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.

  14. The long road to the use of the microscope in clinical medicine in vivo: from early pioneering proposals to the modern perspectives of optical biopsy.

    PubMed

    Ponti, Giovanni; Muscatello, Umberto; Sgantzos, Markos

    2015-01-01

    For a long period, scientists did not recognize the potential of the compound microscope in medicine. Only a few scientists recognized its potential for medicine; among them was G. Campani, who proposed using his microscope to investigate skin lesions directly on the patient. The proposal was illustrated in a letter in the Acta Eruditorum of 1686. The recent development of optical techniques capable of providing in-focus images of an object from different planes with high spatial resolution has significantly increased the diagnostic potential of the microscope directly on the patient.

  15. ESIP’s new ICUC smartphone app - linking citizen scientists to their own places of wonder

    EPA Science Inventory

    The Gulf of Maine Council’s EcoSystem Indicator Partnership (ESIP) was formed in 2006 to look at changes in the health of the Gulf of Maine ecosystem through the use of environmental indicators. ESIP has always recognized the value of datasets generated by citizen scientist...

  16. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  17. NIH Director's Award Recognizes Rapid Response to Avert Potential Health Crisis | Frederick National Laboratory for Cancer Research

    Cancer.gov

    In July 2012, members of a multidisciplinary research team of both SAIC-Frederick and NCI Center for Cancer Research scientists were recognized with the NIH Director’s Award for their outstanding work to rapidly evaluate a potential threat to the n

  18. An International Short Course for Training Professionals as Effective Science Communicators

    ERIC Educational Resources Information Center

    Sarathchandra, Dilshani; Maredia, Karim M.

    2014-01-01

    Scholars have recognized a need for educational programs that prepare scientists, Extension practitioners, and other stakeholders to communicate science effectively. Such programs have the potential to increase public awareness and aid policy development. Having recognized this need, faculty at Michigan State University (MSU) developed an…

  19. Who Believes in the Storybook Image of the Scientist?

    PubMed

    Veldkamp, Coosje L S; Hartgerink, Chris H J; van Assen, Marcel A L M; Wicherts, Jelte M

    2017-01-01

    Do lay people and scientists themselves recognize that scientists are human and therefore prone to human fallibilities such as error, bias, and even dishonesty? In a series of three experimental studies and one correlational study (total N = 3,278) we found that the "storybook image of the scientist" is pervasive: American lay people and scientists from over 60 countries attributed considerably more objectivity, rationality, open-mindedness, intelligence, integrity, and communality to scientists than to other highly-educated people. Moreover, scientists perceived even larger differences than lay people did. Some groups of scientists also differentiated between different categories of scientists: established scientists attributed higher levels of the scientific traits to established scientists than to early-career scientists and Ph.D. students, and higher levels to Ph.D. students than to early-career scientists. Female scientists attributed considerably higher levels of the scientific traits to female scientists than to male scientists. A strong belief in the storybook image and the (human) tendency to attribute higher levels of desirable traits to people in one's own group than to people in other groups may decrease scientists' willingness to adopt recently proposed practices to reduce error, bias and dishonesty in science.

  20. New position at Lamont-Doherty

    NASA Astrophysics Data System (ADS)

    Three scientists at the Lamont-Doherty Geological Observatory have been appointed to the position of Doherty Senior Research Scientist, newly created to recognize members of the observatory's senior research staff who have demonstrated “the highest levels of scholarship, scientific leadership, and promise of continuing excellence.” William Ruddiman, associate director of the observatory's Oceans and Climate Division, Taro Takahashi, associate director of the Geochemistry Division, and Dennis Kent, known for his research in paleomagnetics and rock magnetism, were each appointed to 5-year terms as Doherty Senior Research Scientists.

  1. [History of creation of the doctrine, equipment and methods of formation of biological feedback].

    PubMed

    Bokser, O Ia

    1999-01-01

    The theoretical and experimental priorities of Russian scientists A. V. Zaporozhets and M. I. Lisina in creating the doctrine of biological feedback (BFB) in 1955 are established. The priority of American scientists (N. Miller, 1969) in discovering that BFB can form in animals is recognized. US scientists were also the first to develop and provide a basis for manufacturing commercial devices for shaping BFB, which have gained wide practical recognition in medicine, sports, and psychophysiology.

  2. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments on new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the climate@home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. 
Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data are atomized and distributed to computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the portal's visualization modules. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects of the project. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as forums, blogs, and computing resource monitoring.
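
The task-atomization idea described above can be sketched in a few lines: a parameter sweep is split into independent tasks that volunteer machines process one at a time. This is a minimal, hypothetical illustration only; the toy `worker` function stands in for a real climate model run, and none of the names here come from the actual Climate@Home platform:

```python
from queue import Queue

def atomize(param_grid):
    """Split a model parameter sweep into independent tasks (one per setting)."""
    return [{"task_id": i, "params": p} for i, p in enumerate(param_grid)]

def worker(task):
    """Stand-in for the computing engine on a participant's PC."""
    p = task["params"]
    # Hypothetical 'model': a toy function of the parameters, not a real GCM.
    return {"task_id": task["task_id"], "result": p["sensitivity"] * p["co2_ppm"]}

def run_crowdsourced(param_grid):
    """Queue tasks, let 'engines' drain the queue, then reassemble in order."""
    tasks = Queue()
    for t in atomize(param_grid):
        tasks.put(t)
    results = []
    while not tasks.empty():  # in reality, many engines pull concurrently
        results.append(worker(tasks.get()))
    return sorted(results, key=lambda r: r["task_id"])

grid = [{"sensitivity": s, "co2_ppm": 400} for s in (0.5, 1.0, 1.5)]
print(run_crowdsourced(grid))
```

The key property the sketch illustrates is that each task is self-contained, so it can be shipped to any participant's machine and the results merged afterward.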

  3. If we designed airplanes like we design drugs…

    NASA Astrophysics Data System (ADS)

    Woltosz, Walter S.

    2012-01-01

    In the early days, airplanes were put together with parts designed for other purposes (bicycles, farm equipment, textiles, automotive equipment, etc.). They were then flown by their brave designers to see if the design would work—often with disastrous results. Today, airplanes, helicopters, missiles, and rockets are designed in computers in a process that involves iterating through enormous numbers of designs before anything is made. Until very recently, novel drug-like molecules were nearly always made first like early airplanes, then tested to see if they were any good (although usually not on the brave scientists who created them!). The resulting extremely high failure rate is legendary. This article describes some of the evolution of computer-based design in the aerospace industry and compares it with the progress made to date in computer-aided drug design. Software development for pharmaceutical research has been largely entrepreneurial, with only relatively limited support from government and industry end-user organizations. The pharmaceutical industry is still about 30 years behind aerospace and other industries in fully recognizing the value of simulation and modeling and funding the development of the tools needed to catch up.

  4. If we designed airplanes like we design drugs....

    PubMed

    Woltosz, Walter S

    2012-01-01

    In the early days, airplanes were put together with parts designed for other purposes (bicycles, farm equipment, textiles, automotive equipment, etc.). They were then flown by their brave designers to see if the design would work--often with disastrous results. Today, airplanes, helicopters, missiles, and rockets are designed in computers in a process that involves iterating through enormous numbers of designs before anything is made. Until very recently, novel drug-like molecules were nearly always made first like early airplanes, then tested to see if they were any good (although usually not on the brave scientists who created them!). The resulting extremely high failure rate is legendary. This article describes some of the evolution of computer-based design in the aerospace industry and compares it with the progress made to date in computer-aided drug design. Software development for pharmaceutical research has been largely entrepreneurial, with only relatively limited support from government and industry end-user organizations. The pharmaceutical industry is still about 30 years behind aerospace and other industries in fully recognizing the value of simulation and modeling and funding the development of the tools needed to catch up.

  5. Basic instincts

    NASA Astrophysics Data System (ADS)

    Hutson, Matthew

    2018-05-01

    In their adaptability, young children demonstrate common sense, a kind of intelligence that, so far, computer scientists have struggled to reproduce. Gary Marcus, a developmental cognitive scientist at New York University in New York City, believes the field of artificial intelligence (AI) would do well to learn lessons from young thinkers. Researchers in machine learning argue that computers trained on mountains of data can learn just about anything—including common sense—with few, if any, programmed rules. But Marcus says computer scientists are ignoring decades of work in the cognitive sciences and developmental psychology showing that humans have innate abilities—programmed instincts that appear at birth or in early childhood—that help us think abstractly and flexibly. He believes AI researchers ought to include such instincts in their programs. Yet many computer scientists, riding high on the successes of machine learning, are eagerly exploring the limits of what a naïve AI can do. Computer scientists appreciate simplicity and have an aversion to debugging complex code. Furthermore, big companies such as Facebook and Google are pushing AI in this direction. These companies are most interested in narrowly defined, near-term problems, such as web search and facial recognition, in which blank-slate AI systems can be trained on vast data sets and work remarkably well. But in the longer term, computer scientists expect AIs to take on much tougher tasks that require flexibility and common sense. They want to create chatbots that explain the news, autonomous taxis that can handle chaotic city traffic, and robots that nurse the elderly. Some computer scientists are already trying. Such efforts, researchers hope, will result in AIs that sit somewhere between pure machine learning and pure instinct. They will boot up following some embedded rules, but will also learn as they go.

  6. Who Believes in the Storybook Image of the Scientist?

    PubMed Central

    Veldkamp, Coosje L. S.; Hartgerink, Chris H. J.; van Assen, Marcel A. L. M.; Wicherts, Jelte M.

    2017-01-01

    ABSTRACT Do lay people and scientists themselves recognize that scientists are human and therefore prone to human fallibilities such as error, bias, and even dishonesty? In a series of three experimental studies and one correlational study (total N = 3,278) we found that the “storybook image of the scientist” is pervasive: American lay people and scientists from over 60 countries attributed considerably more objectivity, rationality, open-mindedness, intelligence, integrity, and communality to scientists than to other highly-educated people. Moreover, scientists perceived even larger differences than lay people did. Some groups of scientists also differentiated between different categories of scientists: established scientists attributed higher levels of the scientific traits to established scientists than to early-career scientists and Ph.D. students, and higher levels to Ph.D. students than to early-career scientists. Female scientists attributed considerably higher levels of the scientific traits to female scientists than to male scientists. A strong belief in the storybook image and the (human) tendency to attribute higher levels of desirable traits to people in one’s own group than to people in other groups may decrease scientists’ willingness to adopt recently proposed practices to reduce error, bias and dishonesty in science. PMID:28001440

  7. Facilities | Computational Science | NREL

    Science.gov Websites

    Supports technology innovation by providing scientists and engineers the ability to tackle energy challenges, and enables scientists and engineers to take full advantage of advanced computing hardware and software resources.

  8. Image Recognition and Feature Detection in Solar Physics

    NASA Astrophysics Data System (ADS)

    Martens, Petrus C.

    2012-05-01

    The Solar Dynamics Observatory (SDO) data repository will dwarf the archives of all previous solar physics missions put together. NASA recognized early on that the traditional methods of analyzing the data -- solar scientists and grad students in particular analyzing the images by hand -- would simply not work and tasked our Feature Finding Team (FFT) with developing automated feature recognition modules for solar events and phenomena likely to be observed by SDO. Having these metadata available on-line will enable solar scientists to conduct statistical studies involving large sets of events that would be impossible now with traditional means. We have followed a two-track approach in our project: we have been developing some existing task-specific solar feature finding modules to be "pipeline ready" for the stream of SDO data, plus we are designing a few new modules. Secondly, we took it upon ourselves to develop an entirely new "trainable" module that would be capable of identifying different types of solar phenomena starting from a limited number of user-provided examples. Both approaches are now reaching fruition, and I will show examples and movies with results from several of our feature finding modules. In the second part of my presentation I will focus on our “trainable” module, which is the most innovative in character. First, there is the strong similarity between solar and medical X-ray images with regard to their texture, which has allowed us to apply some advances made in medical image recognition. Second, we have found that there is a strong similarity between the way our trainable module works and the way our brain recognizes images. The brain can quickly recognize similar images from key characteristics, just as our code does. We conclude that our approach represents the beginning of a more human-like procedure for computer image recognition.
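
The idea behind a "trainable" module -- classifying image patches from a limited number of user-provided examples based on texture -- can be illustrated with a deliberately crude sketch. The two-number texture descriptor and nearest-neighbor rule below are hypothetical stand-ins, not the FFT's actual algorithm:

```python
import math

def texture_features(patch):
    """Crude texture descriptor: mean and standard deviation of pixel values."""
    flat = [p for row in patch for p in row]
    mean = sum(flat) / len(flat)
    var = sum((p - mean) ** 2 for p in flat) / len(flat)
    return (mean, math.sqrt(var))

def classify(patch, examples):
    """Label a patch by its nearest user-provided example in feature space.
    'Training' is simply supplying a few labeled (features, label) pairs."""
    f = texture_features(patch)
    return min(examples, key=lambda e: math.dist(f, e[0]))[1]

# Hypothetical user-provided examples: flat, dim 'quiet sun' vs. bright,
# high-contrast 'active region' patches (labels invented for illustration).
examples = [
    (texture_features([[0, 1], [1, 0]]), "quiet_sun"),
    (texture_features([[50, 90], [80, 60]]), "active_region"),
]
print(classify([[45, 85], [75, 65]], examples))
```

A real module would use far richer texture statistics and many more examples, but the workflow -- extract features, compare to labeled exemplars, assign the nearest label -- is the same.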

  9. An Analysis of Computer-Mediated Communication between Middle School Students and Scientist Role Models: A Pilot Study.

    ERIC Educational Resources Information Center

    Murfin, Brian

    1994-01-01

    Reports on a study of the effectiveness of computer-mediated communication (CMC) in providing African American and female middle school students with scientist role models. Quantitative and qualitative data gathered by analyzing messages students and scientists posted on a shared electronic bulletin board showed that CMC could be an effective…

  10. Developing the Next Generation of Science Data System Engineers

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Behnke, Jeanne; Durachka, Christopher D.

    2016-01-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies coupled with the current shortage of computer engineers with backgrounds in science has led to the need to develop a career path for science data systems engineers and architects. The current career path, in which undergraduate students study various disciplines such as computer engineering or the physical sciences, generally begins with serving on a development team in any of the disciplines where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and begin to know the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training or graduate level studies in analytics, computational science and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architect experts, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts and lead international projects. 
A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies and because multidiscipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students, junior and senior engineers from various disciplines.

  11. Developing the Next Generation of Science Data System Engineers

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Durachka, C. D.; Behnke, J.

    2015-12-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies coupled with the current shortage of computer engineers with backgrounds in science has led to the need to develop a career path for science data systems engineers and architects. The current career path, in which undergraduate students study various disciplines such as computer engineering or the physical sciences, generally begins with serving on a development team in any of the disciplines where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and begin to know the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training or graduate level studies in analytics, computational science and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architect experts, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts and lead international projects. 
A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies and because multi-discipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students, junior and senior engineers from various disciplines.

  12. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  13. Developing the Inner Scientist: Book Club Participation and the Nature of Science

    ERIC Educational Resources Information Center

    Griffard, Phyllis Baudoin; Mosleh, Tayseer; Kubba, Saad

    2013-01-01

    The leap from science student to scientist involves recognizing that science is a tentative, evolving body of knowledge that is socially constructed and culturally influenced; this is known as The Nature of Science (NOS). The aim of this study was to document NOS growth in first-year premedical students who participated in a science book club as a…

  14. Intersectionality as a Framework for Inclusive Environments

    NASA Astrophysics Data System (ADS)

    Nunez, A. M.

    2016-12-01

    To create more inclusive environments for the advancement of scientific inquiry, it is critical to consider the role of intersectionality. Originating in activism and legal scholarship grounded in the realities of women of color, the concept of intersectionality emphasizes how societal power dynamics shape the differential construction of life opportunities of diverse demographic groups across a variety of social identities, contexts, and historical conditions. Importantly, intersectionality also recognizes that individuals can simultaneously hold privileged and marginalized identities. For example, while white women scientists are less represented in leadership and decision-making positions than their male counterparts, they typically do not experience the marginalization of being mistaken for cleaning staff at their institutions, as many African American and Latina scientists report. Thus, white women are relatively privileged in this context. This case and national survey data demonstrate the critical importance of recognizing that the intersection of racial and gender identities creates complex and multi-faceted challenges for diverse women scientists in navigating the organizational culture of science. Educational research indicates that interventions seeking to create more inclusivity in science should take into account the relationships between various social identities, contexts, and broader historical conditions that affect the advancement of historically underrepresented minority groups. Therefore, this presentation will provide a conceptual framework of intersectionality to guide interventions to encourage all scientists to recognize the distinctive intellectual and social contributions of those from diverse gender, race, class, disability, sexual orientation, and other identity backgrounds. 
It will also address how this framework can be applied to develop programs, policies, and practices that transform organizational cultures to be more inclusive along structural, linguistic, and interpersonal contexts.

  15. From Both Sides, Now: Librarians Team up with Computer Scientist to Deliver Virtual Computer-Information Literacy Instruction

    ERIC Educational Resources Information Center

    Loesch, Martha Fallahay

    2011-01-01

    Two members of the library faculty at Seton Hall University teamed up with a respected professor of mathematics and computer science, in order to create an online course that introduces information literacy both from the perspectives of the computer scientist and from the instruction librarian. This collaboration is unique in that it addresses the…

  16. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  17. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior and to support the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to catch richer provenance to further facilitate collaboration in the science community. 
We have also established a Petri nets-based verification instrument for provenance-based automatic workflow generation and recommendation.
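The activity-tracking approach described above can be sketched in miniature: log each tool invocation as a provenance record, then replay the recorded chain as a workflow. The names below are hypothetical illustrations, not the authors' actual system:

```python
class ProvenanceTracker:
    """Records each tool invocation so the chain can later be replayed as a workflow."""

    def __init__(self):
        self.steps = []  # (step_name, callable) pairs in execution order

    def run(self, name, func, data):
        # Execute the tool and log it as a provenance record.
        self.steps.append((name, func))
        return func(data)

    def replay(self, data):
        # Re-execute the recorded chain on (possibly new) input data.
        for _name, func in self.steps:
            data = func(data)
        return data

# The scientist works interactively; the tracker records behind the scenes.
tracker = ProvenanceTracker()
scaled = tracker.run("scale", lambda d: [v * 2 for v in d], [1, 2, 3])
total = tracker.run("total", sum, scaled)
```

Replaying the recorded steps on the same input reproduces the result, which is the essence of turning ad hoc exploration into a shareable workflow.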

  18. Center for computation and visualization of geometric structures. Final report, 1992 - 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report describes the overall goals and the accomplishments of the Geometry Center of the University of Minnesota, whose mission is to develop, support, and promote computational tools for visualizing geometric structures, for facilitating communication among mathematical and computer scientists and between these scientists and the public at large, and for stimulating research in geometry.

  19. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    PubMed

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information, whether with vast audiences or with small circles of the interested, has value in almost any aspect of social life. But what exactly is that value for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted, and the differences between computer scientists and a general Twitter sample, enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim to provide a deeper understanding of the role and impact of altmetrics in computer science and offer a glimpse of the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional-quality information than the general Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, part information service and part social network service. Overall, the computer scientists' style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.
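As a minimal illustration of the link-analysis step, the hosts behind a set of shared URLs can be tallied with the standard library (the URLs below are made up for the example, not drawn from the study's data):

```python
from urllib.parse import urlparse
from collections import Counter

def domain_counts(urls):
    # Tally the hosts behind a set of shared links.
    return Counter(urlparse(u).netloc.lower() for u in urls)

shared = [
    "https://arxiv.org/abs/1707.00001",
    "http://arxiv.org/abs/1707.00002",
    "https://github.com/example/repo",
]
counts = domain_counts(shared)
```

Grouping by host rather than full URL is what makes the comparison between a scientific community and a general Twitter sample tractable.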

  20. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  1. Human computers: the first pioneers of the information age.

    PubMed

    Grier, D A

    2001-03-01

Before computers were machines, they were people. They were men and women, young and old, well educated and common. They were the workers who convinced scientists that large-scale calculation had value. Long before J. Presper Eckert and John Mauchly built the ENIAC at the Moore School of Electrical Engineering in Philadelphia, or Maurice Wilkes designed the EDSAC at the University of Cambridge, human computers had created the discipline of computation. They developed numerical methodologies and proved them on practical problems. These human computers were not savants or calculating geniuses. Some knew little more than basic arithmetic. A few were near equals of the scientists they served and, in a different time or place, might have become practicing scientists themselves, had they not been barred from a scientific career by their class, education, gender or ethnicity.

  2. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  3. Assessing the Publication Productivity and Impact of Eminent Geoscientists

    NASA Astrophysics Data System (ADS)

    Laird, Jennifer D.; Bell, Robin E.; Pfirman, Stephanie

    2007-09-01

    Publication is a critical component of modern science. By publishing their findings, scientists can ensure that their results are disseminated and substantiated. This brief report analyzes the publication and citation histories of American Geophysical Union (AGU) Fellows to elucidate different styles of productivity in the geoscience community. AGU Fellows are arguably the most eminent Earth scientists, recognized by their peers for their leadership within and outside the community.

  4. AGU Hosts Networking Event for Female Scientists

    NASA Astrophysics Data System (ADS)

    McEntee, Chris

    2013-01-01

    At Fall Meeting this year I had the pleasure of cohosting a new event, a Networking Reception for Early Career Female Scientists and Students, with Jane Lubchenco, under secretary of Commerce for Oceans and Atmosphere and National Oceanic and Atmospheric Administration administrator, and Marcia McNutt, director of the U.S. Geological Survey. AGU recognizes the importance of having a diverse pool of new researchers who can enrich Earth and space sciences with their skills and innovation. That's why one of our four strategic goals is to help build the global talent pool and provide early-career scientists with networking opportunities like this one.

  5. Building place-based collaborations to develop high school students' groundwater systems knowledge and decision-making capacity

    NASA Astrophysics Data System (ADS)

    Podrasky, A.; Covitt, B. A.; Woessner, W.

    2017-12-01

The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM + Computing (STEM+C) Partnerships program seeks to integrate computational approaches into K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data-based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop science instruction that is rich in place- and data-based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among university scientists, researchers and educators, high school teachers, and agency and industry scientists and engineers. In particular, we discuss three characteristics of effective collaborative design for developing and implementing place- and data-based science education that supports students in building socio-scientific and computational literacy sufficient for making decisions about real-world issues such as groundwater contamination. These characteristics are that science education experiences be real, responsive/accessible, and rigorous.

  6. Patent Law for Computer Scientists

    NASA Astrophysics Data System (ADS)

    Closa, Daniel; Gardiner, Alex; Giemsa, Falk; Machek, Jörg

More than five centuries ago, the first patent statute was passed by the Venetian senate. It already had most of the features of modern patent law, recognizing the public interest in innovation and granting exclusive rights in exchange for full disclosure. Some 350 years later, the industrial revolution led to globalisation. The wish to protect intellectual property at a more international level evolved, and supranational treaties were negotiated. Patent laws are still different in many countries, however, and inventors are sometimes at a loss to understand which basic requirements should be satisfied if an invention is to be granted a patent. This is particularly true for inventions implemented on a computer. While roughly a third of all applications (and granted patents) relate, in one way or another, to a computer, applications where the innovation mainly resides in software or in a business method are treated differently by the major patent offices. The procedures at the USPTO, JPO and EPO and, in particular, the differences in the treatment of applications centring on software are briefly explained. In later sections of this book, a wealth of examples is presented, and the methodology behind the treatment of these examples is explained.

  7. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    NASA Astrophysics Data System (ADS)

    Scogin, Stephen C.

    2016-06-01

    PlantingScience is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific factors contributing to the program's effectiveness in engaging students. Using multiple data sources, grounded theory (Strauss and Corbin in Basics of qualitative research. Sage, Newbury Park, 1990) was used to develop a conceptual model identifying the central phenomenon, causal conditions, intervening conditions, strategies, contexts, and student outcomes of the project. Student motivation was determined to be the central phenomenon explaining the success of the program, with student empowerment, online mentor interaction, and authenticity of the scientific experiences serving as causal conditions. Teachers contributed to student motivation by giving students more freedom, challenging students to take projects deeper, encouraging, and scaffolding. Scientists contributed to student motivation by providing explanations, asking questions, encouraging, and offering themselves as partners in the inquiry process. Several positive student outcomes of the program were uncovered and included increased positivity, greater willingness to take projects deeper, better understanding of scientific concepts, and greater commitments to collaboration. The findings of this study provide relevant information on how to develop curriculum, use technology, and train practitioners and mentors to utilize strategies and actions that improve learners' motivation to engage in authentic science in the classroom.

  8. Recognizing occupational effects of diacetyl: What can we learn from this history?

    PubMed Central

    Kreiss, Kathleen

    2017-01-01

    For half of the 30-odd years that diacetyl-exposed workers have developed disabling lung disease, obliterative bronchiolitis was unrecognized as an occupational risk. Delays in its recognition as an occupational lung disease are attributable to the absence of a work-related temporal pattern of symptoms; failure to recognize clusters of cases; complexity of exposure environments; and absence of epidemiologic characterization of workforces giving rise to case clusters. Few physicians are familiar with this rare disease, and motivation to investigate the unknown requires familiarity with what is known and what is anomalous. In pursuit of the previously undescribed risk, investigators benefited greatly from multi-disciplinary collaboration, in this case including physicians, epidemiologists, environmental scientists, toxicologists, industry representatives, and worker advocates. In the 15 years since obliterative bronchiolitis was described in microwave popcorn workers, α-dicarbonyl-related lung disease has been found in flavoring manufacturing workers, other food production workers, diacetyl manufacturing workers, and coffee production workers, alongside case reports in other industries. Within the field of occupational health, impacts include new ventures in public health surveillance, attention to spirometry quality for serial measurements, identifying other indolent causes of obliterative bronchiolitis apart from accidental over-exposures, and broadening the spectrum of diagnostic abnormalities in the disease. Within toxicology, impacts include new attention to appropriate animal models of obliterative bronchiolitis, pertinence of computational fluid dynamic-physiologically based pharmacokinetic modeling, and contributions to mechanistic understanding of respiratory epithelial necrosis, airway fibrosis, and central nervous system effects. 
In these continuing efforts, collaboration between laboratory scientists, clinicians, occupational public health practitioners in government and industry, and employers remains critical for improving the health of workers inhaling volatile α-dicarbonyl compounds. PMID:27326900

  9. Recognizing occupational effects of diacetyl: What can we learn from this history?

    PubMed

    Kreiss, Kathleen

    2017-08-01

    For half of the 30-odd years that diacetyl-exposed workers have developed disabling lung disease, obliterative bronchiolitis was unrecognized as an occupational risk. Delays in its recognition as an occupational lung disease are attributable to the absence of a work-related temporal pattern of symptoms; failure to recognize clusters of cases; complexity of exposure environments; and absence of epidemiologic characterization of workforces giving rise to case clusters. Few physicians are familiar with this rare disease, and motivation to investigate the unknown requires familiarity with what is known and what is anomalous. In pursuit of the previously undescribed risk, investigators benefited greatly from multi-disciplinary collaboration, in this case including physicians, epidemiologists, environmental scientists, toxicologists, industry representatives, and worker advocates. In the 15 years since obliterative bronchiolitis was described in microwave popcorn workers, α-dicarbonyl-related lung disease has been found in flavoring manufacturing workers, other food production workers, diacetyl manufacturing workers, and coffee production workers, alongside case reports in other industries. Within the field of occupational health, impacts include new ventures in public health surveillance, attention to spirometry quality for serial measurements, identifying other indolent causes of obliterative bronchiolitis apart from accidental over-exposures, and broadening the spectrum of diagnostic abnormalities in the disease. Within toxicology, impacts include new attention to appropriate animal models of obliterative bronchiolitis, pertinence of computational fluid dynamic-physiologically based pharmacokinetic modeling, and contributions to mechanistic understanding of respiratory epithelial necrosis, airway fibrosis, and central nervous system effects. 
In these continuing efforts, collaboration between laboratory scientists, clinicians, occupational public health practitioners in government and industry, and employers remains critical for improving the health of workers inhaling volatile α-dicarbonyl compounds. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  11. Utilizing Professional Vision in Supporting Preservice Teachers' Learning About Contextualized Scientific Practices. Collaborative Discourse Practices Between Teachers and Scientists

    NASA Astrophysics Data System (ADS)

    Sezen-Barrie, Asli

    2018-03-01

Drawn from cultural-historical theories of knowing and doing science, this article uses the concept of professional vision to explore what scientists and experienced teachers see and articulate as important aspects of climate science practices. The study takes an abductive reasoning approach, analyzing scientists' videotaped lectures to recognize what scientists pay attention to in their explanations of climate science practices. It then analyzes how the ideas the scientists attended to align with experienced teachers' sense-making of scientific practices for teaching climate change. The findings show that experienced teachers' and scientists' explanations aligned in their focus on scientific practices, but varied in their temporal and spatial reasoning about climate data. Furthermore, the interdisciplinarity of climate science was emphasized in climate scientists' lectures, but was not apparent once scientists and teachers shared the same culture in meetings to provide feedback to preservice teachers. Given the importance of teaching through scientific practices in classrooms, this study provides suggestions for capturing the epistemic diversity of scientific disciplines.

  12. Reply to Comments on “AGU Statement: Investigation of Scientists and Officials in L'Aquila, Italy, Is Unfounded”

    NASA Astrophysics Data System (ADS)

    McPhaden, Michael

    2010-10-01

    It is critical to recognize the benefits and limitations of scientific knowledge, particularly when it comes to predicting hazards. I agree with G. J. Wasserburg that AGU should help scientists communicate their work accurately and understandably so it can provide the greatest value to society. This objective is explicit in AGU's new strategic plan (http://www.agu.org/about/strategic_plan.shtml) and is consistent with our vision of both advancing and communicating Earth and space science to ensure a sustainable future. We as a community have an obligation to increase the role of science in informing policy to mitigate the impacts of natural disasters. Such efforts require an open exchange of ideas and information and a clear understanding of the limitations of our knowledge. In response to Flavio Dobran, I agree that scientists are not above the law and, like all citizens, must be held accountable for their actions. However, laws and lawmakers must also recognize what science can and cannot do. We cannot yet reliably predict precisely when earthquakes will occur.

  13. First AGU Climate Communication Prize awarded

    NASA Astrophysics Data System (ADS)

    McEntee, Christine

    2012-02-01

Gavin Schmidt, a climate scientist at the NASA Goddard Institute for Space Studies and cofounder of the RealClimate blog (http://www.realclimate.org/), received the first AGU Climate Communication Prize at the honors ceremony. The prize recognizes excellence in climate communication as well as the promotion of scientific literacy, clarity of messaging, and efforts to foster respect and understanding for science-based values related to climate change. Sponsored by Nature's Own, a Boulder, Colo.-based company specializing in the sale of minerals, fossils, and decorative stone specimens, the prize comes with a $25,000 cash award. "AGU created this award to raise the visibility of climate change as a critical issue facing the world today, to demonstrate our support for scientists who commit themselves to the effective communication of climate change science, and to encourage more scientists to engage with the public and policy makers on how climate research can contribute to the sustainability of our planet," said AGU president Michael McPhaden. "That's why we are so pleased to recognize Gavin for his dedicated leadership and outstanding scientific achievements. We hope that his work will serve as an inspiration for others."

  14. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that depend directly on their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists can research are often limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  15. Automated Recognition of Geologically Significant Shapes in MER PANCAM and MI Images

    NASA Technical Reports Server (NTRS)

    Morris, Robert; Shipman, Mark; Roush, Ted L.

    2004-01-01

Autonomous recognition of scientifically important information provides the capability of: 1) prioritizing data return; 2) intelligent data compression; and 3) reactive behavior onboard robotic vehicles. Such capabilities are desirable as mission scenarios extend to longer durations with decreasing interaction from mission control. To address such issues, we have implemented several computer algorithms, intended to autonomously recognize morphological shapes of scientific interest, within a software architecture envisioned for future rover missions. The Mars Exploration Rover (MER) instrument payloads include a Panoramic Camera (PANCAM) and Microscopic Imager (MI). These provide a unique opportunity to evaluate our algorithms when applied to data obtained from the surface of Mars. Early in the mission, we applied our algorithms to images available at the mission web site (http://marsrovers.jpl.nasa.gov/gallery/images.html), even though these are not at full resolution. Some algorithms would normally use ancillary information, e.g. camera pointing and the position of the sun, but these data were not readily available. The initial results of applying our algorithms to the PANCAM and MI images are encouraging. The horizon is recognized in all images containing it; such information could be used to eliminate unwanted areas from an image prior to data transmission to Earth. Additionally, several rocks were identified that represent targets for the Miniature Thermal Emission Spectrometer (Mini-TES). Our algorithms also recognized the layering identified by mission scientists. Such information could be used to prioritize data return or in decision-making about future rover activities. The spherules seen in MI images were also autonomously recognized. Our results indicate that reliable recognition of scientifically relevant morphologies in images is feasible.
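A horizon detector of the kind described can be sketched as a per-column search for the strongest downward brightness step (sky is typically brighter than ground). This is a generic illustration under that assumption, not the authors' actual algorithm:

```python
def find_horizon(image):
    """For each column of a grayscale image (a list of rows of pixel
    intensities), return the row index just above the largest downward
    brightness step, a crude proxy for the sky/ground boundary."""
    horizon = []
    for c in range(len(image[0])):
        best_row, best_drop = 0, float("-inf")
        for r in range(len(image) - 1):
            drop = image[r][c] - image[r + 1][c]
            if drop > best_drop:
                best_drop, best_row = drop, r
        horizon.append(best_row)
    return horizon

# Bright "sky" over dark "ground": the boundary sits below row 1 in every column.
sky_ground = [[200] * 4, [200] * 4, [50] * 4, [50] * 4]
```

Everything above the detected boundary could then be cropped before transmission, as the abstract suggests.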

  16. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem: developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of. In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration between domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new ones. The synergy can create vibrant collaborations, potentially leading to new science insights as well as the development of new data technologies and systems. The area of interface between the geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  17. CREASE 6.0 Catalog of Resources for Education in Ada and Software Engineering

    DTIC Science & Technology

    1992-02-01

Programming, Software Engineering, Strong Typing, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J. Programming in Ada, 3rd ed. Addison-Wesley...Ada. Concepts: Abstract Data Types, Management Overview, Package, Real-Time Programming, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J

  18. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

Today’s genomic experiments have to process the so-called “biological big data” that is now reaching terabyte and petabyte scale. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of these data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to have the expertise to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic literature review of the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments in order to benefit from parallelism techniques and HPC capabilities. PMID:26604801
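As a toy illustration of the parallelism techniques surveyed, a per-sequence statistic such as GC content can be fanned out across a worker pool. This is a minimal sketch, not drawn from any of the surveyed solutions:

```python
from concurrent.futures import ThreadPoolExecutor

def gc_content(seq):
    # Fraction of G/C bases, a common per-sequence genomics statistic.
    return (seq.count("G") + seq.count("C")) / len(seq)

def parallel_gc(sequences, workers=4):
    # Fan the independent per-sequence work out across a pool. For CPU-bound
    # real workloads, a process pool or an HPC batch scheduler would take the
    # place of these threads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(gc_content, sequences))
```

The pattern generalizes: any embarrassingly parallel per-record computation over a large sequence collection maps onto the same pool-and-map structure.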

  19. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

Today's genomic experiments have to process the so-called "biological big data" that is now reaching terabyte and petabyte scale. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of these data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to have the expertise to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic literature review of the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments in order to benefit from parallelism techniques and HPC capabilities.

  20. Preface

    NASA Astrophysics Data System (ADS)

    Jung, Young Mee; Baranska, Malgorzata

    2018-05-01

This special issue of Spectrochimica Acta Part A is dedicated to Professor Yukihiro Ozaki of Kwansei Gakuin University, Japan, on the occasion of his retirement. He is an internationally recognized scientist in molecular spectroscopy, including vibrational and electronic spectroscopy.

  1. An economic and financial exploratory

    NASA Astrophysics Data System (ADS)

    Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.

    2012-11-01

This paper describes the vision of a European Exploratory for economics and finance, built by an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers who will combine their expertise to address the enormous challenges of the 21st century. This academic public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities, and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists collaborating to investigate major issues in economics and finance. It is also intended as a cradle for training and for collaboration with the private sector, to spur spin-offs and job creation in Europe's finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of national economies in their various components, (ii) use, extend and develop a large variety of methods, including data mining, process mining, computational and artificial intelligence, and other computer science and complexity science techniques, coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analysis, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.

  2. Award-Winning Animation Helps Scientists See Nature at Work | News | NREL

    Science.gov Websites

    Scientists See Nature at Work, August 8, 2008. A computer-aided image combines a photo of a man with a three-dimensional, computer-generated image. [Remainder of the page text is fragmentary: "It is very difficult to parallelize the process to run even on a huge computer," ...]

  3. From Years of Work in Psychology and Computer Science, Scientists Build Theories of Thinking and Learning.

    ERIC Educational Resources Information Center

    Wheeler, David L.

    1988-01-01

    Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is largely unavailable. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-generation computing hardware architectures such as quantum and neuromorphic computing. It lets computational scientists efficiently offload classically intractable work to attached accelerators through user-friendly kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.
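    The host/accelerator kernel-offloading pattern this abstract describes can be sketched in a few lines of plain Python. This is not the actual XACC API; the `Accelerator` class and `kernel` decorator below are hypothetical stand-ins for illustration only:

```python
# Illustrative sketch of the hybrid host/accelerator "kernel" pattern
# described for XACC. The names (Accelerator, kernel) are hypothetical,
# not the real XACC API.

class Accelerator:
    """Stand-in for an attached accelerator (quantum, neuromorphic, ...)."""
    def __init__(self, name):
        self.name = name

    def execute(self, func, *args):
        # A real backend would compile `func` for its hardware;
        # this stub simply runs it classically.
        return func(*args)

def kernel(accelerator):
    """Decorator marking a function for offload to `accelerator`."""
    def wrap(func):
        def run(*args):
            return accelerator.execute(func, *args)
        return run
    return wrap

qpu = Accelerator("quantum-sim")

@kernel(qpu)
def hard_subproblem(xs):
    # Classically tractable stand-in for work the host program
    # would hand off to the accelerator.
    return sum(x * x for x in xs)

print(hard_subproblem([1, 2, 3]))  # -> 14
```

    The point of the pattern is that the host application stays conventional; only functions marked as kernels are routed to the attached device.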

  5. 76 FR 34977 - Science Advisory Board Staff Office Notification of a Public Meeting of the Science Advisory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... of nationally and internationally recognized scientists and engineers with demonstrated expertise and..., invasive species, water chemistry, environmental engineering, environmental monitoring, and environmental...

  6. Incentives to Encourage Scientific Web Contribution (Invited)

    NASA Astrophysics Data System (ADS)

    Antunes, A. K.

    2010-12-01

    We suggest improvements to citation standards and creation of remuneration opportunities to encourage career scientist contributions to Web2.0 and social media science channels. At present, agencies want to accomplish better outreach and engagement with no funding, while scientists sacrifice their personal time to contribute to web and social media sites. Securing active participation by scientists requires career recognition of the value scientists provide to web knowledge bases and to the general public. One primary mechanism to encourage participation is citation standards, which let a contributor improve their reputation in a quantifiable way. But such standards must be recognized by their scientific and workplace communities. Using case studies such as the acceptance of web in the workplace and the growth of open access journals, we examine what agencies and individual can do as well as the time scales needed to secure increased active contribution by scientists. We also discuss ways to jumpstart this process.

  7. The Ethical Challenges of Socially Responsible Science

    PubMed Central

    Resnik, David B.; Elliott, Kevin C.

    2015-01-01

    Social responsibility is an essential part of the responsible conduct of research that presents difficult ethical questions for scientists. Recognizing one’s social responsibilities as a scientist is an important first step toward exercising social responsibility, but it is only the beginning, since scientists may confront difficult value questions when deciding how to act responsibly. Ethical dilemmas related to socially responsible science fall into at least three basic categories: 1) dilemmas related to problem selection, 2) dilemmas related to publication and data sharing, and 3) dilemmas related to engaging society. In responding to these dilemmas, scientists must decide how to balance their social responsibilities against other professional commitments and how to avoid compromising their objectivity. In this article, we will examine the philosophical and ethical basis of social responsibility in science, discuss some of the ethical dilemmas related to exercising social responsibility, and make five recommendations to help scientists deal with these issues. PMID:26193168

  8. The Ethical Challenges of Socially Responsible Science.

    PubMed

    Resnik, David B; Elliott, Kevin C

    2016-01-01

    Social responsibility is an essential part of the responsible conduct of research that presents difficult ethical questions for scientists. Recognizing one's social responsibilities as a scientist is an important first step toward exercising social responsibility, but it is only the beginning, since scientists may confront difficult value questions when deciding how to act responsibly. Ethical dilemmas related to socially responsible science fall into at least three basic categories: 1) dilemmas related to problem selection, 2) dilemmas related to publication and data sharing, and 3) dilemmas related to engaging society. In responding to these dilemmas, scientists must decide how to balance their social responsibilities against other professional commitments and how to avoid compromising their objectivity. In this article, we will examine the philosophical and ethical basis of social responsibility in science, discuss some of the ethical dilemmas related to exercising social responsibility, and make five recommendations to help scientists deal with these issues.

  9. Working with and promoting early career scientists within a larger community

    NASA Astrophysics Data System (ADS)

    Pratt, K.

    2017-12-01

    For many scientific communities, engaging early career researchers is critical for success. These young scientists (graduate students, postdocs, and newly appointed professors) are actively forming collaborations and instigating new research programs. They also stand to benefit hugely from being part of a scientific community, gaining access to career development activities, becoming part of strong collaborator networks, and achieving recognition in their field of study — all of which will help their professional development. There are many ways community leaders can work proactively to support and engage early career scientists, and it is often a community manager's job to work with leadership to implement such activities. In this presentation, I will outline ways of engaging early career scientists at events and tailored workshops, of promoting development of their leadership skills, and of creating opportunities for recognizing early career scientists within larger scientific communities. I will draw from my experience working with the Deep Carbon Observatory Early Career Scientist Network, supported by the Alfred P. Sloan Foundation.

  10. "Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2

    ScienceCinema

    Catlett, Charlie

    2018-02-14

    A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!

  11. "Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catlett, Charlie

    2014-06-17

    A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!

  12. 77 FR 50505 - Science Advisory Board Staff Office Request for Nominations of Experts for the SAB Hydraulic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ... and internationally recognized scientists and engineers having experience and expertise related to...; geochemistry and analytical chemistry; environmental monitoring; conducting laboratory and/or field-based...

  13. Icy Layers in Craters

    NASA Image and Video Library

    2018-02-20

    In this image from NASA's Mars Reconnaissance Orbiter (MRO) we can see the edge of a mound of ice in one of these mid-latitude craters. Some of the ice has already been removed, exposing layering that used to be in the crater's interior. Scientists use ice deposits like these to figure out how the climate has changed on Mars. Another upside of recognizing this ice is that future astronauts will have plenty of drinking water. Scientists now realize that ice is very common on the Martian surface. It often fills craters and valleys in the mid-latitudes, deposited under older climates, although when it is covered in dust it can be hard to recognize. Today's climate on Mars makes this ice unstable, and some of it has sublimated away. https://photojournal.jpl.nasa.gov/catalog/PIA22255

  14. From Lived Experiences to Game Creation: How Scaffolding Supports Elementary School Students Learning Computer Science Principles in an After School Setting

    ERIC Educational Resources Information Center

    Her Many Horses, Ian

    2016-01-01

    The world, and especially our own country, is in dire need of a larger and more diverse population of computer scientists. While many organizations have approached this problem of too few computer scientists in various ways, a promising, and I believe necessary, path is to expose elementary students to authentic practices of the discipline.…

  15. Nurturing reliable and robust open-source scientific software

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article.
A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.
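    The automated-testing practice mentioned in this abstract can be as lightweight as shipping assertions alongside the scientific routine, so a continuous-integration service can run them on every commit. A minimal sketch (function names are illustrative):

```python
# Minimal illustration of automated testing for scientific code:
# a small routine plus tests a CI service would run on every commit.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    R = 6371.0  # mean Earth radius, km
    dp = radians(lat2 - lat1)
    dl = radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

def test_zero_distance():
    # Identical points must be exactly zero kilometres apart.
    assert haversine_km(10.0, 20.0, 10.0, 20.0) == 0.0

def test_pole_to_equator():
    # Pole to equator along a meridian is a quarter of the circumference.
    quarter = 6371.0 * 3.141592653589793 / 2
    assert abs(haversine_km(90.0, 0.0, 0.0, 0.0) - quarter) < 1.0

test_zero_distance()
test_pole_to_equator()
```

    Under version control, tests like these catch regressions automatically; the scientist never has to re-verify old results by hand.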

  16. Thriving Earth Exchange: AGU's new grand challenge

    NASA Astrophysics Data System (ADS)

    McEntee, Chris; Williams, Billy

    2012-11-01

    Imagine a world where scientists are acknowledged and celebrated for their good works. Imagine being able to make a powerful impact, applying your expertise and experience to create real solutions to ensure a sustainable future. Now imagine those two ideas linked: AGU scientists engaged in creating solutions that are recognized and celebrated for their positive impact on our world. The AGU Grand Challenge: Thriving Earth Exchange, a new idea from our member leaders, is about making this dream real.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayer, Vidya M.; Miguez, Sheila; Toby, Brian H.

    Scientists have been central to the historical development of the computer industry, but the importance of software only continues to grow for all areas of scientific research, and in particular for powder diffraction. Knowing how to program a computer is a basic and useful skill for scientists. The article introduces the three types of programming languages and explains why scripting languages are now preferred for scientists. Of these, the authors assert Python is the most useful and the easiest to learn. Python is introduced, along with an overview of a few of the many add-on packages available to extend its capabilities, for example for numerical computation, scientific graphics and graphical user interface programming.
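    A small taste of why scripting languages suit scientific work, in the spirit of this article: reading, transforming and summarizing data takes only a few lines. This example uses only the standard library (the data values are made up); add-on packages such as NumPy extend the same style to fast array math:

```python
# Read, transform, and summarize tabular data in a few lines of Python.
# Pure standard library; example data values are made up.

import csv, io, statistics

raw = """angle_deg,intensity
10.1,120
10.2,340
10.3,290
"""

rows = list(csv.DictReader(io.StringIO(raw)))
intensities = [float(r["intensity"]) for r in rows]

# Locate the row with the strongest signal.
peak = max(rows, key=lambda r: float(r["intensity"]))

print("mean intensity:", statistics.mean(intensities))  # -> mean intensity: 250.0
print("peak at angle:", peak["angle_deg"])              # -> peak at angle: 10.2
```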

  18. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core-hours of science. The research conducted at this leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. This success was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries, and went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts.
    In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, this DOE-funded initiative will explore which science application programming models work well within the cloud, and evaluate the challenges that come with the new paradigm. The ALCF also obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012, which will allow ever more pressing problems to be resolved, even more expeditiously, through breakthrough science in the years to come.

  19. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves digital capture of data direct from instruments, processing it by computers, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, at greater resolution, than previously possible. The new cloud-computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is opening the processing of high volumes of data to an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These new technology developments require that scientists become more skilled in data management and/or attain a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals as well as knowledge of some ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'.
Data scientists are a critical element of many large scale earth and space science informatics projects, particularly those that are tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and are not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their career advances. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  20. Two Students Win AGU Scholarships

    NASA Astrophysics Data System (ADS)

    Howard, Claire

    2013-11-01

    AGU is pleased to announce the winners of two scholarships. Marc Neveu is the recipient of the 2013 David S. Miller Young Scientist Scholarship, which recognizes a student of the Earth sciences whose academic work exhibits interest and promise. Hima Hassenruck-Gudipati is the 2013 recipient of the David E. Lumley Scholarship, which recognizes a high-achieving student who is working on problems of global importance in the energy and environmental sectors of industry and academia.

  1. How to Cloud for Earth Scientists: An Introduction

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2018-01-01

    This presentation is a tutorial on getting started with cloud computing for working with Earth Observation datasets. We first discuss some of the main advantages that cloud computing can provide for the Earth scientist: copious processing power, immense and affordable data storage, and rapid startup time. We also talk about some of the challenges of getting the most out of cloud computing, such as re-organizing the way data are analyzed and handling node failures.

  2. The Demise of the Synapse As the Locus of Memory: A Looming Paradigm Shift?

    PubMed

    Trettenbrein, Patrick C

    2016-01-01

    Synaptic plasticity is widely considered to be the neurobiological basis of learning and memory by neuroscientists and researchers in adjacent fields, though diverging opinions are increasingly being recognized. From the perspective of what we might call "classical cognitive science" it has always been understood that the mind/brain is to be considered a computational-representational system. Proponents of the information-processing approach to cognitive science have long been critical of connectionist or network approaches to (neuro-)cognitive architecture, pointing to the shortcomings of the associative psychology that underlies Hebbian learning as well as to the fact that synapses are practically unfit to implement symbols. Recent work on memory has been adding fuel to the fire and current findings in neuroscience now provide first tentative neurobiological evidence for the cognitive scientists' doubts about the synapse as the (sole) locus of memory in the brain. This paper briefly considers the history and appeal of synaptic plasticity as a memory mechanism, followed by a summary of the cognitive scientists' objections regarding these assertions. Next, a variety of tentative neuroscientific evidence that appears to substantiate questioning the idea of the synapse as the locus of memory is presented. On this basis, a novel way of thinking about the role of synaptic plasticity in learning and memory is proposed.
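    The Hebbian learning this abstract critiques is, in its simplest form, the rule that a synaptic weight grows when presynaptic and postsynaptic activity coincide (Δw = η·x·y). A few lines of Python make the associative character of the rule concrete (variable names are illustrative):

```python
# The basic Hebbian rule: a weight w between two units grows when
# presynaptic activity x and postsynaptic activity y coincide.
# delta_w = eta * x * y, with learning rate eta.

def hebbian_update(w, x, y, eta=0.1):
    return w + eta * x * y

w = 0.0
# Binary activity pairs: only coincident (1, 1) events change the weight.
for x, y in [(1, 1), (1, 1), (1, 0), (0, 1)]:
    w = hebbian_update(w, x, y)

print(w)  # 0.2: only the two coincident events strengthened the weight
```

    The rule's purely associative form, strengthening whatever happens to co-occur, is exactly the property the information-processing critics cited above object to.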

  3. Real Science, Real Learning: Bridging the Gap Between Scientists, Educators and Students

    NASA Astrophysics Data System (ADS)

    Lewis, Y.

    2006-05-01

    Today as never before, America needs its citizens to be literate in science and technology. Not only must we inspire a new generation of scientists, engineers, and technologists; we must also foster a society capable of meeting complex, 21st-century challenges. Unfortunately, the need for creative, flexible thinkers is growing at a time when our young students are lagging in science interest and performance. Over the past 17 years, the JASON Project has worked to link real science and scientists to the classroom. This link provides a viable pipeline for creating the next generation of scientists and researchers. Ultimately, JASON's mission is to improve the way science is taught by enabling students to learn directly from leading scientists. Through partnerships with agencies such as NOAA and NASA, JASON creates multimedia classroom products based on current scientific research. Broadcasts of science expeditions, hosted by leading researchers, are coupled with classroom materials that include interactive computer-based simulations, video-on-demand, inquiry-based experiments and activities, and print materials for students and teachers. A "gated" Web site hosts online resources and provides a secure platform to network with scientists and other classrooms in a nationwide community of learners. Each curriculum is organized around a specific theme for a comprehensive learning experience. It may be taught as a complete package, or individual components can be selected to teach specific, standards-based concepts. Such thematic units include: Disappearing Wetlands, Mysteries of Earth and Mars, and Monster Storms. All JASON curriculum units are grounded in "inquiry-based learning." The highly interactive curriculum enables students to access current, real-world scientific research and employ the scientific method through reflection, investigation, identification of problems, sharing of data, and forming and testing hypotheses.
JASON specializes in effectively applying technology in science education by designing animated interactive visualizations that promote student understanding of complex scientific concepts and systems (Rieber, 1990, 1996). Independent evaluations of JASON's multimedia science curriculum (Ba et al., 2001; Goldenberg et al., 2003) have widely recognized its use of simulation technology as effective in exciting and engaging students in science learning. The data collected indicate that JASON's science products have had a positive impact on students' science learning, have positively influenced their perceptions of scientists and of becoming scientists, and have helped diverse students grasp a deeper understanding of complex scientific content, concepts and technologies.

  4. The unique contribution of behavioral scientists to medical education: the top ten competencies.

    PubMed

    Sternlieb, Jeffrey L

    2014-01-01

    Understandably, the focus of most physicians is primarily on the biomedical: What is this disease or injury? Behavioral scientists from various disciplines in medical education generally take a broader approach: Who is this person with these symptoms, and what is their story? Since behavioral scientists are often alone among U.S. residency faculty, physicians may fail to recognize the value of their approach to medical resident training. This review identifies and describes the top areas of expertise that behavioral scientists bring to medical education and how their training prepares them to think differently than other medical educators. In identifying each competency, this review emphasizes the ways in which behavioral scientists' skills and techniques exert subtle impact in their teaching encounters, explores ways of targeting that impact, and discusses examples of this impact.

  5. Venus and Mars or Down to Earth: Stereotypes and Realities of Gender Differences.

    PubMed

    Fiske, Susan T

    2010-11-01

    Psychological scientists, like lay people, often think in categorical dichotomies that contrast men and women and exaggerate the differences between groups. These value-laden divides tend to privilege one side over the other, often to the advantage of the scientists' own identity group. Besides balancing perspectives in the academic marketplace of ideas, scientists can recognize the complexity of stigma. Gender, like many categories, entails two fundamental dimensions that characterize intergroup stigma (and all interpersonal perception): perceived warmth and competence. These dimensions identify groups viewed with ambivalence (e.g., traditional women are stereotypically warm but incompetent, whereas professional women are allegedly competent but cold). In gender and in other areas, psychological scientists can go beyond value-laden dichotomies and consider the fundamental, continuous dimensions along which we think about stigma. © The Author(s) 2010.

  6. 78 FR 33416 - Notification of a Public Meeting of the Science Advisory Board Environmental Justice Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-04

    ....federalregister.gov/articles/2013/05/09/2013-11165/technical-guidance-for-assessing-environmental-justice-in... of nationally and internationally recognized scientists to serve on this panel. Additional background...

  7. Two Students Win AGU Scholarships

    NASA Astrophysics Data System (ADS)

    Howard, Claire

    2014-10-01

    AGU is pleased to announce the winners of two student scholarships. Caterina Brighi is the recipient of the 2014 David S. Miller Young Scientist Scholarship, which recognizes a student of the Earth sciences whose academic work exhibits interest and promise.

  8. Weathering the empire: meteorological research in the early British Straits Settlements.

    PubMed

    Williamson, Fiona

    2015-09-01

    This article explores meteorological interest and experimentation in the early history of the Straits Settlements. It centres on the establishment of an observatory in 1840s Singapore and examines the channels that linked the observatory to a global community of scientists, colonial officers and a reading public. It will argue that, although the value of overseas meteorological investigation was recognized by the British government, investment was piecemeal and progress in the field often relied on the commitment and enthusiasm of individuals. In the Straits Settlements, as elsewhere, these individuals were drawn from military or medical backgrounds, rather than trained as dedicated scientists. Despite this, meteorology was increasingly recognized as of fundamental importance to imperial interests. Thus this article connects meteorology with the history of science and empire more fully and examines how research undertaken in British dependencies is revealing of the operation of transnational networks in the exchange of scientific knowledge.

  9. The IT in Secondary Science Book. A Compendium of Ideas for Using Computers and Teaching Science.

    ERIC Educational Resources Information Center

    Frost, Roger

    Scientists need to measure and communicate, to handle information, and model ideas. In essence, they need to process information. Young scientists have the same needs. Computers have become a tremendously important addition to the processing of information through database use, graphing and modeling and also in the collection of information…

  10. 75 FR 64996 - Takes of Marine Mammals Incidental to Specified Activities; Marine Geophysical Survey in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... cruises. A laptop computer is located on the observer platform for ease of data entry. The computer is... lines, the receiving systems will receive the returning acoustic signals. The study (e.g., equipment...-board assistance by the scientists who have proposed the study. The Chief Scientist is Dr. Franco...

  11. Science preparedness and science response: perspectives on the dynamics of preparedness conference.

    PubMed

    Lant, Timothy; Lurie, Nicole

    2013-01-01

    The ability of the scientific modeling community to meaningfully contribute to postevent response activities during public health emergencies was the direct result of a discrete set of preparedness activities as well as advances in theory and technology. Scientists and decision-makers have recognized the value of developing scientific tools (e.g. models, data sets, communities of practice) to prepare them to be able to respond quickly--in a manner similar to preparedness activities by first-responders and emergency managers. Computational models have matured in their ability to better inform response plans by modeling human behaviors and complex systems. We advocate for further development of science preparedness activities as deliberate actions taken in advance of an unpredicted event (or an event with unknown consequences) to increase the scientific tools and evidence-base available to decision makers and the whole-of-community to limit adverse outcomes.

  12. Multiscale Cancer Modeling

    PubMed Central

    Macklin, Paul; Cristini, Vittorio

    2013-01-01

    Simulating cancer behavior across multiple biological scales in space and time, i.e., multiscale cancer modeling, is increasingly being recognized as a powerful tool to refine hypotheses, focus experiments, and enable more accurate predictions. A growing number of examples illustrate the value of this approach in providing quantitative insight on the initiation, progression, and treatment of cancer. In this review, we introduce the most recent and important multiscale cancer modeling works that have successfully established a mechanistic link between different biological scales. Biophysical, biochemical, and biomechanical factors are considered in these models. We also discuss innovative, cutting-edge modeling methods that are moving predictive multiscale cancer modeling toward clinical application. Furthermore, because the development of multiscale cancer models requires a new level of collaboration among scientists from a variety of fields such as biology, medicine, physics, mathematics, engineering, and computer science, an innovative Web-based infrastructure is needed to support this growing community. PMID:21529163

  13. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

    PubMed Central

    Schmitt, Marco

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences, or with small circles of the interested, has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts, and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim to provide a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists’ style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science. PMID:28636619

  14. Analysis of severe storm data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1983-01-01

    The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center is described. The MASS Data Management and Analysis System was successfully implemented and utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. The scientists can interactively process various atmospheric data (Sounding, Single Level, Grid, and Image) using the MASS (AVE80) software, whose applications share common data and user inputs, thereby reducing overhead, optimizing execution time, and enhancing user flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 Computer System, thus providing a significant enhancement to the overall research environment.

  15. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 19: Computer and information technology and aerospace knowledge diffusion

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.

    1992-01-01

    To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.

  16. Introduction to the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, J. L. (Editor); Peters, D. J. (Editor)

    1985-01-01

    The Space Physics Analysis Network or SPAN is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication capability with co-investigators and colleagues, and access to space science data bases and computational facilities. SPAN utilizes up-to-date hardware and software for computer-to-computer communications, allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is not discipline or mission dependent, with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use is provided. It is anticipated that SPAN will grow rapidly over the next few years, not only in the number of network nodes but also in capability, as scientists becoming more proficient in the use of telescience will place greater demands on the network.

  17. Soil microbiology and soil health assessment

    USDA-ARS?s Scientific Manuscript database

    Soil scientists have long recognized the importance of soil biology in ecological health. In particular, soil microbes are crucial for many soil functions including decomposition, nutrient cycling, synthesis of plant growth regulators, and degradation of synthetic chemicals. Currently, soil biologis...

  18. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed and well-trained advanced computing professionals. But, in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even in many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next-generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below with a description of the design of its Boot Camp, and a summary of its success and some key factors that have enabled that success.

  19. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE PAGES

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie; ...

    2016-11-01

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed and well-trained advanced computing professionals. But, in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even in many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next-generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below with a description of the design of its Boot Camp, and a summary of its success and some key factors that have enabled that success.

  20. Most Social Scientists Shun Free Use of Supercomputers.

    ERIC Educational Resources Information Center

    Kiernan, Vincent

    1998-01-01

    Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…

  1. The Unified English Braille Code: Examination by Science, Mathematics, and Computer Science Technical Expert Braille Readers

    ERIC Educational Resources Information Center

    Holbrook, M. Cay; MacCuspie, P. Ann

    2010-01-01

    Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…

  2. Meet EPA Scientist Valerie Zartarian, Ph.D.

    EPA Pesticide Factsheets

    Senior exposure scientist and research environmental engineer Valerie Zartarian, Ph.D. helps build computer models and other tools that advance our understanding of how people interact with chemicals.

  3. Hot, Hot, Hot Computer Careers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1988-01-01

    Discusses the increasing need for electrical, electronic, and computer engineers and scientists. Provides the current status of the computer industry and average salaries. Considers computer chip manufacture and the current chip shortage. (MVL)

  4. Interfacing microbiology and biotechnology. Conference abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maupin, Julia A.

    2001-05-19

    The Interfacing Microbiology and Biotechnology Conference was attended by over 100 faculty, post-docs, students, and research scientists from the US, Europe, and Latin America. The conference successfully stimulated communication and the dissemination of knowledge among scientists involved in basic and applied research. The focus of the conference was on microbial physiology and genetics and included sessions on C1 metabolism, archaeal metabolism, proteases and chaperones, gene arrays, and metabolic engineering. The meeting provided the setting for in-depth discussions between scientists who are internationally recognized for their research in these fields. The following objectives were met: (1) the promotion of interaction and future collaborative projects among scientists involved in basic and applied research which incorporates microbial physiology, genetics, and biochemistry; (2) the facilitation of communication of new research findings through seminars, posters, and abstracts; (3) the stimulation of enthusiasm and education among participants, including graduate and undergraduate students.

  5. Should We All be Scientists? Re-thinking Laboratory Research as a Calling.

    PubMed

    Bezuidenhout, Louise; Warne, Nathaniel A

    2017-07-19

    In recent years there have been major shifts in how the role of science--and of scientists--is understood. The critical examination of scientific expertise within the field of Science and Technology Studies (STS) is increasingly eroding notions of the "otherness" of scientists, seeming to suggest that anyone can be a scientist when provided with the appropriate training and access to data. In contrast, however, ethnographic evidence from the scientific community tells a different story. Scientists are quick to recognize that not everyone can--or should--be a scientist. Appealing to notions such as "good hands" or "gut feelings", scientists narrate a distinction between good and bad scientists that cannot be reduced to education, access, or opportunity. The key to good science is not only that scientists express an intuitive feeling for their discipline, but also that individuals derive considerable personal satisfaction from their work. Discussing this personal joy in--and "fittingness" of--scientific occupations using the fields of STS, ethics, and science policy is highly problematic. In this paper we turn to theological discourse to analyze the notion of "callings" as a means of understanding this issue. Callings highlight the identification and examination of individual talents to determine fit occupations for specific persons. Framing science as a calling represents a novel view of research that places the talents and dispositions of individuals, and their relationship to the community, at the center of flourishing practices.

  6. Tumor suppressor identified as inhibitor of inflammation

    Cancer.gov

    Scientists at NCI have found that a protein, FBXW7, which acts as a tumor suppressor, is also important for reducing the strength of inflammatory pathways. It has long been recognized that a complex interaction exists between cancer-causing mechanisms

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  8. User's guide to the LLL BASIC interpreter. [For 8080-based MCS-80 microcomputer system]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allison, T.; Eckard, R.; Barber, J.

    1977-06-09

    Scientists are finding increased applications for microcomputers as process controllers in their experiments. However, while microcomputers are small and inexpensive, they are difficult to program in machine or assembly language. A high-level language is needed to enable scientists to develop their own microcomputer programs for their experiments on location. Recognizing this need, LLL contracted to have such a language developed. This report describes the result--the LLL BASIC interpreter, which operates with LLL's 8080-based MCS-80 microcomputer system. 4 tables.

  9. An Accidental Scientist: Chance, Failure, Risk-Taking, and Mentoring.

    PubMed

    McGrath, Patrick J

    2018-04-06

    I never intended to become a scientist. My career developed on the basis of chance happenings, repeated failure, the willingness to take risks and the acceptance and provision of mentoring. My career has included periods of difficulty and shifted back and forth between academic health centers and universities in Canada. Although I have been amply recognized for my successes, my greatest learning has come from my failures. My greatest satisfaction has been in the development, evaluation and dissemination of interventions. The combination of intellectual stimulation and emotional gratification has meant a rewarding career.

  10. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  11. Development of a Learning-Oriented Computer Assisted Instruction Designed to Improve Skills in the Clinical Assessment of the Nutritional Status: A Pilot Evaluation

    PubMed Central

    García de Diego, Laura; Cuervo, Marta; Martínez, J. Alfredo

    2015-01-01

    Computer assisted instruction (CAI) is an effective tool for evaluating and training students and professionals. In this article we present a learning-oriented CAI, which has been developed for students and health professionals to acquire and retain new knowledge through practice. A two-phase pilot evaluation was conducted, involving 8 nutrition experts and 30 postgraduate students, respectively. In each training session, the software guides users in the integral evaluation of a patient’s nutritional status and helps them to implement actions. The program incorporates clinical tools that can be used to recognize a patient’s possible needs, to improve clinical reasoning, and to develop professional skills. Among them are assessment questionnaires and evaluation criteria, cardiovascular risk charts, clinical guidelines, and photographs of various diseases. This CAI is a complete, easy-to-use, and versatile software package, aimed at clinical specialists, medical staff, scientists, educators, and clinical students, which can be used as a learning tool. This application constitutes an advanced method for students and health professionals to accomplish nutritional assessments, combining theoretical and empirical issues, which can be implemented in their academic curriculum. PMID:25978456

  12. Development of a learning-oriented computer assisted instruction designed to improve skills in the clinical assessment of the nutritional status: a pilot evaluation.

    PubMed

    García de Diego, Laura; Cuervo, Marta; Martínez, J Alfredo

    2015-01-01

    Computer assisted instruction (CAI) is an effective tool for evaluating and training students and professionals. In this article we present a learning-oriented CAI, which has been developed for students and health professionals to acquire and retain new knowledge through practice. A two-phase pilot evaluation was conducted, involving 8 nutrition experts and 30 postgraduate students, respectively. In each training session, the software guides users in the integral evaluation of a patient's nutritional status and helps them to implement actions. The program incorporates clinical tools that can be used to recognize a patient's possible needs, to improve clinical reasoning, and to develop professional skills. Among them are assessment questionnaires and evaluation criteria, cardiovascular risk charts, clinical guidelines, and photographs of various diseases. This CAI is a complete, easy-to-use, and versatile software package, aimed at clinical specialists, medical staff, scientists, educators, and clinical students, which can be used as a learning tool. This application constitutes an advanced method for students and health professionals to accomplish nutritional assessments, combining theoretical and empirical issues, which can be implemented in their academic curriculum.

  13. Beyond "Hitting the Books"

    ERIC Educational Resources Information Center

    Entress, Cole; Wagner, Aimee

    2014-01-01

    Scientists, science teachers, and serious students recognize that success in science classes requires consistent practice--including study at home. Whether balancing chemical equations, calculating angular momentum, or memorizing the steps of cell division, students must review material repeatedly to fully understand new ideas--and must practice…

  14. Recognized Leader in Electrochemical Purification

    ScienceCinema

    Hoppe, Eric

    2018-01-16

    PNNL scientists developed an electrochemical method for purifying copper, a key material that makes possible radiation detection systems of unprecedented sensitivity. The method begins with the purest copper materials available, and results in the lowest-background copper in the world. Chemist Eric Hoppe explains the process.

  15. Defining Long Term Goals and Setting Priorities for Education and Outreach, 2003 to 2013 - Panel Report

    NASA Astrophysics Data System (ADS)

    Grier, J. A.; Atkinson, D. H.; Barlow, N.; Griffin, I.; Hoffman, J.; Kelly-Serrato, B.; Kesthelyi, L.; Klein, M.; Klug, S.; Kolvoord, B.; Lanagan, P.; Lebofsky, L. A.; Lindstrom, M.; Lopes, R.; Lowes, L.; Manifold, J.; Mastrapa, R.; Milazzo, M.; Miner, E.; Morris, P.; Runyon, C.; Sohus, A.; Urquhart, M.; Warasila, R. L.; Withers, P.; Wood, Chuck

    2001-11-01

    Education and Public Outreach (E/PO) activities are an integral part of NASA's mandated mission and detailed in its Strategic Plan. The Office of Space Science Solar System Exploration (OSS SSE) E/PO program has made great strides in defining priorities and achieving its goals in the last five years. The Education and Public Outreach panel for NASA's Decadal Survey has generated a list of key issues to be addressed for the years 2003-2013 to assist the OSS SSE in future prioritization and planning. Key issues under discussion include: improving the involvement of planetary science professionals in E/PO activities; combating scientific elitism; examining the association between E/PO programs and public relations; re-examining the funding of E/PO activities from an audience perspective as opposed to a mission-centered perspective; improving access to resources for scientists, educators, students, and partner organizations; promoting communication between educational programs at NASA; and reaching traditionally underrepresented groups, women, minorities, and the disabled with science education programs. The panel is developing a list of specific recommendations to be implemented to improve OSS SSE E/PO activities in the next decade. These recommendations deal with topics such as: the production of evaluated resource web sites for scientists and educators; the development of a policy of long-term funding for the maintenance of web sites and other tools after they are created; methods for reaching those who do not have computer access through television and public programs; and the development of a reward system to recognize and encourage scientist involvement in E/PO activities. Such key issues and recommendations will be presented, along with materials from current programs and initiatives for E/PO in the OSS SSE.

  16. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  17. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  18. Crisis Communication during Natural Disasters: Meeting Real and Perceived Needs

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2017-12-01

    When significant natural disasters strike, our modern information-driven society turns to scientists, demanding information about the event. As part of their civic duty, scientists respond, recognizing how the scientific information could be used to improve response to the disaster and reduce losses. However, what we often find is that the demand for information serves not to improve response but to satisfy psychological, often subconscious needs. Human beings evolved our larger brains to better survive against larger and stronger predators. Recognizing that a movement of grass and the lack of birdsong mean that a predator is hiding would in turn mean a greater likelihood of having progeny. Our ability to theorize comes from the need to create patterns in the face of danger that will keep us safe. From wondering about someone's exercise habits when we hear they have had a heart attack, to blaming hurricane victims for not heeding evacuation orders even if they had no means to evacuate, we respond to disasters by trying to make a pattern that means that we will not suffer the same fate. Much of the demand for information after a natural disaster is a search for these patterns. Faced with a random distribution, many people still make patterns that can reduce their anxiety. The result is that meanings are ascribed to the information that are not supported by the data and were not part of the communication as intended by the scientist. The challenge for science communicators is to recognize this need and present the information in a way that both reduces the anxiety arising from a lack of knowledge or uncertainty and makes clear what patterns can or cannot be made about future risks.

  19. Perceived barriers to physician-scientist careers among female undergraduate medical students at the College of Medicine - Alfaisal University: a Saudi Arabian perspective.

    PubMed

    Abu-Zaid, Ahmed; Altinawi, Basmah

    2014-04-01

    At present, only a negligible number of matriculating and graduating female medical students express interest in physician-scientist careers. The aim of this study is to explore the perceived barriers towards pursuing physician-scientist careers among female undergraduate medical students at the College of Medicine - Alfaisal University, Saudi Arabia. An online, anonymous, self-rating survey was administered. The survey assessed students' perceived barriers towards potential physician-scientist careers through responses to typical 5-point Likert scale statements. One hundred sixteen students (116/171) participated in the survey, a 67.8% response rate. The top three barriers to physician-scientist careers were a greater preference for patient care than for research (75%), a lack of conviction that a fruitful research profession can be merged with a satisfying motherhood life (52.6%), and a paucity of recognized, successful, and well-known female physician-scientist role models in the country (48.3%). Our results showed that the barriers to physician-scientist careers perceived by the College of Medicine - Alfaisal University's female undergraduate medical students were largely identical to those reported in the Western literature, with a few differences and a greater influence of cultural reasons. It is crucial for medical educators in Saudi Arabia to work on mechanisms that stimulate female students' interest in research and resolve all barriers that stand in the way of students considering physician-scientist careers.

  20. NUCLEAR ESPIONAGE: Report Details Spying on Touring Scientists.

    PubMed

    Malakoff, D

    2000-06-30

    A congressional report released this week details dozens of sometimes clumsy attempts by foreign agents to obtain nuclear secrets from U.S. nuclear scientists traveling abroad, ranging from offering scientists prostitutes to prying off the backs of their laptop computers. The report highlights the need to better prepare traveling researchers to safeguard secrets and resist such temptations, say the two lawmakers who requested the report and officials at the Department of Energy, which employs the scientists.

  1. Prevention and control of malaria and sleeping sickness in Africa: where are we and where are we going?

    PubMed

    Corbel, Vincent; Henry, Marie-Claire

    2011-03-16

    The International Symposium on Malaria and Human African Trypanosomiasis: New Strategies for their Prevention & Control was held 7-8 October, 2010 in Cotonou, Benin, with about 250 participants from 20 countries. This scientific event aimed to identify the gaps and research priorities in the prevention and control of malaria and sleeping sickness in Africa and to promote exchange between North and South in the fields of medical entomology, epidemiology, immunology and parasitology. A broad range of influential partners from academia (scientists), stakeholders, public health workers and industry attended the meeting, and about 40 oral communications and 20 posters were presented by PhD students and internationally recognized scientists from the North and the South. Finally, a special award ceremony was held to recognize the pioneering work of staff involved in the diagnosis of sleeping sickness in West Africa, carried out in partnership with WHO and the Sanofi-Aventis group.

  2. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    PubMed

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology.
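    The "sensitivity" measured in such signal-detection analyses is conventionally the d′ statistic, the difference between the z-transformed hit and false-alarm rates. The abstract does not give the study's exact analysis pipeline; the sketch below is a generic d′ computation with hypothetical rates.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical recognition data: a condition with labels might raise hits
# and lower false alarms, yielding a larger (better) d'.
print(round(d_prime(0.85, 0.20), 2))
```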

  3. Keeping an Eye on the Prize

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A U

    2007-02-06

    Setting performance goals is part of the business plan for almost every company. The same is true in the world of supercomputers. Ten years ago, the Department of Energy (DOE) launched the Accelerated Strategic Computing Initiative (ASCI) to help ensure the safety and reliability of the nation's nuclear weapons stockpile without nuclear testing. ASCI, which is now called the Advanced Simulation and Computing (ASC) Program and is managed by DOE's National Nuclear Security Administration (NNSA), set an initial 10-year goal to obtain computers that could process up to 100 trillion floating-point operations per second (teraflops). Many computer experts thought the goal was overly ambitious, but the program's results have proved them wrong. Last November, a Livermore-IBM team received the 2005 Gordon Bell Prize for achieving more than 100 teraflops while modeling the pressure-induced solidification of molten metal. The prestigious prize, which is named for a founding father of supercomputing, is awarded each year at the Supercomputing Conference to innovators who advance high-performance computing. Recipients for the 2005 prize included six Livermore scientists--physicists Fred Streitz, James Glosli, and Mehul Patel and computer scientists Bor Chan, Robert Yates, and Bronis de Supinski--as well as IBM researchers James Sexton and John Gunnels. This team produced the first atomic-scale model of metal solidification from the liquid phase with results that were independent of system size. The record-setting calculation used Livermore's domain decomposition molecular-dynamics (ddcMD) code running on BlueGene/L, a supercomputer developed by IBM in partnership with the ASC Program. BlueGene/L reached 280.6 teraflops on the Linpack benchmark, the industry standard used to measure computing speed. As a result, it ranks first on the list of Top500 Supercomputer Sites released in November 2005.
To evaluate the performance of nuclear weapons systems, scientists must understand how materials behave under extreme conditions. Because experiments at high pressures and temperatures are often difficult or impossible to conduct, scientists rely on computer models that have been validated with obtainable data. Of particular interest to weapons scientists is the solidification of metals. "To predict the performance of aging nuclear weapons, we need detailed information on a material's phase transitions," says Streitz, who leads the Livermore-IBM team. For example, scientists want to know what happens to a metal as it changes from molten liquid to a solid and how that transition affects the material's characteristics, such as its strength.
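    The Linpack rating quoted above converts a dense linear solve into a flops-per-second figure using the benchmark's conventional operation count, roughly 2n³/3 + 2n² for an n-unknown system. The sketch below illustrates the arithmetic; the problem size and runtime are illustrative placeholders, not the parameters of the actual BlueGene/L run.

```python
def linpack_teraflops(n, seconds):
    """Estimate sustained teraflops for an n-unknown Linpack solve,
    using the conventional HPL operation count (2/3)n^3 + 2n^2."""
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / seconds / 1e12

# Illustrative: a 1,000,000-unknown solve completing in ~2,376 s would
# correspond to roughly 280 teraflops of sustained performance.
print(round(linpack_teraflops(1_000_000, 2_376), 1))
```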

  4. The Manhattan Project and its Effects on American Women Scientists

    NASA Astrophysics Data System (ADS)

    Fletcher, Samuel

    2008-04-01

    There have been many detailed historical accounts of the Manhattan Project, but few have recognized the crucial technical role women scientists and engineers played in the Project's success. Despite their absence from these prominent accounts, recent studies have revealed that, in fact, women participated in every non-combat operation associated with the Manhattan Project. Given this extensive participation and the lack of historical attention paid to it, little analysis has been done on how the Manhattan Project might have influenced the prospects of women scientists after the war. This talk has two aims: 1) to recount some of the technical and scientific contributions of women to the Manhattan Project, and 2) to examine what effects these contributions had on the women's careers as scientists. In other words, I intend to offer a preliminary explanation of the extent to which the Manhattan Project acted both as a boon and as a detriment to American women scientists. And finally, I will address what this historical analysis could imply about the effects of current efforts to recruit women into science.

  5. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  6. Characterization of real-time computers

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.

    1984-01-01

    A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.

  7. Early Childhood Interventions: Proven Results, Future Promise

    ERIC Educational Resources Information Center

    Karoly, Lynn A.; Kilburn, M. Rebecca; Cannon, Jill S.

    2005-01-01

    Parents, policymakers, business leaders, and the general public increasingly recognize the importance of the first few years in the life of a child for promoting healthy physical, emotional, social, and intellectual development. Whether the evidence comes from sophisticated research by brain scientists or the simple observation of the…

  8. Introduction to Childhood Studies

    ERIC Educational Resources Information Center

    Kehily, Mary Jane, Ed.

    2004-01-01

    Educationalists and social scientists are increasingly interested in childhood as a distinct social category, and Childhood Studies is now a recognized area of research and analysis. This book brings together the key themes of Childhood Studies in a broad and accessible introduction for students and practitioners working in this field.…

  9. A Mind for Adventure

    ERIC Educational Resources Information Center

    Strother, Mark A.

    2007-01-01

    Formal schooling began centuries before scientists would discover how the brains of children actually learn. Not surprisingly, traditional teaching was often boring and brain antagonistic. But great teachers in every era intuitively recognized what has now been validated by neuroscience: powerful learning is an adventure of the mind. Students,…

  10. Teaching Scholarly Activity in Psychiatric Training: Years 6 and 7

    ERIC Educational Resources Information Center

    Zisook, Sidney; Boland, Robert; Cowley, Deborah; Cyr, Rebecca L.; Pato, Michele T.; Thrall, Grace

    2013-01-01

    Objective: To address nationally recognized needs for increased numbers of psychiatric clinician-scholars and physician-scientists, the American Association of Directors of Psychiatric Residency Training (AADPRT) has provided a series of full-day conferences of psychiatry residency training directors designed to increase their competence in…

  11. Enhancing Scientific Literacy by Targeting Specific Scientific Skills

    ERIC Educational Resources Information Center

    Hicks, Sylvia; MacDonald, Shane; Martin, Ela

    2017-01-01

    The term scientific literacy is increasingly used by governments and teaching bodies, stemming from a growing international concern by scientists and government, who recognize the economic significance of developing scientific skills (McGregor & Kearton, 2010). However, in a society that requires students to be scientifically literate,…

  12. Pathways to Success.

    ERIC Educational Resources Information Center

    Sloan, Lloyd Ren, Ed.; Starr, B. James, Ed.

    Recognizing the need to increase the number of people from ethnically defined populations serving as research scientists in the mental health field, the National Institute of Mental Health (NIMH) created the Minority Access to Research Careers program in 1979. This program, now known as the Career Opportunities in Research Education and Training…

  13. The Fermilab Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    More than 4,000 scientists in 53 countries use Fermilab and its particle accelerators, detectors and computers for their research. That includes about 2,500 scientists from 223 U.S. institutions in 42 states, plus the District of Columbia and Puerto Rico.

  14. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaboration, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  15. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  16. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
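    The six cost components identified in the study lend themselves to a simple additive model. The dollar figures below are hypothetical placeholders for illustration, not values from the NASA analysis.

```python
# Additive cost model for a computer conference, following the six
# components identified in the study. All dollar values are hypothetical.
cost_components = {
    "terminal_equipment": 150.0,
    "network_port_communication": 40.0,
    "network_connection": 60.0,
    "computer_utilization": 220.0,
    "data_storage": 30.0,
    "administrative_overhead": 100.0,
}

total = sum(cost_components.values())
print(f"Total conference cost: ${total:.2f}")
```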

  17. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  18. Reframing science communication: How the use of metaphor, rhetoric, and other tools of persuasion can strengthen the public understanding of science (without weakening the integrity of the scientific process)

    NASA Astrophysics Data System (ADS)

    Soderberg, Jeanne

    This paper is about "truthiness", its resulting impact on the public understanding of science (and subsequently science policy), and why scientists need to learn how to navigate truthiness in order to ensure that the scientific body of knowledge is both preserved and shared. In order to contend with truthiness, scientists must understand and acknowledge how people receive and process information, how they form their reactions and opinions about it, and how they can be manipulated by various agencies and players to feel and think in certain ways. In order to accomplish these objectives, scientists must also understand various aspects of culture, language, psychology, neuroscience, and communication. Most importantly, scientists must recognize their own humanity, and learn how to accept and work with their own human boundaries. Truth can indeed be beauty. And, there is absolutely nothing unscientific about creating beauty in order to demonstrate and explain truth.

  19. Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.

  20. Boom. Bust. Build.

    ERIC Educational Resources Information Center

    Kite, Vance; Park, Soonhye

    2018-01-01

    In 2006 Jeanette Wing, a professor of computer science at Carnegie Mellon University, proposed computational thinking (CT) as a literacy just as important as reading, writing, and mathematics. Wing defined CT as a set of skills and strategies computer scientists use to solve complex, computational problems (Wing 2006). The computer science and…

  1. Workforce Retention Study in Support of the U.S. Army Aberdeen Test Center Human Capital Management Strategy

    DTIC Science & Technology

    2016-09-01

    Sciences Group 6% 1550s Computer Scientists Group 5% Other 1500s ORSA, Mathematics, & Statistics Group 3% 1600s Equipment & Facilities Group 4...Employee removal based on misconduct, delinquency, suitability, unsatisfactory performance, or failure to qualify for conversion to a career appointment...average of 10.4% in many areas, but over double the average for the 1550s (Computer Scientists) and other 1500s (ORSA, Mathematics, and Statistics). Also

  2. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  3. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  4. Some cases in applications of nanotechnology to food and agricultural systems

    USDA-ARS?s Scientific Manuscript database

    Food nanotechnology is an emerging technology. Many scientists and engineers have recognized well the potential of nanotechnology to lead all the industries in the 21st century. Even though successful applications of nanotechnology to foods are still limited, some basic concepts based on nano-scale ...

  5. Recognizing the Achievements of Women in Science.

    ERIC Educational Resources Information Center

    Dujari, Anuradha

    2000-01-01

    Lists the women Nobel Prize laureates and questions why, with the exception of Marie Curie, all these women scientists are not well known by the public. Explains why so few women have won the Nobel Prize in science and medicine as compared to other fields. (Contains 18 references.) (YDS)

  6. Ecosystem services: a new NRS-FIA analytical science initiative

    Treesearch

    Brian G. Tavernia; Mark D. Nelson; James D. Garner

    2015-01-01

    Forest ecosystem services (ES) are linked to sustaining human well-being. Recognizing an inappropriate economic valuation of ecosystem properties and processes, many ecologists, economists, and political scientists have pushed for an increasing awareness and appreciation of ES. Many definitions of ES include both direct and indirect benefits humans derive from...

  7. The social structure of family and farm forestry in Alabama

    Treesearch

    John Schelhas; Robert Zabawa

    2009-01-01

    Social research on, and programs for, forest landowners in the United States has tended to view them as individuals, and to be oriented toward transferring new knowledge, technical assistance, financial assistance, and even cultural content to autonomous forest landowners. However, social scientists have long recognized that a great...

  8. Young Scientists Explore Insects. Book 2 Primary Level.

    ERIC Educational Resources Information Center

    Penn, Linda

    Designed to present interesting facts about science and to heighten the curiosity of primary age students, this book contains suggestions for students to investigate the natural world and numerous black and white illustrations. The activities focus on nine easily recognized insects: bees, beetles, lady bugs, lightning bugs, ants, mosquitoes,…

  9. IV. Aquatics

    Treesearch

    Michael K. Young

    2011-01-01

    The problem of invasive aquatic species has long been recognized by scientists at the Rocky Mountain Research Station. Fausch and others (2006, 2009) recently overviewed this issue. A point that often distinguishes nonnative aquatic species from nonnatives in other environments is that the presence of some species is frequently prized by managers and the public. For...

  10. The Urban Tree Project

    ERIC Educational Resources Information Center

    Barnett, Michael; Houle, Meredith; Hufnagel, Elizabeth; Pancic, Alexander; Lehman, Mike; Hoffman, Emily

    2010-01-01

    Geospatial technologies have emerged over the last 15 years as one of the key tools used by environmental scientists (NRC 2006). In fact, educators have recognized that coupling geospatial technologies with environmental science topics and scientific data sets opens the door to local and regional scientific investigations (McInerney 2006). In this…

  11. Social Indicators and Program Evaluation.

    ERIC Educational Resources Information Center

    Elliott, Elizabeth

    The paper examines the concept of social indicators as ways of evaluating macro level adult education programs. In general social indicators deal with social factors which affect the quality of life of the population. Social scientists are recognizing the need for both economic and social indicators. Even as the need for social indicators is…

  12. A COMPARISON OF BENTHIC MACROINVERTEBRATE SAMPLING METHODS FOR NON-WADEABLE RIVERINE ECOSYSTEMS

    EPA Science Inventory

    Bioassessment of non-wadeable streams in the U.S. is becoming more common, but methods for these systems are not as well developed as for wadeable streams. This problem was recognized by the U.S. Environmental Protection Agency (EPA) regional scientists as critical to their moni...

  13. Herbicide-resistant weeds threaten soil conservation gains: finding a balance for soil and farm sustainability

    USDA-ARS?s Scientific Manuscript database

    Tillage has been an integral part of agriculture since the dawn of civilization. Growers and scientists have long recognized both beneficial and detrimental aspects to tillage. There is no question that most tillage promotes soil loss, adversely affects surface water quality and negatively impacts...

  14. Chronicles of Fibroporia radiculosa (= Antrodia radiculosa) TFFH 294

    Treesearch

    Carol A. Clausen; Katie M. Jenkins

    2011-01-01

    The brown-rot fungus, Fibroporia radiculosa, has been included in numerous research studies because many isolates of this fungus demonstrate an unusually high tolerance to copper. This fungus has undergone several recognized changes in taxonomic nomenclature, and through DNA technology, scientists have correctly identified isolates that had been misidentified...

  15. Fisheries Information Network in Indonesia.

    ERIC Educational Resources Information Center

    Balachandran, Sarojini

    During the early 1980s the Indonesian government made a policy decision to develop fisheries as an important sector of the national economy. In doing so, it recognized the need for the collection and dissemination of fisheries research information not only for the scientists themselves, but also for the ultimate transfer of technology through…

  16. Understanding Peer Influence in Children and Adolescents

    ERIC Educational Resources Information Center

    Prinstein, Mitchell J., Ed.; Dodge, Kenneth A., Ed.

    2008-01-01

    Scientists, educators, and parents of teens have long recognized the potency of peer influences on children and youth, but until recently, questions of how and why adolescents emulate their peers were largely overlooked. This book presents a framework for understanding the processes by which peers shape each other's attitudes and behavior, and…

  17. States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17

    ERIC Educational Resources Information Center

    Tilley-Coulson, Eve

    2016-01-01

    While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…

  18. Assessing Uncertainty in Deep Learning Techniques that Identify Atmospheric Rivers in Climate Simulations

    NASA Astrophysics Data System (ADS)

    Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.

    2017-12-01

Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture that transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have proven effective at identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare several CNN architectures, tuned with different hyperparameters and training schemes. We compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with everyday images (e.g., benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital for future scientific CNNs, which likely will not have access to large labelled training datasets. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.
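The filter-based feature recognition that this abstract describes can be sketched in a few lines. The kernel below is a hypothetical hand-crafted vertical-edge detector; in a trained CNN, such filters are learned from labelled data rather than written by hand.

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of one CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(iw - kw + 1)]
            for i in range(ih - kh + 1)]

# Toy "image": zeros on the left, ones on the right -> a vertical edge
image = [[0, 0, 1, 1]] * 4

# Hand-crafted vertical-edge filter (a learned filter would play this role)
kernel = [[-1, 1],
          [-1, 1]]

response = conv2d(image, kernel)
print(response[0])  # → [0, 2, 0]: the filter responds only where the edge sits
```

Stacking many such learned filters, layer after layer, is what lets the two- to sixteen-layer networks in the study pick out AR-like moisture structures in multichannel model output.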

  19. Recent Advances and Issues in Computers. Oryx Frontiers of Science Series.

    ERIC Educational Resources Information Center

    Gay, Martin K.

    Discussing recent issues in computer science, this book contains 11 chapters covering: (1) developments that have the potential for changing the way computers operate, including microprocessors, mass storage systems, and computing environments; (2) the national computational grid for high-bandwidth, high-speed collaboration among scientists, and…

  20. VENUS AND MARS OR DOWN TO EARTH: STEREOTYPES AND REALITIES OF GENDER DIFFERENCES

    PubMed Central

    Fiske, Susan T.

    2013-01-01

    Psychological scientists, like lay people, often think in categorical dichotomies that contrast men and women and exaggerate the differences between groups. These value-laden divides tend to privilege one side over the other, often to the advantage of the scientists’ own identity group. Besides balancing perspectives in the academic marketplace of ideas, scientists can recognize the complexity of stigma. Gender, like many categories, entails two fundamental dimensions that characterize intergroup stigma (and all interpersonal perception): perceived warmth and competence. These dimensions identify groups viewed with ambivalence (e.g., traditional women are stereotypically warm but incompetent, whereas professional women are allegedly competent but cold). In gender and in other areas, psychological scientists can go beyond value-laden dichotomies and consider the fundamental, continuous dimensions along which we think about stigma. PMID:23678365

  1. [Project HRANAFINA--Croatian anatomical and physiological terminology].

    PubMed

    Vodanović, Marin

    2012-01-01

HRANAFINA--Croatian Anatomical and Physiological Terminology is a project of the University of Zagreb School of Dental Medicine funded by the Croatian Science Foundation. It is performed in cooperation with other Croatian universities with medical schools. The project has a two-pronged aim: first, to build Croatian anatomical and physiological terminology, and second, to popularize its use among health professionals, medical students, scientists, and translators. Internationally recognized experts from Croatian universities with medical faculties, together with linguistics experts, are involved in the project. All project activities are coordinated in agreement with the National Coordinator for Development of Croatian Professional Terminology. The project enhances Croatian professional terminology and the Croatian language in general, increases the competitiveness of Croatian scientists at the international level, and facilitates the involvement of Croatian scientists, health care providers, and medical students in European projects.

  2. `Untangling Sickle-cell Anemia and the Teaching of Heterozygote Protection'

    NASA Astrophysics Data System (ADS)

    Howe, Eric Michael

    2007-01-01

    Introductory biology textbooks often use the example of sickle-cell anemia to illustrate the concept of heterozygote protection. Ordinarily scientists expect the frequency of a gene associated with a debilitating illness would be low owing to its continual elimination by natural selection. The gene that causes sickle-cell anemia, however, has a relatively high frequency in many parts of the world. Historically, scientists proposed and defended several alternative theories to account for this anomaly, though it is now widely recognized among the scientific community that high frequencies of the gene reflect its benefit to heterozygotes against malaria. Textbooks normally develop this concept with reference to the often-used maps of Africa showing how in areas where the frequency of the sickle-cell gene is high, there is also higher exposure to the disease malaria. While sickle-cell anemia is often the example of choice for explaining and illustrating the concept of heterozygote protection, the present paper argues that exploring the history of scientific research behind our contemporary understanding has advantages for helping students understand multiple factors related to population genetics (e.g. mutation, gene flow, drift) in addition to heterozygote protection. In so doing, this approach invites students to evaluate the legitimacy of their own alternative conceptions about introductory population genetics or about the genetics of the disease sickle-cell anemia. The various historical theories scientists proposed and defended often resemble those of students who first learn about the disease. As such, a discussion of how scientists reached consensus about the role of heterozygote protection may help students understand and appreciate what are now recognized to be limitations in the views they bring to their classrooms. 
The paper concludes by discussing the ramifications of this approach in potentially helping students to examine certain aspects of the nature of science.

  3. Recognizing the importance of conversation between experts and non-experts in science communication

    NASA Astrophysics Data System (ADS)

    Rushlow, C. R.; Soderquist, B.; Cohn, T.; Eitel, K.

    2016-12-01

Science communication is often perceived by scientists as the flow of information from experts to non-experts, and institutions have responded by providing science communication training that focuses on best practices for disseminating information. This unidirectional approach neglects a key component of science communication: scientists must understand the needs and values of the stakeholders for whom they are producing information, whether the stakeholders are community members, resource managers, or policy makers. We designed an activity for graduate students enrolled in a science communication class at the McCall Outdoor Science School both to alert them to this misconception and to give them an opportunity to rectify it. Over the course of 24 hours, we challenged students to have a conversation about climate change with someone they encountered in the community of McCall, ID. Using material from their conversations, students created a story in podcast or video form to share with the class. Through reflecting on this activity, students experienced a change in their perceptions of their identities as science communicators. Many students expressed an increased interest in listening to the stories of community members to learn more about the community's needs and values. We repeated the activity with early career scientists attending a climate workshop in McCall offered by the USGS Northwest Climate Science Center, focusing our evaluation around the science identity model of Carlone and Johnson (2007). Evaluations suggest that participants recognized that their role as scientists is not only to provide information, but also to listen to the values and needs of the people for whom they are working. We believe this understanding is fundamental to being a good science communicator and to ensuring that science remains relevant to communities.

  4. Building Effective Pipelines to Increase Diversity in the Geosciences

    NASA Astrophysics Data System (ADS)

    Snow, E.; Robinson, C. R.; Neal-Mujahid, R.

    2017-12-01

The U.S. Geological Survey (USGS) recognizes and understands the importance of a diverse workforce in advancing our science. Valuing Differences is one of the guiding principles of the USGS, and is the critical basis of the collaboration among the Youth and Education in Science (YES) program in the USGS Office of Science, Quality, and Integrity (OSQI), the Office of Diversity and Equal Opportunity (ODEO), and USGS science centers to build pipeline programs targeting diverse young scientists. Pipeline programs are robust, sustained relationships between two entities that provide a pathway from one to the other, in this case, from minority serving institutions to the USGS. The USGS has benefited from pipeline programs for many years. Our longest running program, with the University of Puerto Rico Mayaguez (UPR), is a targeted outreach and internship program that has been managed by USGS scientists in Florida since the mid-1980s. Originally begun as the Minority Participation in the Earth Sciences (MPES) Program, it has evolved over the years, and in its several forms has brought dozens of interns to the USGS. Based in part on that success, in 2006 USGS scientists in Woods Hole MA worked with their Florida counterparts to build a pipeline program with City College of New York (CCNY). In this program, USGS scientists visit CCNY monthly, giving a symposium and meeting with students and faculty. The talks are so successful that the college created a course around them. In 2017, the CCNY and UPR programs brought 12 students to the USGS for summer internships. The CCNY model has been so successful that USGS is exploring creating similar pipeline programs. The YES office is coordinating with ODEO and USGS science centers to identify partner universities and build relationships that will lead to robust partnerships in which USGS scientists will visit regularly to engage with faculty and students and recruit students for USGS internships.
The ideal partner universities will have a high population of underserved students, strong support for minority and first-generation students, proximity to a USGS office, and faculty and/or majors in several of the fields most important to USGS science: geology, geochemistry, energy, biology, ecology, environmental health, hydrology, climate science, GIS, high-capacity computing, and remote sensing.

  5. Electronic Ecosystem.

    ERIC Educational Resources Information Center

    Travis, John

    1991-01-01

    A discipline in which scientists seek to simulate and synthesize lifelike behaviors within computers, chemical mixtures, and other media is discussed. A computer program with self-replicating digital "organisms" that evolve as they compete for computer time and memory is described. (KR)

  6. Chemistry Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Philip Morris research center scientists use a computer program called CECTRP, for Chemical Equilibrium Composition and Transport Properties, to gain insight into the behavior of atoms as they progress along the reaction pathway. Use of the program lets the scientist accurately predict the behavior of a given molecule or group of molecules. Computer generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time since experiments must run only to validate the computer's prediction. Philip Morris estimates that had CECTRP not been available, at least two man years would have been required to develop a program to perform similar free energy calculations.

  7. Developing an online programme in computational biology.

    PubMed

    Vincent, Heather M; Page, Christopher

    2013-11-01

    Much has been written about the need for continuing education and training to enable life scientists and computer scientists to manage and exploit the different types of biological data now becoming available. Here we describe the development of an online programme that combines short training courses, so that those who require an educational programme can progress to complete a formal qualification. Although this flexible approach fits the needs of course participants, it does not fit easily within the organizational structures of a campus-based university.

  8. System biology of gene regulation.

    PubMed

    Baitaluk, Michael

    2009-01-01

A famous joke that illustrates the traditionally awkward alliance between theory and experiment, and the differences between experimental biologists and theoretical modelers, tells of a university that sends a biologist, a mathematician, a physicist, and a computer scientist on a walking trip in an attempt to stimulate interdisciplinary research. During a break, they watch a cow in a nearby field and the leader of the group asks, "I wonder how one could determine the size of a cow?" Since a cow is a biological object, the biologist responded first: "I have seen many cows in this area and know it is a big cow." The mathematician argued, "The true volume is determined by integrating the mathematical function that describes the outer surface of the cow's body." The physicist suggested: "Let's assume the cow is a sphere...." Finally, the computer scientist became nervous and said that he hadn't brought his computer because there is no Internet connection up on the hill. In this humorous but instructive story, the suggestions proposed by the theorists can be taken to reflect the view of many experimental biologists that computer scientists and theorists are too far removed from biological reality and that their theories and approaches are therefore not of much immediate usefulness. Conversely, the statement of the biologist mirrors the view of many traditional theoretical and computational scientists that biological experiments are for the most part simply descriptive, lack rigor, and that much of the resulting biological data are of questionable functional relevance. One of the goals of current biology as a multidisciplinary science is to bring people from different scientific areas together on the same "hill" and teach them to speak the same "language." 
In fact, of course, when presenting their data, most experimentalist biologists do provide an interpretation and explanation for the results, and many theorists/computer scientists aim to answer (or at least to fully describe) questions of biological relevance. Thus systems biology could be treated as such a socioscientific phenomenon and a new approach to both experiments and theory that is defined by the strategy of pursuing integration of complex data about the interactions in biological systems from diverse experimental sources using interdisciplinary tools and personnel.

  9. T-Cell Warriors—Equipped to Kill Cancer Cells | Center for Cancer Research

    Cancer.gov

    When the body recognizes tumor cells as foreign, a natural immune response arises to attack them. Unfortunately, tumors have ways to evade immune surveillance systems and antitumor responses are often too weak to defeat the disease. Rather than relying on the body’s natural response, scientists can now manipulate a patient’s own immune cells so that they latch on to tumor cells by recognizing specific proteins on their surface. A type of immune cell that has been explored for this purpose is the killer (cytotoxic) T cell, which eliminates cells infected by viruses, damaged cells, and tumor cells.

  10. Argonne Out Loud: Computation, Big Data, and the Future of Cities

    ScienceCinema

    Catlett, Charlie

    2018-01-16

    Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.

  11. The rare-earth elements: Vital to modern technologies and lifestyles

    USGS Publications Warehouse

    Van Gosen, Bradley S.; Verplanck, Philip L.; Long, Keith R.; Gambogi, Joseph; Seal, Robert R.

    2014-01-01

Until recently, the rare-earth elements (REEs) were familiar to a relatively small number of people, such as chemists, geologists, specialized materials scientists, and engineers. In the 21st century, the REEs have gained visibility through many media outlets because (1) the public has recognized the critical, specialized properties that REEs contribute to modern technology, (2) China dominates production and supply of the REEs, and (3) the world depends on China for the majority of its REE supply. Since the late 1990s, China has provided 85–95 percent of the world's REEs. In 2010, China announced its intention to reduce REE exports. During this timeframe, REE use increased substantially. REEs are used as components in high-technology devices, including smart phones, digital cameras, computer hard disks, fluorescent and light-emitting-diode (LED) lights, flat screen televisions, computer monitors, and electronic displays. Large quantities of some REEs are used in clean energy and defense technologies. Because of the many important uses of REEs, nations dependent on new technologies, such as Japan, the United States, and members of the European Union, reacted with great concern to China's intent to reduce its REE exports. Consequently, exploration activities intent on discovering economic deposits of REEs and bringing them into production have increased.

  12. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

As a new generation of large-format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to move rapidly from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
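The dynamically scaling level-of-detail approach described above can be sketched as two steps: bucket sources into map tiles per zoom level, then thin each tile so coarse overviews stay lightweight. The field names and the (zoom, col, row) tiling scheme below are illustrative assumptions, not the actual SOURCE EXPLORER implementation.

```python
def assign_tiles(sources, zoom, tile_size=256):
    """Bucket sources (normalized x, y in [0, 1)) into tiles at one zoom level."""
    tiles = {}
    world = tile_size * (2 ** zoom)  # world size in pixels at this zoom
    for s in sources:
        key = (zoom,
               int(s["x"] * world) // tile_size,
               int(s["y"] * world) // tile_size)
        tiles.setdefault(key, []).append(s)
    return tiles

def thin(tiles, max_per_tile):
    """Keep only the brightest sources (lowest magnitude) in each tile,
    so that coarse zoom levels ship few sources to the browser."""
    return {key: sorted(group, key=lambda s: s["mag"])[:max_per_tile]
            for key, group in tiles.items()}

sources = [
    {"x": 0.10, "y": 0.10, "mag": 5.0},
    {"x": 0.20, "y": 0.20, "mag": 1.0},  # brightest
    {"x": 0.30, "y": 0.30, "mag": 3.0},
]
# At zoom 0 everything falls into one tile; only the two brightest survive
overview = thin(assign_tiles(sources, zoom=0), max_per_tile=2)
```

At deeper zoom levels the same sources spread across more tiles, so each tile's brightness cutoff admits fainter sources and detail grows as the user zooms in.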

  13. Stanford/NASA-Ames Center of Excellence in model-based human performance

    NASA Technical Reports Server (NTRS)

    Wandell, Brian A.

    1990-01-01

    The human operator plays a critical role in many aeronautic and astronautic missions. The Stanford/NASA-Ames Center of Excellence in Model-Based Human Performance (COE) was initiated in 1985 to further our understanding of the performance capabilities and performance limits of the human component of aeronautic and astronautic projects. Support from the COE is devoted to those areas of experimental and theoretical work designed to summarize and explain human performance by developing computable performance models. The ultimate goal is to make these computable models available to other scientists for use in design and evaluation of aeronautic and astronautic instrumentation. Within vision science, two topics have received particular attention. First, researchers did extensive work analyzing the human ability to recognize object color relatively independent of the spectral power distribution of the ambient lighting (color constancy). The COE has supported a number of research papers in this area, as well as the development of a substantial data base of surface reflectance functions, ambient illumination functions, and an associated software package for rendering and analyzing image data with respect to these spectral functions. Second, the COE supported new empirical studies on the problem of selecting colors for visual display equipment to enhance human performance in discrimination and recognition tasks.

  14. LLL 8080 BASIC-II interpreter user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGoldrick, P.R.; Dickinson, J.; Allison, T.G.

    1978-04-03

Scientists are finding increased applications for microprocessors as process controllers in their experiments. However, while microprocessors are small and inexpensive, they are difficult to program in machine or assembly language. A high-level language is needed to enable scientists to develop their own microcomputer programs for their experiments on location. Recognizing this need, LLL contracted to have such a language developed. This report describes the resulting LLL BASIC interpreter, which operates with LLL's 8080-based MCS-8 microcomputer system. All numerical operations are done using Advanced Micro Devices' Am9511 arithmetic processor chip or, optionally, by using a software simulation of that chip. 1 figure.

  15. Contributions from Women to the Radiation Sciences: A Brief History.

    PubMed

    Martinez, Nicole E

    2017-04-01

    Contributions from men to radiation science are well known, particularly the early contributions from such luminaries as William Roentgen, James Chadwick, Niels Bohr, Robert Oppenheimer, and the like. Although not ignored per se, beyond Marie Curie and Lise Meitner, the contributions of female nuclear scientists are not as widely recognized. This paper provides a concise historical summary of contributions to radiation science from the discovery of radiation through the current status of international leadership within the radiation protection community. Beyond lead scientists and academics, this paper also considers support personnel as well as the role women have played in the advancement of radiation epidemiology.

  16. Finding "Models" in Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Ruby

    2017-05-01

Internationally recognized climate scientist Ruby Leung is a cloud gazer. But rather than looking for shapes, Ruby's life's calling is to develop regional atmospheric models to better predict and understand the effects of global climate change at scales relevant to humans and the environment. Ruby's accomplishments include developing novel methods for modeling mountain clouds and precipitation in climate models and improving understanding of hydroclimate variability and change. She has also led efforts to develop regional climate modeling capabilities in the Weather Research and Forecasting model, which is widely adopted by scientists worldwide. Ruby is part of a team of PNNL researchers studying the impacts of global warming.

  17. Young Scientists Explore Butterflies and Moths. Book 4 Primary Level.

    ERIC Educational Resources Information Center

    Penn, Linda

    Designed to present interesting facts about science and to heighten the curiosity of primary age students, this book contains activities about the natural world and numerous black and white illustrations. The activities focus on butterflies and moths and their stages of development. The first section contains exercises on recognizing insect body…

  18. Moving from Content Knowledge to Engagement

    ERIC Educational Resources Information Center

    McDonald, James; Dominguez, Lynn

    2005-01-01

    While the goal of science education used to be to produce more scientists, that goal has changed with the introduction of the National Science Education Standards (NSES) (NRC 1996). Society has recognized that it is essential for everyone, regardless of vocation, to understand the fundamentals of science and technology. The phrase that has come to…

  19. A Comparative Model of Field Investigations: Aligning School Science Inquiry with the Practices of Contemporary Science

    ERIC Educational Resources Information Center

    Windschitl, Mark; Dvornich, Karen; Ryken, Amy E.; Tudor, Margaret; Koehler, Gary

    2007-01-01

    Field investigations are not characterized by randomized and manipulated control group experiments; however, most school science and high-stakes tests recognize only this paradigm of investigation. Scientists in astronomy, genetics, field biology, oceanography, geology, and meteorology routinely select naturally occurring events and conditions and…

  20. Atmospheric/oceanic influence on climate in the southern Appalachians

    Treesearch

    Mark S. Riedel

    2006-01-01

    Despite a wealth of research, scientists still disagree about the existence, magnitude, duration and potential causes of global warming and climate change. For example, only recently have we recognized that, given historical global climate patterns, much of the global warming trend we are experiencing appears to be natural. We analyzed long-term climatologic records...

  1. 3 CFR 8749 - Proclamation 8749 of November 1, 2011. National Native American Heritage Month, 2011

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of the United States of America A Proclamation From the Aleutian Islands to the Florida Everglades... among America’s most distinguished authors, artists, scientists, and political leaders, and in their... promises. My Administration recognizes the painful chapters in our shared history, and we are fully...

  2. Art & Evolution

    ERIC Educational Resources Information Center

    Terry, Mark

    2005-01-01

    In this article, the author presents a two-week evolution unit for his biology class. He uses Maria Sybilla Merian (1647-1717) as an example of an Enlightenment mind at work--in this case a woman recognized as one of the great artists and natural scientists of her time. Her representations of butterflies, caterpillars and their pupae, and the…

  3. One Fungus, One Name: Defining the genus Fusarium in a scientifically robust way that preserves longstanding use

    USDA-ARS?s Scientific Manuscript database

    In this letter, we advocate recognizing the genus Fusarium as the sole name for a group that includes virtually all Fusarium species of importance in plant pathology, mycotoxicology, medicine and basic research. This phylogenetically-guided circumscription will free scientists from any obligation to...

  4. Bovine tuberculosis and the establishment of an eradication program in the United States: Role of veterinarians

    USDA-ARS?s Scientific Manuscript database

    The significance of the identification of Mycobacterium bovis as a zoonotic pathogen in 1882 was not initially recognized. After years of research by veterinarians, and other scientists, the importance of M. bovis as a pathogen and the public health ramifications, were appreciated. Veterinarians pla...

  5. Designing and using phenological studies to define management strategies for aquatic plants

    USDA-ARS?s Scientific Manuscript database

    Scientists and managers alike have recognized that weed management activities in the past were timed more for the convenience of the applicator or response of the resource manager than in consideration of the biology of the target plant. A thorough understanding of the life history and phenology of...

  6. Opening remarks for the Fort Valley Centennial Celebration

    Treesearch

    G. Sam Foster

    2008-01-01

    The Rocky Mountain Research Station recognizes and values the contributions of our scientists and collaborators for their work over the past century at Fort Valley Experimental Forest. With the help of our partners and collaborators, Rocky Mountain Research Station is working to improve coordination across its research Program Areas and Experimental Forests and Ranges...

  7. Defining Old Growth: Implications For Management

    Treesearch

    David L. White; F. Thomas Lloyd

    1994-01-01

The USDA Forest Service (USFS), with the help of scientists from The Nature Conservancy (TNC), Forest Service Research, and other organizations, is developing old-growth definitions for 35 forest types within the Eastern United States (U.S.). Old-growth forests were officially recognized as a resource by the USFS in 1988 and shortly thereafter, the Eastern Old-Growth...

  8. Environmental Education for a Sustainable Future

    ERIC Educational Resources Information Center

    Schlesinger, William H.

    2004-01-01

    The American public is now faced with a baffling array of new environmental issues--much more complicated than the problems people faced 30 years ago. Scientists recognize new threats to the biosphere, the fabric of natural ecosystems, and the diversity of plants and animals that inhabit them. Unlike the obvious, toxic pollutants that spurred the…

  9. Measuring Job Content: Skills, Technology, and Management Practices. Discussion Paper No. 1357-08

    ERIC Educational Resources Information Center

    Handel, Michael J.

    2008-01-01

    The conceptualization and measurement of key job characteristics has not changed greatly for most social scientists since the Dictionary of Occupational Titles and Quality of Employment surveys were created, despite their recognized limitations. However, debates over the roles of job skill requirements, technology, and new management practices in…

  10. Comparative evaluation of green technologies using the Emergy Footprint and other measures of ecosystems services and sustainability

    EPA Science Inventory

    The work processes and products of the environment are now recognized by scientists and economists alike for the contributions that they make to human and natural systems. Thus, the nature and evaluation of nonmarket goods and services are the subject of much current research in...

  11. Cases on Research-Based Teaching Methods in Science Education

    ERIC Educational Resources Information Center

    de Silva, Eugene, Ed.

    2015-01-01

    While the great scientists of the past recognized a need for a multidisciplinary approach, today's schools often treat math and science as subjects separate from the rest. This not only creates a disinterest among students, but also a potential learning gap once students reach college and then graduate into the workforce. "Cases on…

  12. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    ERIC Educational Resources Information Center

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of…

  13. Czanderna Receives Research Award

    Science.gov Websites

    May 5, 1999 — A scientist at the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) was recognized for his contributions to the science and technology of energy-related research. The Energy Technology Division (ETD) of The Electrochemical Society selected Dr. Al Czanderna for its

  14. How biological soil crusts became recognized as a functional unit: a selective history

    USGS Publications Warehouse

    Lange, Otto L.; Belnap, Jayne

    2016-01-01

    It is surprising that despite the world-wide distribution and general importance of biological soil crusts (biocrusts), scientific recognition and functional analysis of these communities is a relatively young field of science. In this chapter, we sketch the historical lines that led to the recognition of biocrusts as a community with important ecosystem functions. The idea of biocrusts as a functional ecological community has come from two main scientific branches: botany and soil science. Botanists have long recognized that multiple organisms colonize the soil surface in the open and often dry areas occurring between vascular plants. Much later, after the initial taxonomic and phyto-sociological descriptions were made, soil scientists and agronomists observed that these surface organisms interacted with soils in ways that changed the soil structure. In the 1970s, research on these communities as ecological units that played an important functional role in drylands began in earnest, and these studies have continued to this day. Here, we trace the history of these studies from the distant past until 1990, when biocrusts became well-known to scientists and the public.

  15. Toward a Computational Neuropsychology of High-Level Vision.

    DTIC Science & Technology

    1984-08-20

    known as visual agnosia (also called "mindblindness"); this patient failed to recognize her nurses, got lost frequently when travelling familiar routes...visual agnosia are not blind: these patients can compare two shapes reliably when both are visible, but they cannot...visually recognize what an object is (although many can recognize objects by touch). This sort of agnosia has been well-documented in the literature (see

  16. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
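    The clustering-based detection step described above can be sketched in a few lines: cluster the data, then flag points that sit unusually far from their cluster center. This is a minimal illustration, not the authors' algorithm; the function names, the k-means clustering, and the mean-plus-z-standard-deviations threshold are all assumptions made for the sketch.

```python
import numpy as np

def kmeans(X, k=3, iters=50):
    """Plain Lloyd's k-means (NumPy only), deterministically seeded
    from the first k points."""
    centers = X[:k].astype(float).copy()
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers, keeping the old one if a cluster empties
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def flag_outliers(X, labels, centers, z=3.0):
    """Flag points whose distance to their cluster center exceeds
    mean + z * std of the distances within that cluster."""
    dist = np.linalg.norm(X - centers[labels], axis=1)
    mask = np.zeros(len(X), dtype=bool)
    for j in np.unique(labels):
        d = dist[labels == j]
        mask[labels == j] = d > d.mean() + z * d.std()
    return mask

# Two dense "normal" regions plus one extreme reading appended last.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)),
               rng.normal(5, 0.5, (50, 2)),
               [[100.0, 100.0]]])
labels, centers = kmeans(X, k=2)
outliers = flag_outliers(X, labels, centers)
```

    Because no labeled anomalies are needed, a scheme like this is unsupervised in the same sense as the framework above, though the real system adds the spatio-temporal context and event-ranking stages.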

  17. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  18. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.

  19. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  20. Computational Thinking: A Digital Age Skill for Everyone

    ERIC Educational Resources Information Center

    Barr, David; Harrison, John; Conery, Leslie

    2011-01-01

    In a seminal article published in 2006, Jeannette Wing described computational thinking (CT) as a way of "solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science." Wing's article gave rise to an often controversial discussion and debate among computer scientists,…

  1. Application of advanced computing techniques to the analysis and display of space science measurements

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M.; Lapolla, M. V.; Horblit, B.

    1995-01-01

    A prototype system has been developed to aid the experimental space scientist in the display and analysis of spaceborne data acquired from direct measurement sensors in orbit. We explored the implementation of a rule-based environment for semi-automatic generation of visualizations that assist the domain scientist in exploring their data. The goal has been to enable rapid generation of visualizations which enhance the scientist's ability to thoroughly mine their data. Transferring the task of visualization generation from the human programmer to the computer produced a rapid prototyping environment for visualizations. The visualization and analysis environment has been tested against a set of data obtained from the Hot Plasma Composition Experiment on the AMPTE/CCE satellite, creating new visualizations which provided new insight into the data.

  2. Analyzing task-based user study data to determine colormap efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashton, Zoe Charon Maria; Wendelberger, Joanne Roth; Ticknor, Lawrence O.

    2015-07-23

    Domain scientists need colormaps to visualize their data; colormaps are especially useful for identifying areas of interest, such as eddies or current features in ocean data. However, the traditional rainbow colormap performs poorly for understanding details because of its small perceptual range. To assist domain scientists in recognizing and identifying important details in their data, different colormaps that allow higher perceptual definition need to be applied. Visual artist Francesca Samsel used her understanding of color theory to create new colormaps to improve perception. While domain scientists find the new colormaps useful, we implemented a rigorous and quantitative study to determine whether the new colormaps have perceptually more colors. Color count data from one of these studies will be analyzed in depth to determine whether the new colormaps have more perceivable colors and what affects the number of perceivable colors.
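    The notion of "more perceivable colors" can be illustrated with a toy counting procedure: walk a colormap and count entries separated from the last counted color by at least a just-noticeable difference. This is only a sketch, not the study's methodology (the study used human color-count data); the `count_distinguishable_colors` function, the Euclidean RGB metric, and the `jnd` threshold are illustrative assumptions — a perceptual metric such as CIEDE2000 in CIELAB space would be more faithful.

```python
import numpy as np

def count_distinguishable_colors(cmap, jnd=0.1):
    """Walk a colormap (an N x 3 array of RGB values in [0, 1]) and count
    entries separated from the last counted color by at least `jnd`.
    Euclidean RGB distance is a crude stand-in for a true perceptual
    metric such as CIEDE2000 in CIELAB space."""
    count, last = 1, cmap[0]   # the starting color itself counts
    for color in cmap[1:]:
        if np.linalg.norm(color - last) >= jnd:
            count += 1
            last = color
    return count

# A grayscale ramp varies all three channels together, so it covers more
# distance per step than a ramp that varies only the red channel.
t = np.linspace(0.0, 1.0, 256)
gray = np.stack([t, t, t], axis=1)
red_only = np.stack([t, np.zeros(256), np.zeros(256)], axis=1)
```

    Under this crude metric the grayscale ramp yields noticeably more countable colors than the single-channel ramp, which mirrors the study's premise that colormap design changes how many distinct colors a viewer can resolve.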

  3. The Application of NASA Technology to Public Health

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas L.; Watts, C.

    2007-01-01

    NASA scientists have a history of applying technologies created to handle satellite data to human health at various spatial scales. Scientists are now engaged in multiple public health application projects that integrate NASA satellite data with measures of public health. Such integration requires overcoming disparities between the environmental and the health data. Ground based sensors, satellite imagery, model outputs and other environmental sources have inconsistent spatial and temporal distributions. The MSFC team has recognized that the approach used by environmental scientists to fill in the empty places can also be applied to outcomes, exposures and similar data. Revisiting the classic 1854 epidemiology study with modern-day surface modeling and GIS technology demonstrates how spatial technology can enhance and change the future of environmental epidemiology. Thus, NASA brings to public health not just a set of data, but an innovative way of thinking about the data.

  4. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  5. Scientists at Work. Final Report.

    ERIC Educational Resources Information Center

    Education Turnkey Systems, Inc., Falls Church, VA.

    This report summarizes activities related to the development, field testing, evaluation, and marketing of the "Scientists at Work" program which combines computer assisted instruction with database tools to aid cognitively impaired middle and early high school children in learning and applying thinking skills to science. The brief report reviews…

  6. The Ocean 180 Video Challenge: An Innovative Broader Impacts Strategy for Helping Scientists Share Discoveries and Connect with Classrooms

    NASA Astrophysics Data System (ADS)

    Tankersley, R. A.; Watson, M.; Windsor, J. G.; Buckley, M.; Diederick, L.

    2014-12-01

    Scientists conduct exciting, ground-breaking research that addresses many of the world's greatest challenges. Yet, far too often, the importance, meaning, and relevance of their discoveries are never shared with persons outside their discipline. Recognizing the need for scientists to communicate more effectively with the public, the Florida Center for Ocean Sciences Education Excellence (COSEE Florida) saw an opportunity to connect the two through film. In the fall of 2013, COSEE Florida launched the Ocean 180 Video Challenge to tap into the competitive spirit of scientists and inspire them to share their latest discoveries with the public. The competition encouraged scientists to submit short, 3-minute video abstracts summarizing the important findings of recent peer-reviewed papers and highlighting the relevance, meaning, and implications of the research to persons outside their discipline. Videos were initially screened and evaluated by a team of science and communication experts and the winners (from a field of ten finalists) were selected by more than 30,000 middle school students from 285 schools in 13 countries. Our presentation will review the outcomes and lessons learned from the 2014 competition and describe how contest videos are being used for professional development/training and educational purposes. We will also describe how video competitions can benefit both scientists and the target audience and be effective outreach strategies for encouraging scientists to share new discoveries and their enthusiasm for science with K-12 students and the public.

  7. Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems

    ERIC Educational Resources Information Center

    Bostandjiev, Svetlin Alex I.

    2012-01-01

    The evolution of the Web brought new interesting problems to computer scientists that we loosely classify in the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large amount of people in a collective intelligence fashion (i.e. wikis), and performing computations on social…

  8. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  9. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
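    The idea behind using graph coloring to compute Jacobian matrices, mentioned above, is that structurally orthogonal columns (columns sharing no nonzero row) can be estimated together, one finite-difference evaluation per color. The following is a minimal greedy sketch, not ColPack's implementation; the `color_columns` name and the natural column ordering are assumptions for illustration.

```python
import numpy as np

def color_columns(sparsity):
    """Greedy distance-1 coloring of the column intersection graph of a
    Jacobian sparsity pattern (a boolean rows x cols array). Columns that
    share no nonzero row are structurally orthogonal and may share a color,
    so the full Jacobian is recovered with one finite-difference evaluation
    per color (Curtis-Powell-Reid style column grouping)."""
    m, n = sparsity.shape
    colors = np.full(n, -1)
    for j in range(n):
        # colors already taken by earlier columns overlapping column j
        taken = {int(colors[k]) for k in range(j)
                 if np.any(sparsity[:, j] & sparsity[:, k])}
        c = 0
        while c in taken:
            c += 1
        colors[j] = c
    return colors

# Tridiagonal 6x6 pattern: each column conflicts only with columns within
# distance two, so three colors suffice no matter how large the matrix is.
S = np.array([[abs(i - j) <= 1 for j in range(6)] for i in range(6)])
colors = color_columns(S)
```

    Production tools such as ColPack combine this idea with ordering heuristics (e.g., smallest-last) to reduce the number of colors on general sparsity patterns.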

  10. Perspectives on an education in computational biology and medicine.

    PubMed

    Rubinstein, Jill C

    2012-09-01

    The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.

  11. Mobilizing the GLOBE at Night Citizen-Scientist

    NASA Astrophysics Data System (ADS)

    Newhouse, M. A.; Walker, C. E.; Boss, S. K.; Hennig, A. J.

    2012-12-01

    GLOBE at Night is an international campaign to raise public awareness of the impact of light pollution. Citizen-scientists around the world measure their night sky brightness and submit their observations to a website from a computer. In the last two years a web application (webapp) was developed to enable reporting from mobile devices. Nearly 80,000 data points have been submitted by people in 115 countries during the last 7 years. Our poster will examine the effect of enabling real-time data reporting via mobile devices, and how the Adopt-a-Street pilot project has impacted data collection in two U.S. cities. Recognizing the increasing popularity of smartphones, in late 2010 NOAO staff built a webapp to take advantage of the GPS capabilities built into mobile devices to get an automated and accurate report of the user's location. Refinements to the application have enabled an order of magnitude reduction in the number of erroneous data points due to incorrect location. During the 2011 campaign a pilot program called Adopt-a-Street was created to further take advantage of the ability to report data in real-time via mobile devices. For the 2012 campaign the program continued in Tucson and expanded to Fayetteville, Arkansas. Both of these sub-campaigns encouraged more participation, and resulted in more meaningful results. For example, in prior years Fayetteville averaged three data points in the three years any points were submitted in that area. In 2012, due to the Adopt-a-Street program, there were 98 points submitted, clearly matching the map on their Adopt-a-Street page. Adding support for mobile devices has increased the accuracy and relevance of the data submitted via both mobile devices and desktop computers, as well as enabled new programs. We plan to expand the Adopt-a-Street program next year and find an easier way to accommodate multiple measurements.

  12. Mobilizing the GLOBE at Night Citizen-Scientist

    NASA Astrophysics Data System (ADS)

    Newhouse, M. A.; Walker, C. E.; Boss, S. K.; Hennig, A. J.

    2013-04-01

    GLOBE at Night is an international campaign to raise public awareness of the impact of light pollution. Citizen-scientists around the world measure their night sky brightness and submit their observations to a website from a computer. In the last two years a webapp was developed to enable reporting from mobile devices. Nearly 80,000 data points have been submitted by people in 115 countries during the last 7 years. Our poster will examine the effect of enabling real-time data reporting via mobile devices, and how the Adopt-a-Street pilot project has impacted data collection in two U.S. cities. Recognizing the increasing popularity of smartphones, in late 2010 NOAO staff built a webapp to take advantage of the GPS capabilities built into mobile devices to get an automated and accurate report of the user's location. Refinements to the application have enabled an order of magnitude reduction in the number of erroneous data points due to incorrect location. During the 2011 campaign a pilot program called Adopt-a-Street was created to further take advantage of the ability to report data in real-time via mobile devices. For the 2012 campaign the program continued in Tucson and expanded to Fayetteville, Arkansas. Both of these sub-campaigns encouraged more participation, and resulted in more meaningful results. For example, in prior years Fayetteville averaged three data points in the three years any points were submitted in that area. In 2012, due to the Adopt-a-Street program, there were 98 points submitted, clearly matching the map on their Adopt-a-Street page. Adding support for mobile devices has increased the accuracy and relevance of the data submitted via both mobile devices and desktop computers, as well as enabled new programs. We plan to expand the Adopt-a-Street program next year and find an easier way to accommodate multiple measurements.

  13. Integrated Circuits/Segregated Labor: Women in Three Computer-Related Occupations. Project Report No. 84-A27.

    ERIC Educational Resources Information Center

    Strober, Myra H.; Arnold, Carolyn L.

    This discussion of the impact of new computer occupations on women's employment patterns is divided into four major sections. The first section describes the six computer-related occupations to be analyzed: (1) engineers; (2) computer scientists and systems analysts; (3) programmers; (4) electronic technicians; (5) computer operators; and (6) data…

  14. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  15. Collective Computation of Neural Network

    DTIC Science & Technology

    1990-03-15

    Sciences, Beijing. ABSTRACT: Computational neuroscience is a new branch of neuroscience originating from current research on the theory of computer...scientists working in artificial intelligence engineering and neuroscience. The paper introduces the collective computational properties of model neural...vision research. On this basis, the authors analyzed the significance of the Hopfield model. Key phrases: Computational Neuroscience, Neural Network, Model

  16. The scientist's education and a civic conscience.

    PubMed

    Donald, Kelling J; Kovac, Jeffrey

    2013-09-01

    A civic science curriculum is advocated. We discuss practical mechanisms for (and highlight the possible benefits of) addressing the relationship between scientific knowledge and civic responsibility coextensively with rigorous scientific content. As a strategy, we suggest an in-course treatment of well-known (and relevant) historical and contemporary controversies among scientists over science policy or the use of science. The scientific content of the course is used to understand the controversy and to inform the debate while allowing students to see the role of scientists in shaping public perceptions of science and the value of scientific inquiry, discoveries and technology in society. The examples of the activism of Linus Pauling, Alfred Nobel and Joseph Rotblat as scientists and engaged citizens are cited. We discuss the role of science professors in informing the social conscience of students and consider ways in which a treatment of the function of science in society may find, coherently, a meaningful space in a science curriculum at the college level. Strategies for helping students to recognize early the crucial contributions that science can make in informing public policy and global governance are discussed.

  17. Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    2017-09-01

    Scientists, engineers and programmers at Fermilab are tackling today’s most challenging computational problems. Their solutions, motivated by the needs of worldwide research in particle physics and accelerators, help America stay at the forefront of innovation.

  18. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the Langley Science Directorate's needs must be evaluated by integrating it with real-world operational needs across NASA, with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  19. OptFuels: Fuel treatment optimization

    Treesearch

    Greg Jones

    2011-01-01

    Scientists at the USDA Forest Service, Rocky Mountain Research Station, in Missoula, MT, in collaboration with scientists at the University of Montana, are developing a tool to help forest managers prioritize forest fuel reduction treatments. Although several computer models analyze fuels and fire behavior, stand-level effects of fuel treatments, and priority planning...

  20. Four Argonne National Laboratory scientists receive Early Career Research

    Science.gov Websites

    Four Argonne National Laboratory scientists receive Early Career Research Program awards. One recipient will study the economic impact of cascading shortages and will also seek to enable scaling on high-performance computing systems.

  1. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and... Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control system.

  2. A toolbox and record for scientific models

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.

  3. NEUTRALIZING THE DISINHERITED--SOME PSYCHOLOGICAL ASPECTS OF UNDERSTANDING THE POOR.

    ERIC Educational Resources Information Center

    RAINWATER, LEE

    Members of the dominant society in the United States, both social scientists and laymen, perceive the poor in ways which allow them to resolve the anxiety they experience when they recognize that the poor live a life which is ostensibly unlivable. One mode of perception, which underlies seemingly sophisticated views, and is found in the attitude…

  4. Young Scientists Explore Wild Plants and Animals. Book 12 Primary Level.

    ERIC Educational Resources Information Center

    Penn, Linda

    Designed to present interesting facts about science and to heighten the curiosity of primary age students, this book contains activities about the natural world and numerous black and white illustrations. This activity book explores easily recognized animals, along with a few not-so-well-known plants. The theme of the first section is fall…

  5. Connecting mountain islands and desert seas: Biodiversity and management of the Madrean Archipelago II

    Treesearch

    Gerald J. Gottfried; Brooke S. Gebow; Lane G. Eskew; Carleton B. Edminster

    2005-01-01

    The Madrean Archipelago, or Sky Island, region of the southwestern United States and northern Mexico is recognized for its great biological diversity and natural beauty. This conference brought together scientists, managers, and other interested parties to share their knowledge about the region and to identify needs and possible solutions for existing and emerging...

  6. The 13th Annual James L. Waters Symposium at Pittcon: Electron Spectroscopy for Chemical Analysis

    ERIC Educational Resources Information Center

    Baltrus, John P.

    2004-01-01

    The objective of the James L. Waters Annual Symposium is to recognize pioneers in the development of instrumentation by preserving the early history of the cooperation and important contributions of inventors, scientists, engineers, entrepreneurs, and marketing organizations. The symposium was held in Pittsburgh, United States in March 2002 to…

  7. Reburns and their Impact on carbon pools, site productivity, and recovery [Chapter 13

    Treesearch

    Deborah S. Page-Dumroese; Terrie Jain; Jonathan E. Sandquist; Joanne M. Tirocke; John Errecart; Martin F. Jurgensen

    2015-01-01

    Prior to fire suppression and exclusion, wildfires and other disturbances (e.g., insects, disease, and weather) sustained ecosystem processes in many landscapes of the Western United States. However, wildfires have been increasing in size, frequency, and intensity in recent years (Kellogg and others 2008). Recognizing the value of wildfire, scientists and land...

  8. Diversion of lava during the 1983 eruption of Mount Etna

    USGS Publications Warehouse

    Lockwood, J.P.; Romano, R.

    1985-01-01

    During the 1983 eruption of Etna, Italian scientists managed, for the first time, to convince government authorities that direct intervention in natural volcanic processes was warranted. Both explosives and earthen barriers were used to divert major flows. These efforts were fairly successful, although at the time the historic importance of the operations was not fully recognized. 

  9. Do General Physics Textbooks Discuss Scientists' Ideas about Atomic Structure? A Case in Korea

    ERIC Educational Resources Information Center

    Niaz, Mansoor; Kwon, Sangwoon; Kim, Nahyun; Lee, Gyoungho

    2013-01-01

    Research in science education has recognized the importance of teaching atomic structure within a history and philosophy of science perspective. The objective of this study is to evaluate general physics textbooks published in Korea based on the eight criteria developed in previous research. The result of this study shows that Korean general…

  10. Opening remarks for the Fort Valley Centennial Celebration (P-53)

    Treesearch

    G. Sam Foster

    2008-01-01

    The Rocky Mountain Research Station recognizes and values the contributions of our scientists and collaborators for their work over the past century at Fort Valley Experimental Forest. With the help of our partners and collaborators, Rocky Mountain Research Station is working to improve coordination across its research Program Areas and Experimental Forests and Ranges...

  11. Training Tomorrow's Anatomists Today: A Partnership Approach

    ERIC Educational Resources Information Center

    Fraher, John P.; Evans, Darrell J. R.

    2009-01-01

    Anatomy is recognized to play a central role in the education and training of clinicians, healthcare professionals, and scientists. However, in recent years, the perceived decline in popularity of anatomy has led to a deficiency in the numbers of new anatomy educators. The tide is now turning with anatomy once again taking its rightful place in a…

  12. Superwoman: Ms. or Myth. A Study of Role Overload. A Report to the National Institute of Education.

    ERIC Educational Resources Information Center

    Bean, Joan P.; Wolfman, Brunetta R.

    Although some social scientists have recognized the conflict and duality of career and family roles, few researchers have examined the consequences of balancing working women's multiple roles. Studies reveal that professional men and women appear to comprehend the meaning and implications of the "Superwoman," a woman caught in the triple…

  13. NREL Scientists and Engineers Recognized for Top Innovations | NREL | News

    Science.gov Websites

    NREL scientists and engineers developed a commercially available, large-format isothermal battery calorimeter for lithium-ion battery safety testing, used to test the performance and safety of the large-format lithium-ion batteries deployed extensively in electric vehicles. The calorimeter represents NREL intellectual property.

  14. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    ERIC Educational Resources Information Center

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  15. Big data computing: Building a vision for ARS information management

    USDA-ARS?s Scientific Manuscript database

    Improvements are needed within the ARS to increase scientific capacity and keep pace with new developments in computer technologies that support data acquisition and analysis. Enhancements in computing power and IT infrastructure are needed to provide scientists better access to high performance com...

  16. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  17. High-Performance Computing Unlocks Innovation at NREL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Need to fly around a wind farm? Or step inside a molecule? NREL scientists use a super powerful (and highly energy-efficient) computer to visualize and solve big problems in renewable energy research.

  18. Mathematical computer programs: A compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer programs, routines, and subroutines for aiding engineers, scientists, and mathematicians in direct problem solving are presented. Also included is a group of items that affords the same users greater flexibility in the use of software.

  19. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    NASA Astrophysics Data System (ADS)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allows, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
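    Of the three case studies, the non-negative least squares inversion is the easiest to illustrate in isolation. The following is a minimal sketch (not the dissertation's code) that solves min ||Ax - b||^2 subject to x >= 0 by projected gradient descent; the matrix, vector, step size, and iteration count are illustrative assumptions.

```python
def nnls_projected_gradient(A, b, step=0.1, iters=2000):
    """Minimize ||Ax - b||^2 subject to x >= 0 via projected gradient descent."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient g = 2 A^T r
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, then project onto the non-negative orthant
        x = [max(0.0, x[j] - step * g[j]) for j in range(n)]
    return x

if __name__ == "__main__":
    # The unconstrained solution of this system is (3, -2);
    # the non-negativity constraint clips the second component to 0.
    A = [[1.0, 0.0], [0.0, 1.0]]
    b = [3.0, -2.0]
    print(nnls_projected_gradient(A, b))  # approximately [3.0, 0.0]
```

    In practice such inversions run on GPU-sized problems; the projected-gradient structure shown here (dense matrix-vector products plus an elementwise clamp) is exactly the kind of kernel that maps well onto the heterogeneous systems the abstract describes.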

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  1. How should a speech recognizer work?

    PubMed

    Scharenborg, Odette; Norris, Dennis; Bosch, Louis; McQueen, James M

    2005-11-12

    Although researchers studying human speech recognition (HSR) and automatic speech recognition (ASR) share a common interest in how information processing systems (human or machine) recognize spoken language, there is little communication between the two disciplines. We suggest that this lack of communication follows largely from the fact that research in these related fields has focused on the mechanics of how speech can be recognized. In Marr's (1982) terms, emphasis has been on the algorithmic and implementational levels rather than on the computational level. In this article, we provide a computational-level analysis of the task of speech recognition, which reveals the close parallels between research concerned with HSR and ASR. We illustrate this relation by presenting a new computational model of human spoken-word recognition, built using techniques from the field of ASR, that, in contrast to existing models of HSR, recognizes words from real speech input. 2005 Lawrence Erlbaum Associates, Inc.

  2. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  3. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    ERIC Educational Resources Information Center

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  4. 2012 AGU medal, award, and prize recipients

    NASA Astrophysics Data System (ADS)

    McPhaden, Mike; Fine, Rana

    2012-07-01

    The Honors and Recognition Committee is very pleased to present the 2012 AGU medalists, awardees, and prize recipients. These individuals are recognized for their outstanding contributions to the advancement of Earth and space science and for their service to the scientific community. They have distinguished themselves through their extraordinary achievements and are role models for future generations of scientists. We look forward to recognizing the accomplishments of these esteemed colleagues at an honors ceremony to be held on 5 December 2012 at the Fall Meeting in San Francisco, Calif. This year's honorees are also listed on AGU's Web site at http://sites.agu.org/honors.

  5. Careers and people

    NASA Astrophysics Data System (ADS)

    2008-04-01

    Nuclear scientists needed The US is heading for a serious shortage of nuclear forensics experts, according to a new report by the American Physical Society (APS) and the American Association for the Advancement of Science (AAAS). Nuclear forensics involves using sophisticated technology to analyse the nature, use and origin of nuclear materials, and is key to monitoring the illicit trade in and use of nuclear weapons. Currently there are fewer than 50 nuclear forensic scientists working in the US's network of national laboratories - not enough, the report claims, to deal with an emergency - and half of them are expected to retire within the next 15 years. As university programmes in radiochemistry and related subjects have been dwindling, there are not nearly enough young scientists to replenish the expertise pool. The report calls for a new programme to develop nuclear forensic scientists that would involve funding research at universities, launching graduate scholarships and fellowships, as well as setting up internships for young scientists at the labs where this work is carried out. Stimulating industrial support of faculty positions is also deemed important. Indeed, at least three or four new postdocs need to be hired into nuclear forensics every year for the next 10 years, the report says. It also recognizes that more research is needed to develop new lab and field equipment, and to create better numerical-simulation techniques.

  6. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year co-operative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates, and visiting university faculty. The primary mission of RIACS, as chartered, is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  7. Computational chemistry at Janssen

    NASA Astrophysics Data System (ADS)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2017-03-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  8. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  9. The Study Team for Early Life Asthma Research (STELAR) consortium ‘Asthma e-lab’: team science bringing data, methods and investigators together

    PubMed Central

    Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela

    2015-01-01

    We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. The e-Lab facilitates the iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing the collective thought behind the interpretations of findings. PMID:25805205

  10. Professional Ethics for Climate Scientists

    NASA Astrophysics Data System (ADS)

    Peacock, K.; Mann, M. E.

    2014-12-01

    Several authors have warned that climate scientists sometimes exhibit a tendency to "err on the side of least drama" in reporting the risks associated with fossil fuel emissions. Scientists are often reluctant to comment on the implications of their work for public policy, despite the fact that because of their expertise they may be among those best placed to make recommendations about such matters as mitigation and preparedness. Scientists often have little or no training in ethics or philosophy, and consequently they may feel that they lack clear guidelines for balancing the imperative to avoid error against the need to speak out when it may be ethically required to do so. This dilemma becomes acute in cases such as abrupt ice sheet collapse where it is easier to identify a risk than to assess its probability. We will argue that long-established codes of ethics in the learned professions such as medicine and engineering offer a model that can guide research scientists in cases like this, and we suggest that ethical training could be regularly incorporated into graduate curricula in fields such as climate science and geology. We recognize that there are disanalogies between professional and scientific ethics, the most important of which is that codes of ethics are typically written into the laws that govern licensed professions such as engineering. Presently, no one can legally compel a research scientist to be ethical, although legal precedent may evolve such that scientists are increasingly expected to communicate their knowledge of risks. We will show that the principles of professional ethics can be readily adapted to define an ethical code that could be voluntarily adopted by scientists who seek clearer guidelines in an era of rapid climate change.

  11. Tessera: Open source software for accelerated data science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.

  12. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste

    2016-08-03

    Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy,” (CDD Project) led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with the International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  13. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  14. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-04

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist.

  15. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    ERIC Educational Resources Information Center

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  16. Eckert, Wallace John (1902-71)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Computer scientist and astronomer. Born in Pittsburgh, PA, Eckert was a pioneer of the use of IBM punched card equipment for astronomical calculations. As director of the US Nautical Almanac Office he introduced computer methods to calculate and print tables instead of relying on human `computers'. When, later, he became director of the Watson Scientific Computing Laboratory at Columbia Universit...

  17. "I'm Good, but Not That Good": Digitally-Skilled Young People's Identity in Computing

    ERIC Educational Resources Information Center

    Wong, Billy

    2017-01-01

    Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their…

  18. Computers in Education: Realizing the Potential. Chairmen's Report of a Research Conference, Pittsburgh, Pennsylvania, November 20-24, 1982.

    ERIC Educational Resources Information Center

    Lesgold, Alan; Reif, Frederick

    The future of computers in education and the research needed to realize the computer's potential are discussed in this report, which presents a summary and the conclusions from an invitational conference involving 40 computer scientists, psychologists, educational researchers, teachers, school administrators, and parents. The summary stresses the…

  19. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management.
The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  20. Providing Climate Policy Makers With a Strong Scientific Base (Invited)

    NASA Astrophysics Data System (ADS)

    Struzik, E.

    2009-12-01

    Scientists can and should inform public policy decisions in the Arctic. But the pace of climate change in the polar world has been occurring far more quickly than most scientists have been able to predict. This creates problems for decision-makers who recognize that difficult management decisions have to be made in matters pertaining to wildlife management, cultural integrity and economic development. With sea ice melting, glaciers receding, permafrost thawing, forest fires intensifying, and disease and invasive species rapidly moving north, the challenge for scientists to provide climate policy makers with a strong scientific base has been daunting. Clashing as this data sometimes does with the “traditional knowledge” of indigenous peoples in the north, it can also become very political. As a result the need to effectively communicate complex data is more imperative now than ever before. Here, the author describes how the work of scientists can often be misinterpreted or exploited in ways that were not intended. Examples include the inappropriate use of scientific data in decision-making on polar bears, caribou and other wildlife populations; the use of scientific data to debunk the fact that greenhouse gases are driving climate change, and the use of scientific data to position one scientist against another when there is no inherent conflict. This work will highlight the need for climate policy makers to increase support for scientists working in the Arctic, as well as illustrate why it is important to find new and more effective ways of communicating scientific data. Strategies that might be considered by granting agencies, scientists and climate policy decision-makers will also be discussed.

  1. Perspective: Transforming science into medicine: how clinician-scientists can build bridges across research's "valley of death".

    PubMed

    Roberts, Scott F; Fischhoff, Martin A; Sakowski, Stacey A; Feldman, Eva L

    2012-03-01

    Significant increases in National Institutes of Health (NIH) spending on medical research have not produced corresponding increases in new treatments and cures. Instead, laboratory discoveries remain in what has been termed the "valley of death," the gap between bench research and clinical application. Recently, there has been considerable discussion in the literature and scientific community about the causes of this phenomenon and how to bridge the abyss. In this article, the authors examine one possible explanation: Clinician-scientists' declining role in the medical research enterprise has had a dilatory effect on the successful translation of laboratory breakthroughs into new clinical applications. In recent decades, the percentage of MDs receiving NIH funding has drastically decreased compared with PhDs. The growing gap between the research and clinical enterprises has resulted in fewer scientists with a true understanding of clinical problems as well as scientists who are unable to or uninterested in gleaning new basic research hypotheses from failed clinical trials. The NIH and many U.S. medical schools have recognized the decline of the clinician-scientist as a major problem and adopted innovative programs to reverse the trend. However, more radical action may be required, including major changes to the NIH peer-review process, greater funding for translational research, and significantly more resources for the training, debt relief, and early career support of potential clinician-scientists. Such improvements are required for clinician-scientists to conduct translational research that bridges the valley of death and transforms biomedical research discoveries into tangible clinical treatments and technologies.

  2. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis, D. K.

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investment made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  3. Algorithmics - Is There Hope for a Unified Theory?

    NASA Astrophysics Data System (ADS)

    Hromkovič, Juraj

    Computer science was born with the formal definition of the notion of an algorithm. This definition provides clear limits of automatization, separating problems into algorithmically solvable problems and algorithmically unsolvable ones. The second big bang of computer science was the development of the concept of computational complexity. People recognized that problems that do not admit efficient algorithms are not solvable in practice. The search for a reasonable, clear and robust definition of the class of practically solvable algorithmic tasks started with the notion of the class P and of NP-completeness. In spite of the fact that this robust concept is still fundamental for judging the hardness of computational problems, a variety of approaches was developed for solving instances of NP-hard problems in many applications. Our short, 40-year attempt to fix the fuzzy border between the practically solvable problems and the practically unsolvable ones is partly reminiscent of the never-ending search for the definition of "life" in biology or for the definitions of matter and energy in physics. Can the search for the formal notion of "practical solvability" also become a never-ending story, or is there hope for getting a well-accepted, robust definition of it? Hopefully, it is not surprising that we are not able to answer this question in this invited talk. But to deal with this question is of crucial importance, because only through enormous effort do scientists get a better and better feeling for what the fundamental notions of science like life and energy mean. In the flow of numerous technical results, we must not forget the fact that most of the essential revolutionary contributions to science were done by defining new concepts and notions.

  4. 75 FR 60820 - United States v. Adobe Systems, Inc., et al.; Proposed Final Judgment and Competitive Impact...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-01

    ... compete for high tech employees, and in particular specialized computer science and engineering talent on the basis of salaries, benefits, and career opportunities. In recent years, talented computer... Venue 4. Each Defendant hires specialized computer engineers and scientists throughout the United States...

  5. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  6. Computer Art--A New Tool in Advertising Graphics.

    ERIC Educational Resources Information Center

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  7. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  8. 1001 Ways to run AutoDock Vina for virtual screening

    NASA Astrophysics Data System (ADS)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  9. 1001 Ways to run AutoDock Vina for virtual screening.

    PubMed

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  10. hackseq: Catalyzing collaboration between biological and computational scientists via hackathon.

    PubMed

    2017-01-01

    hackseq (http://www.hackseq.com) was a genomics hackathon with the aim of bringing together a diverse set of biological and computational scientists to work on collaborative bioinformatics projects. In October 2016, 66 participants from nine nations came together for three days for hackseq and collaborated on nine projects ranging from data visualization to algorithm development. The response from participants was overwhelmingly positive with 100% (n = 54) of survey respondents saying they would like to participate in future hackathons. We detail key steps for others interested in organizing a successful hackathon and report excerpts from each project.

  11. hackseq: Catalyzing collaboration between biological and computational scientists via hackathon

    PubMed Central

    2017-01-01

    hackseq (http://www.hackseq.com) was a genomics hackathon with the aim of bringing together a diverse set of biological and computational scientists to work on collaborative bioinformatics projects. In October 2016, 66 participants from nine nations came together for three days for hackseq and collaborated on nine projects ranging from data visualization to algorithm development. The response from participants was overwhelmingly positive with 100% (n = 54) of survey respondents saying they would like to participate in future hackathons. We detail key steps for others interested in organizing a successful hackathon and report excerpts from each project. PMID:28417000

  12. Public land, timber harvests, and climate mitigation: quantifying carbon sequestration potential on U.S. public timberlands

    Treesearch

    Brooks M. Depro; Brian C. Murray; Ralph J. Alig; Alyssa Shanks

    2008-01-01

    Scientists and policymakers have long recognized the role that forests can play in countering the atmospheric buildup of carbon dioxide (CO2), a greenhouse gas (GHG). In the United States, terrestrial carbon sequestration in private and public forests offsets approximately 11 percent of all GHG emissions from all sectors of the economy annually....

  13. The thrill of scientific discovery and leadership with my group

    PubMed Central

    Greco, Valentina

    2016-01-01

    My group and I feel tremendously honored to be recognized with the 2016 Early Career Life Scientist Award from the American Society for Cell Biology. In this essay I share the scientific questions that my lab has been excitedly pursuing since starting in August 2009 and the leadership behaviors we have adopted that enable our collective scientific productivity. PMID:27799490

  14. Wisdom from the little folk: the forest tales of birds, squirrels, and fungi.

    Treesearch

    Sally Duncan

    1999-01-01

    Ecosystem function—the internal dynamics of a forest—is now recognized as a crucial component to forest health and biological diversity. Pacific Northwest Research Station scientist Andy Carey and others propose that the presence of small critters can be a measure of a forest's health. His research also shows that thinning, rather than...

  15. Save the Penguins: Teaching the Science of Heat Transfer through Engineering Design

    ERIC Educational Resources Information Center

    Schnittka, Christine; Bell, Randy; Richards, Larry

    2010-01-01

    Engineers, scientists, and environmental groups around the globe are hard at work finding solutions to mitigate or halt global warming. One major goal of the curriculum described here, Save the Penguins, is to help students recognize that what we do at home can affect how penguins fare in the Southern Hemisphere. In addition, students learn how…

  16. Decision Making in the Biological Field. The 1971 W. O. Atwater Memorial Lecture.

    ERIC Educational Resources Information Center

    Mayer, Jean

    Established in 1967 by the Agriculture Research Service of the U. S. Department of Agriculture to honor the memory of a gifted scientist . . . and to recognize accomplishment in a field or discipline that relates to the problem of nutrition and food production, the W. O. Atwater Memorial Lecture invited Dr. Jean Mayer, Professor of Nutrition at…

  17. 29ièmes Journées Franco-Belges de Pharmacochimie: Meeting Report

    PubMed Central

    2015-01-01

    The “Journées Franco-Belges de Pharmacochimie” is a recognized two-day annual meeting on Medicinal Chemistry that is renowned for the advanced science presented, conviviality, and outstanding opportunities for senior and young scientists to exchange knowledge. Abstracts of plenary lectures, oral communications, and posters presented during the meeting are collected in this report. PMID:26593925

  18. AACR Honors Louis Staudt with Princess Takamatsu Memorial Lectureship | Center for Cancer Research

    Cancer.gov

    The American Association for Cancer Research has awarded Louis M. Staudt, Co-Chief of CCR’s Lymphoid Malignancies Branch, its Princess Takamatsu Memorial Lectureship. The lectureship recognizes scientists whose work has had or may have a far-reaching impact on the detection, diagnosis, treatment or prevention of cancer and who embody the dedication of the princess to

  19. Society News: RAS Awards 2012; Prof. Andy Fabian; Prof. John C Brown; Prof. Andrew Fazakerley; Dr. Mike Irwin; Joss Bland-Hawthorn

    NASA Astrophysics Data System (ADS)

    2012-02-01

    Each year the RAS recognizes outstanding achievement in astronomy and geophysics by the award of medals and prizes. Candidates are nominated by Fellows and the awards made by a committee of Fellows, ensuring that these scientists have earned the respect and admiration of their peers in the research community.

  20. Hydrological processes and pathways affected by forest roads: what do we still need to learn?

    Treesearch

    Charles H. Luce

    2002-01-01

    Forest roads are an important environmental issue. While many scientists interested in hydrology recognize climate-altering processes as an important global issue, there are problems that are similar in scope and magnitude because human industriousness has brought them to so many parts of the world. Almost everywhere people live and work they build and use unimproved...

  1. Prehistoric human influence on the abundance and distribution of deadwood in alpine landscapes

    Treesearch

    Donald K. Grayson; Constance I. Millar

    2008-01-01

    Scientists have long inferred the locations of past treelines from the distribution of deadwood above modern tree boundaries. Although it is recognized that deadwood above treeline may have decayed, the absence of such wood is routinely taken to imply the absence of trees for periods ranging from the past few millennia to the entire Holocene. Reconstructed treeline...

  2. Science Outreach and the Religious Public: The Source Makes All the Difference

    NASA Astrophysics Data System (ADS)

    Davidson, G. R.; Hill, C.; Wolgemuth, K.

    2017-12-01

    Public resistance to well-established scientific understanding has been a persistent problem in the US. Decades of improved educational materials, upgraded K-12 standards, and several successful court battles to curb anti-science influences did little to change the percentage of Americans resistant to even considering the evidence for subjects such as evolution or ancient Earth history. Research in the social sciences suggests that one reason has been a failure to recognize the importance of the source of information. Studies have documented that people are more receptive to challenging viewpoints when the advocate (the source) is recognized as a member of their own group or "tribe." The personal worldview or group-identity of an expert can determine how willing an audience is to consider the argument, much more so than the expert's scientific credentials. For a religious audience, this means that the quality of educational materials and the strength of an argument may be irrelevant if delivered by someone known to be dismissive of fundamental religious beliefs. In contrast, significant inroads have been realized with the religious public when scientists of faith have taken a pro-science message to members of their own religious affiliations. Encouraging stories are coming from outreach efforts of organizations and programs such as BioLogos, American Scientific Affiliation, Solid Rock Lectures, and AAAS Dialogue on Science, Ethics, and Religion. Secular scientists interested in outreach can benefit greatly by keeping a short list of resources (blogs, books, speakers) by religious scientists advocating for the legitimacy of modern science, or by directly teaming with scientists of faith. A recent example from our own efforts includes an 11-author book, The Grand Canyon, Monument to an Ancient Earth, aimed primarily at the Christian public to explain why Noah's flood does not explain the planet's complex geology. Eight authors are Christians and three are not.

  3. What Physicists Should Know About High Performance Computing - Circa 2002

    NASA Astrophysics Data System (ADS)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the various disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-cpu optimization, compilers, timing, numerical libraries, debugging and profiling tools and the emergence of Computational Grids.

  4. Establishment of an Undergraduate Research and Training Program in Radiochemistry at Florida Memorial University, a Historically Black College or University (HBCU)

    NASA Astrophysics Data System (ADS)

    Tamalis, Dimitri; Stiffin, Rose; Elliott, Michael; Huisso, Ayivi; Biegalski, Steven; Landsberger, Sheldon

    2009-08-01

    With the passing of the Energy Policy Act of 2005, the United States is experiencing for the first time in over two decades what some refer to as the "Nuclear Renaissance". The US Nuclear Regulatory Commission (NRC) recognizes this surge in application submissions and is committed to reviewing these applications in a timely manner to support the country's growing energy demands. Notwithstanding these facts, it is understood that the nuclear industry requires appropriately trained and educated personnel to support the growing needs of the nuclear industry and the US NRC. Equally important is the need to educate the next generation of students in nuclear non-proliferation, nuclear forensics and various aspects of homeland security for the national laboratories and the Department of Defense. From mechanical engineers educated and experienced in materials, thermal/fluid dynamics, and component failure analysis, to physicists using advanced computing techniques to design the next generation of nuclear reactor fuel elements, the need for new engineers, scientists, and health physicists has never been greater.

  5. Towards On-Line Services Based on a Holistic Analysis of Human Activities

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2004-01-01

    Very often computer scientists view computerization of services in terms of the logistics of human-machine interaction, including establishing a contract, accessing records, and of course designing an interface. But this analysis often moves too quickly to tactical details, failing to frame the entire service in human terms, and not recognizing the mutual learning required to define and relate goals, constraints, and the personalized value of available services. In particular, on-line services that "computerize communication" can be improved by constructing an activity model of what the person is trying to do, not just filtering, comparing, and selling piecemeal services. For example, from the customer's perspective the task of an on-line travel service is not merely to establish confirmed reservations, but to produce a complete travel plan, usually integrating many days of transportation, lodging, and recreation into a happy experience. The task of the travel agent is not merely "ticketing", but helping the customer understand what they want and providing services that will connect everything together in an enjoyable way.

  6. Community Decadal Panel for Terrestrial Analogs to Mars

    NASA Astrophysics Data System (ADS)

    Barlow, N. G.; Farr, T.; Baker, V. R.; Bridges, N.; Carsey, F.; Duxbury, N.; Gilmore, M. S.; Green, J. R.; Grin, E.; Hansen, V.; Keszthelyi, L.; Lanagan, P.; Lentz, R.; Marinangeli, L.; Morris, P. A.; Ori, G. G.; Paillou, P.; Robinson, C.; Thomson, B.

    2001-11-01

    It is well recognized that interpretations of Mars must begin with the Earth as a reference. The most successful comparisons have focused on understanding geologic processes on the Earth well enough to extrapolate to Mars' environment. Several facets of terrestrial analog studies have been pursued and are continuing. These studies include field workshops, characterization of terrestrial analog sites for Mars, instrument tests, laboratory measurements (including analysis of martian meteorites), and computer and laboratory modeling. The combination of all these activities allows scientists to constrain the processes operating in specific terrestrial environments and extrapolate how similar processes could affect Mars. The Terrestrial Analogs for Mars Community Panel is considering the following two key questions: (1) How do terrestrial analog studies tie in to the MEPAG science questions about life, past climate, and geologic evolution of Mars, and (2) How can future instrumentation be used to address these questions. The panel is considering the issues of data collection, value of field workshops, data archiving, laboratory measurements and modeling, human exploration issues, association with other areas of solar system exploration, and education and public outreach activities.

  7. Terrestrial Analogs to Mars

    NASA Astrophysics Data System (ADS)

    Farr, T. G.; Arcone, S.; Arvidson, R. W.; Baker, V.; Barlow, N. G.; Beaty, D.; Bell, M. S.; Blankenship, D. D.; Bridges, N.; Briggs, G.; Bulmer, M.; Carsey, F.; Clifford, S. M.; Craddock, R. A.; Dickerson, P. W.; Duxbury, N.; Galford, G. L.; Garvin, J.; Grant, J.; Green, J. R.; Gregg, T. K. P.; Guinness, E.; Hansen, V. L.; Hecht, M. H.; Holt, J.; Howard, A.; Keszthelyi, L. P.; Lee, P.; Lanagan, P. D.; Lentz, R. C. F.; Leverington, D. W.; Marinangeli, L.; Moersch, J. E.; Morris-Smith, P. A.; Mouginis-Mark, P.; Olhoeft, G. R.; Ori, G. G.; Paillou, P.; Reilly, J. F., II; Rice, J. W., Jr.; Robinson, C. A.; Sheridan, M.; Snook, K.; Thomson, B. J.; Watson, K.; Williams, K.; Yoshikawa, K.

    2002-08-01

    It is well recognized that interpretations of Mars must begin with the Earth as a reference. The most successful comparisons have focused on understanding geologic processes on the Earth well enough to extrapolate to Mars' environment. Several facets of terrestrial analog studies have been pursued and are continuing. These studies include field workshops, characterization of terrestrial analog sites, instrument tests, laboratory measurements (including analysis of Martian meteorites), and computer and laboratory modeling. The combination of all these activities allows scientists to constrain the processes operating in specific terrestrial environments and extrapolate how similar processes could affect Mars. The Terrestrial Analogs for Mars Community Panel has considered the following two key questions: (1) How do terrestrial analog studies tie in to the Mars Exploration Payload Assessment Group science questions about life, past climate, and geologic evolution of Mars, and (2) How can future instrumentation be used to address these questions. The panel has considered the issues of data collection, value of field workshops, data archiving, laboratory measurements and modeling, human exploration issues, association with other areas of solar system exploration, and education and public outreach activities.

  8. COMMUNITY CAPACITY BUILDING FOR REVITALIZATION AND SUSTAINABLE REDEVELOPMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, Melinda; Rosenthall, John; Hudson, Michelle

    2003-02-27

    Capacity building programs help poor and disadvantaged communities improve their ability to participate in environmental decision-making processes. They encourage citizen involvement and provide the tools that enable it. Capacity building enables communities that would otherwise be excluded to participate in the process, leading to better and more just decisions. The Department of Energy (DOE) continues to be committed to promoting environmental justice and involving its stakeholders more directly in the planning and decision-making process for environmental cleanup. DOE's Environmental Management Program (EM) is in full support of this commitment. Through its environmental justice project, EM provides communities with the capacity to effectively contribute to a complex technical decision-making process by furnishing access to computers, the Internet, training, and technical assistance. DOE's Dr. Samuel P. Massie Chairs of Excellence Program (Massie Chairs) function as technical advisors to many of these community projects. The Massie Chairs consist of nationally and internationally recognized engineers and scientists from nine Historically Black Colleges and Universities (HBCUs) and one Hispanic Serving Institution (HSI). This paper will discuss capacity building initiatives in various jurisdictions.

  9. Culture and Workplace Communications: A Comparison of the Technical Communications Practices of Japanese and U.S. Aerospace Engineers and Scientists.

    ERIC Educational Resources Information Center

    Pinelli, Thomas E.; Sato, Yuko; Barclay, Rebecca O.; Kennedy, John M.

    1997-01-01

    Japanese (n=94) and U.S. (n=340) aerospace scientists/engineers described time spent communicating information, collaborative writing, importance of technical communication courses, and the use of libraries, computer networks, and technical reports. Japanese respondents had greater language fluency; U.S. respondents spent more time with…

  10. MeDICi Software Superglue for Data Analysis Pipelines

    ScienceCinema

    Ian Gorton

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to meet the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust across multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.

  11. The Draw a Scientist Test: A Different Population and a Somewhat Different Story

    ERIC Educational Resources Information Center

    Thomas, Mark D.; Henley, Tracy B.; Snell, Catherine M.

    2006-01-01

    This study examined Draw-a-Scientist-Test (DAST) images solicited from 212 undergraduate students for the presence of traditional gender stereotypes. Participants were 100 males and 112 females enrolled in psychology or computer science courses with a mean age of 21.02 years. A standard multiple regression generated a model that accounts for the…

  12. Multiscale computing.

    PubMed

    Kobayashi, M; Irino, T; Sweldens, W

    2001-10-23

    Multiscale computing (MSC) involves the computation, manipulation, and analysis of information at different resolution levels. Widespread use of MSC algorithms and the discovery of important relationships between different approaches to implementation were catalyzed, in part, by the recent interest in wavelets. We present two examples that demonstrate how MSC can help scientists understand complex data. The first is from acoustical signal processing and the second is from computer graphics.
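    The multiresolution idea behind MSC can be illustrated with the simplest wavelet, the Haar transform, which splits a signal into coarse averages and fine details at each level. The sketch below is a generic illustration, not code from the cited paper; the function names are our own.

```python
def haar_decompose(signal):
    """One level of the (unnormalized) Haar wavelet transform:
    pairwise averages (coarse view) and pairwise differences (details)."""
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

def haar_reconstruct(averages, details):
    """Invert one level: each pair is (average + detail, average - detail)."""
    signal = []
    for a, d in zip(averages, details):
        signal.extend([a + d, a - d])
    return signal

# Analyze a signal at two resolution levels, then rebuild it exactly.
x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
coarse, detail1 = haar_decompose(x)          # length-4 coarse view
coarser, detail2 = haar_decompose(coarse)    # length-2 coarsest view
rebuilt = haar_reconstruct(haar_reconstruct(coarser, detail2), detail1)
print(coarse)    # [5.0, 11.0, 7.0, 5.0]
print(rebuilt == x)  # True: the decomposition is lossless
```

    Each level halves the resolution while the details retain exactly the information needed to invert the step, which is why the same data can be analyzed or compressed at whatever scale suits the problem.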

  13. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last few years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
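    The core abstraction described here, tasks plus dependencies executed on the scientist's behalf, can be sketched in a few lines. This toy scheduler is not drawn from the paper or from any of the surveyed systems; the task names and structure are purely illustrative. It topologically orders a small workflow and runs each task once its inputs are ready.

```python
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """tasks: name -> function taking a dict of upstream results.
    deps: name -> set of upstream task names. Returns all task results."""
    results = {}
    # static_order() yields tasks so every dependency precedes its dependents
    for name in TopologicalSorter(deps).static_order():
        inputs = {d: results[d] for d in deps.get(name, ())}
        results[name] = tasks[name](inputs)
    return results

# A tiny three-step pipeline: fetch -> clean -> analyze.
tasks = {
    "fetch":   lambda _: [3, 1, 2, None],
    "clean":   lambda r: [v for v in r["fetch"] if v is not None],
    "analyze": lambda r: sum(r["clean"]) / len(r["clean"]),
}
deps = {"clean": {"fetch"}, "analyze": {"clean"}}
print(run_workflow(tasks, deps)["analyze"])  # 2.0
```

    Real workflow management systems add exactly what this sketch omits: parallel and distributed execution, data staging, fault tolerance, and provenance tracking, which is where the extreme-scale challenges discussed in the paper arise.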

  14. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last few years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  15. Information processing, computation, and cognition

    PubMed Central

    Scarantino, Andrea

    2010-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958

  16. Data-driven non-linear elasticity: constitutive manifold construction and problem discretization

    NASA Astrophysics Data System (ADS)

    Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco

    2017-11-01

    The use of constitutive equations calibrated from data has been implemented into standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, the complexity keeps increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, etc.), whereas the second consists of models that scientists have extracted from collected data, either natural or synthetic. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of it inaccessible to today's testing facilities. This difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.

  17. Toward a framework for computer-mediated collaborative design in medical informatics.

    PubMed

    Patel, V L; Kaufman, D R; Allen, V G; Shortliffe, E H; Cimino, J J; Greenes, R A

    1999-09-01

    The development and implementation of enabling tools and methods that provide ready access to knowledge and information are among the central goals of medical informatics. The need for multi-institutional collaboration in the development of such tools and methods is increasingly being recognized. Collaboration involves communication, which typically involves individuals who work together at the same location. With the evolution of electronic modalities for communication, we seek to understand the role that such technologies can play in supporting collaboration, especially when the participants are geographically separated. Using the InterMed Collaboratory as a subject of study, we have analyzed their activities as an exercise in computer- and network-mediated collaborative design. We report on the cognitive, sociocultural, and logistical issues encountered when scientists from diverse organizations and backgrounds use communications technologies while designing and implementing shared products. Results demonstrate that it is important to match carefully the content with the mode of communication, identifying, for example, suitable uses of E-mail, conference calls, and face-to-face meetings. The special role of leaders in guiding and facilitating the group activities can also be seen, regardless of the communication setting in which the interactions occur. Most important is the proper use of technology to support the evolution of a shared vision of group goals and methods, an element that is clearly necessary before successful collaborative designs can proceed.

  18. Credibility and advocacy in conservation science

    PubMed Central

    Horton, Cristi C.; Peterson, Tarla Rai; Banerjee, Paulami

    2015-01-01

    Conservation policy sits at the nexus of natural science and politics. On the one hand, conservation scientists strive to maintain scientific credibility by emphasizing that their research findings are the result of disinterested observations of reality. On the other hand, conservation scientists are committed to conservation even if they do not advocate a particular policy. The professional conservation literature offers guidance on negotiating the relationship between scientific objectivity and political advocacy without damaging conservation science's credibility. The value of this guidance, however, may be restricted by limited recognition of credibility's multidimensionality and emergent nature: it emerges through perceptions of expertise, goodwill, and trustworthiness. We used content analysis of the literature to determine how credibility is framed in conservation science as it relates to apparent contradictions between science and advocacy. Credibility typically was framed as a static entity lacking dimensionality. Authors identified expertise or trustworthiness as important, but rarely mentioned goodwill. They usually did not identify expertise, goodwill, or trustworthiness as dimensions of credibility or recognize interactions among these 3 dimensions of credibility. This oversimplification may limit the ability of conservation scientists to contribute to biodiversity conservation. Accounting for the emergent quality and multidimensionality of credibility should enable conservation scientists to advance biodiversity conservation more effectively. PMID:26041036

  19. Mobile Devices and GPU Parallelism in Ionospheric Data Processing

    NASA Astrophysics Data System (ADS)

    Mascharka, D.; Pankratius, V.

    2015-12-01

    Scientific data acquisition in the field is often constrained by data transfer backchannels to analysis environments. Geoscientists are therefore facing practical bottlenecks with increasing sensor density and variety. Mobile devices, such as smartphones and tablets, offer promising solutions to key problems in scientific data acquisition, pre-processing, and validation by providing advanced capabilities in the field. This is due to affordable network connectivity options and the increasing mobile computational power. This contribution exemplifies a scenario faced by scientists in the field and presents the "Mahali TEC Processing App" developed in the context of the NSF-funded Mahali project. Aimed at atmospheric science and the study of ionospheric Total Electron Content (TEC), this app is able to gather data from various dual-frequency GPS receivers. It demonstrates parsing of full-day RINEX files on mobile devices and on-the-fly computation of vertical TEC values based on satellite ephemeris models that are obtained from NASA. Our experiments show how parallel computing on the mobile device GPU enables fast processing and visualization of up to 2 million datapoints in real-time using OpenGL. GPS receiver bias is estimated through minimum TEC approximations that can be interactively adjusted by scientists in the graphical user interface. Scientists can also perform approximate computations for "quickviews" to reduce CPU processing time and memory consumption. In the final stage of our mobile processing pipeline, scientists can upload data to the cloud for further processing. Acknowledgements: The Mahali project (http://mahali.mit.edu) is funded by the NSF INSPIRE grant no. AGS-1343967 (PI: V. Pankratius). We would like to acknowledge our collaborators at Boston College, Virginia Tech, Johns Hopkins University, Colorado State University, as well as the support of UNAVCO for loans of dual-frequency GPS receivers for use in this project, and Intel for loans of smartphones.
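    The dual-frequency relationship behind TEC processing can be sketched as follows: the ionospheric group delay scales as 1/f², so the difference between pseudoranges on the two GPS carriers is proportional to the slant TEC, and a thin-shell mapping function converts slant to vertical TEC. This is a generic textbook sketch, not code from the Mahali app; receiver and satellite biases (which the app estimates separately) are ignored, and the shell height is an assumed value.

```python
import math

F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies (Hz)
K = 40.3                        # ionospheric refraction constant (m^3/s^2)
RE, H_SHELL = 6371e3, 350e3     # Earth radius and assumed thin-shell height (m)

def slant_tec(p1, p2):
    """Slant TEC in TECU (1 TECU = 1e16 el/m^2) from L1/L2 pseudoranges (m)."""
    stec = (F1**2 * F2**2) / (K * (F1**2 - F2**2)) * (p2 - p1)
    return stec / 1e16

def vertical_tec(stec_tecu, elevation_deg):
    """Map slant TEC to vertical with a single-layer mapping function."""
    e = math.radians(elevation_deg)
    mapping = 1.0 / math.sqrt(1.0 - (RE * math.cos(e) / (RE + H_SHELL))**2)
    return stec_tecu / mapping

# A 1 m L2-L1 delay difference corresponds to roughly 9.5 TECU of slant TEC;
# at zenith (90 deg elevation) the mapping function is 1, so vertical = slant.
s = slant_tec(p1=20000000.0, p2=20000001.0)
print(round(s, 1), round(vertical_tec(s, 90.0), 1))  # 9.5 9.5
```

    At low elevations the ray path through the ionosphere lengthens, so the mapping function grows and the inferred vertical TEC drops accordingly.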

  20. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  1. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  2. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences

    PubMed Central

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org. PMID:26401099

  3. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  4. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences.

    PubMed

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org.

  5. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  6. Technology transfer opportunities: new development: computerized field manual provides valuable resource for hydrologic investigations

    USGS Publications Warehouse

    Chapel, Paul

    1996-01-01

    The U.S. Geological Survey (USGS) is known throughout the world for conducting quality scientific investigations in hydrologic environments. Proper and consistent field techniques have been an integral part of this research. Over the past few decades, the USGS has developed and published detailed, standard protocols for conducting studies in most aspects of the hydrologic environment. These protocols have been published in a number of diverse documents. The wealth of information contained in these documents can benefit other scientists in industry, government, and academia who are involved in conducting hydrologic studies. Scientists at the USGS have brought together many of the most important field protocols in a user-friendly, graphical-interface field manual that will be useful both in the field and in the office. This electronic field manual can assist hydrologists and other scientists in conducting and documenting their field activities in a manner that is recognized as standard throughout the hydrologic community.

  7. Meet Temilola Fatoyinbo-Agueh

    NASA Image and Video Library

    2017-12-08

    President Obama has named six NASA individuals as recipients of the 2011 Presidential Early Career Award for Scientists and Engineers (PECASE). Temilola "Lola" Fatoyinbo-Agueh, an environmental scientist from NASA's Goddard Space Flight Center, Greenbelt, Md., was one of the recipients. The PECASE awards represent the highest honor bestowed by the U.S. government on scientists and engineers beginning their independent careers. They recognize recipients' exceptional potential for leadership at the frontiers of scientific knowledge, and their commitment to community service as demonstrated through professional leadership, education or community outreach. To read more go to: www.nasa.gov/centers/goddard/news/releases/2012/12-064.html Credit: NASA/GSFC/Chris Gunn

  8. Mentoring Among Scientists: Implications of Interpersonal Relationships within a Formal Mentoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan D. Maughan

    2006-11-01

    Mentoring is an established strategy for learning that has its roots in antiquity. Most, if not all, successful scientists and engineers had an effective mentor at some point in their careers. In the context of scientists and engineers, however, mentoring has remained largely undefined. Reports addressing critical concerns regarding the future of science and engineering in the U.S. mention the practice of mentoring a priori, leaving organizations without guidance in its application. Preliminary results from this study imply that formal mentoring can be effective when properly defined and operationalized. Recognizing the uniqueness of the individual in a symbiotic mentor-protégé relationship significantly influences a protégé's learning experience, which carries repercussions into their career intentions. The mentor-protégé relationship is a key factor in succession planning and in preserving and disseminating critical information and tacit knowledge essential to the development of leadership in the science and technology industry.

  9. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Laboratory, on-campus research involving undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one to a master's degree program and two to doctoral degree programs).

  10. Staff | Computational Science | NREL

    Science.gov Websites

    Develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. Staff listing (excerpt): Albin, Jim, IT Professional IV (High Performance Computing), Jim.Albin@nrel.gov, 303-275-4069; Ananthan, Shreyas, Senior Scientist (High-Performance Algorithms and Modeling), Shreyas.Ananthan@nrel.gov, 303-275-4807; Bendl, Kurt, IT Professional IV-High

  11. Parallel computer vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uhr, L.

    1987-01-01

    This book is written by research scientists involved in the development of massively parallel, but hierarchically structured, algorithms, architectures, and programs for image processing, pattern recognition, and computer vision. The book gives an integrated picture of the programs and algorithms that are being developed, and also of the multi-computer hardware architectures for which these systems are designed.

  12. How to Teach Residue Number System to Computer Scientists and Engineers

    ERIC Educational Resources Information Center

    Navi, K.; Molahosseini, A. S.; Esmaeildoust, M.

    2011-01-01

    The residue number system (RNS) has been an important research field in computer arithmetic for many decades, mainly because of its carry-free nature, which can provide high-performance computing architectures with superior delay specifications. Recently, research on RNS has found new directions that have resulted in the introduction of efficient…
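The carry-free property the abstract highlights can be made concrete with a small sketch (illustrative only, not drawn from the cited article): each integer is stored as its residues modulo a set of pairwise-coprime moduli, addition and multiplication proceed independently in each residue channel with no carry propagation between channels, and the Chinese Remainder Theorem recovers the conventional result.

```python
# Minimal residue number system (RNS) sketch. The moduli are an
# arbitrary illustrative choice; any pairwise-coprime set works.
from math import prod

MODULI = (3, 5, 7)          # pairwise coprime; dynamic range M = 105
M = prod(MODULI)

def to_rns(x):
    """Encode an integer as its tuple of residues."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    # Each channel is independent: no carries cross between moduli.
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Chinese Remainder Theorem reconstruction (valid for results < M)."""
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)   # modular inverse (Python 3.8+)
    return total % M

assert from_rns(rns_add(to_rns(17), to_rns(23))) == 40
assert from_rns(rns_mul(to_rns(6), to_rns(9))) == 54
```

Because each channel is small and independent, the channel operations can run in parallel, which is the source of the delay advantage the abstract mentions.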

  13. A Research and Development Strategy for High Performance Computing.

    ERIC Educational Resources Information Center

    Office of Science and Technology Policy, Washington, DC.

    This report is the result of a systematic review of the status and directions of high performance computing and its relationship to federal research and development. Conducted by the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), the review involved a series of workshops attended by numerous computer scientists and…

  14. Relevancy in Problem Solving: A Computational Framework

    ERIC Educational Resources Information Center

    Kwisthout, Johan

    2012-01-01

    When computer scientists discuss the computational complexity of, for example, finding the shortest path from building A to building B in some town or city, their starting point typically is a formal description of the problem at hand, e.g., a graph with weights on every edge where buildings correspond to vertices, routes between buildings to…
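The formalization the abstract describes can be sketched directly (a hypothetical toy example, not taken from the article): buildings become vertices, routes become weighted edges, and a standard algorithm such as Dijkstra's computes the shortest path.

```python
# Illustrative sketch: shortest path on a weighted graph via Dijkstra's
# algorithm. The town layout below is invented for this example.
import heapq

def shortest_path_length(graph, start, goal):
    """graph: dict mapping vertex -> list of (neighbor, weight) pairs."""
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # goal unreachable

town = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}
assert shortest_path_length(town, "A", "D") == 8  # A -> C -> B -> D
```

The complexity analysis the abstract alludes to then applies to this formal object: with a binary heap the algorithm runs in O((V + E) log V) for V vertices and E edges.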

  15. Cultivating Critique: A (Humanoid) Response to the Online Teaching of Critical Thinking

    ERIC Educational Resources Information Center

    Waggoner, Matt

    2013-01-01

    The Turing era, defined by British mathematician and computer science pioneer Alan Turing's question about whether or not computers can think, is not over. Philosophers and scientists will continue to haggle over whether thought necessitates intentionality, and whether computation can rise to that level. Meanwhile, another frontier is emerging in…

  16. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is in the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
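As a hypothetical illustration of the kind of graph-based method the abstract refers to (the data, threshold, and function names here are invented for this sketch), grid locations can become nodes, with an edge linking any two locations whose time series are strongly correlated; connected structure in the resulting graph can then suggest coherent climate regions.

```python
# Toy sketch of a correlation network over spatio-temporal data.
# All series and the 0.9 threshold are made-up illustrative values.
from itertools import combinations

def pearson(x, y):
    """Pearson correlation of two equal-length, non-constant series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def correlation_graph(series, threshold=0.9):
    """series: dict node -> time series. Returns an adjacency dict."""
    adj = {node: set() for node in series}
    for u, v in combinations(series, 2):
        if abs(pearson(series[u], series[v])) >= threshold:
            adj[u].add(v)
            adj[v].add(u)
    return adj

series = {
    "p1": [1.0, 2.0, 3.0, 4.0],
    "p2": [2.1, 4.0, 6.2, 8.1],   # closely tracks p1
    "p3": [5.0, 1.0, 4.0, 0.5],   # unrelated
}
g = correlation_graph(series)
assert "p2" in g["p1"]
assert not g["p3"]
```

Real climate-network studies operate on thousands of grid points and use more careful statistics (detrending, significance testing), but the node/edge construction follows the same pattern.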

  17. ARES Biennial Report 2012 Final

    NASA Technical Reports Server (NTRS)

    Stansbery, Eileen

    2014-01-01

    Since the return of the first lunar samples, what is now the Astromaterials Research and Exploration Science (ARES) Directorate has had curatorial responsibility for all NASA-held extraterrestrial materials. Originating during the Apollo Program (1960s), this capability at Johnson Space Center (JSC) included scientists who were responsible for the science planning and training of astronauts for lunar surface activities as well as experts in the analysis and preservation of the precious returned samples. Today, ARES conducts research in basic and applied space and planetary science, and its scientific staff represents a broad diversity of expertise in the physical sciences (physics, chemistry, geology, astronomy), mathematics, and engineering organized into three offices (figure 1): Astromaterials Research (KR), Astromaterials Acquisition and Curation (KT), and Human Exploration Science (KX). Scientists within the Astromaterials Acquisition and Curation Office preserve, protect, document, and distribute samples of the current astromaterials collections. Since the return of the first lunar samples, ARES has been assigned curatorial responsibility for all NASA-held extraterrestrial materials (Apollo lunar samples, Antarctic meteorites - some of which have been confirmed to have originated on the Moon and on Mars - cosmic dust, solar wind samples, comet and interstellar dust particles, and space-exposed hardware). The responsibilities of curation consist not only of the long-term care of the samples, but also the support and planning for future sample collection missions and research and technology to enable new sample types. Curation provides the foundation for research into the samples.
The Lunar Sample Facility and other curation clean rooms, the data center, laboratories, and associated instrumentation are unique NASA resources that, together with our staff's fundamental understanding of the entire collection, provide a service to the external research community, which relies on access to the samples. The curation efforts are greatly enhanced by a strong group of planetary scientists who conduct peer-reviewed astromaterials research. Astromaterials Research Office scientists conduct peer-reviewed research as Principal or Co-Investigators in planetary science (e.g., cosmochemistry, origins of solar systems, Mars fundamental research, planetary geology and geophysics) and participate as Co-Investigators or Participating Scientists in many of NASA's robotic planetary missions. Since the last report, ARES has achieved several noteworthy milestones, some of which are documented in detail in the sections that follow. Within the Human Exploration Science Office, ARES is a world leader in orbital debris research, modeling and monitoring the debris environment, designing debris shielding, and developing policy to control and mitigate the orbital debris population. ARES has aggressively pursued refinements in knowledge of the debris environment and the hazard it presents to spacecraft. Additionally, the ARES Image Science and Analysis Group has been recognized as world class as a result of the high quality of near-real-time analysis of ascent and on-orbit inspection imagery to identify debris shedding, anomalies, and associated potential damage during Space Shuttle missions. ARES Earth scientists manage and continuously update the database of astronaut photography that is predominantly from Shuttle and ISS missions, but also includes the results of 40 years of human spaceflight. The Crew Earth Observations Web site (http://eol.jsc.nasa.gov/Education/ESS/crew.htm) continues to receive several million hits per month.
ARES scientists are also influencing decisions in the development of the next generation of human and robotic spacecraft and missions through laboratory tests on the optical qualities of materials for windows, micrometeoroid/orbital debris shielding technology, and analog activities to assess surface science operations. ARES serves as host to numerous students and visiting scientists as part of the services provided to the research community and conducts a robust education and outreach program. ARES scientists are recognized nationally and internationally by virtue of their success in publishing in peer-reviewed journals and winning competitive research proposals. ARES scientists have won every major award presented by the Meteoritical Society, including the Leonard Medal, the most prestigious award in planetary science and cosmochemistry; the Barringer Medal, recognizing outstanding work in the field of impact cratering; the Nier Prize for outstanding research by a young scientist; and several recipients of the Nininger Meteorite Award. One of our scientists received the Department of Defense (DoD) Joint Meritorious Civilian Service Award (the highest civilian honor given by the DoD). ARES has established numerous partnerships with other NASA Centers, universities, and national laboratories. ARES scientists serve as journal editors, members of advisory panels and review committees, and society officers, and several scientists have been elected as Fellows in their professional societies. This biennial report summarizes a subset of the accomplishments made by each of the ARES offices and highlights participation in ongoing human and robotic missions, development of new missions, and planning for future human and robotic exploration of the solar system beyond low Earth orbit.

  18. In Brief: Nominations requested for U.S. science medals

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-02-01

    Scientists can help recognize the contributions of colleagues by submitting nominations for the National Medal of Science and the National Medal of Technology and Innovation, which are the highest honors the president bestows in science, technology, and innovation. The National Medal of Science, the nation's highest honor for American scientists and engineers, is given to individuals deserving special recognition for outstanding contributions to knowledge, or the total impact of their work, in the chemical, physical, biological, mathematical, engineering, or behavioral sciences. Nominations and three letters of support must be submitted by 31 March. For more information, contact program manager Mayra Montrose at nms@nsf.gov or +1-703-292-8040, or visit http://www.nsf.gov/od/nms/medal.jsp.

  19. Scientific and Regulatory Perspectives in Herbal and Dietary Supplement Associated Hepatotoxicity in the United States

    PubMed Central

    Avigan, Mark I.; Mozersky, Robert P.; Seeff, Leonard B.

    2016-01-01

    In the United States (US), the risk of hepatotoxicity linked to the widespread use of certain herbal products has gained increased attention among regulatory scientists. Based on current US law, all dietary supplements sold domestically, including botanical supplements, are regulated by the Food and Drug Administration (FDA) as a special category of foods. Under this designation, regulatory scientists do not routinely evaluate the efficacy of these products prior to their marketing, despite the content variability and phytochemical complexity that often characterizes them. Nonetheless, there has been notable progress in the development of advanced scientific methods to qualitatively and quantitatively measure ingredients and screen for contaminants and adulterants in botanical products when hepatotoxicity is recognized. PMID:26950122

  20. Benefits of an inclusive US education system.

    PubMed

    Gantt, Elisabeth

    2013-01-01

    Presented is a historical perspective of one scientist's journey from war-torn Europe to the opportunities presented by a flexible US educational system. It celebrates the opening of the science establishment that began in the 1950s and its fostering of basic research, and recognizes individuals who were instrumental in guiding the author's education as well as those with whom she later participated in collaborative algal plant research. The initial discovery and later elucidation of phycobilisome structure are elaborated, including the structural connection with photosystem II. Furthermore, she summarizes some of her laboratory's results on carotenoids and its exploration of the isoprenoid pathway in cyanobacteria. Finally, she comments on the gender gap and how her generation benefited when opportunities for women scientists were enlarged.

  1. The house of the future

    ScienceCinema

    None

    2017-12-09

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  2. The house of the future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  3. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  4. 30ièmes Journées Franco-Belges de Pharmacochimie

    PubMed Central

    Aci-Sèche, Samia; Buron, Frédéric; Plé, Karen; Robin, Laurent; Suzenet, Franck; Routier, Sylvain

    2016-01-01

    The “Journées Franco-Belges de Pharmacochimie” is a recognized annual meeting in organic and medicinal chemistry known for the quality of scientific exchange and conviviality. Young researchers were encouraged to present their work and share ideas with senior scientists. Abstracts of plenary lectures, oral communications, and posters presented during the meeting are collected in this report. PMID:27869720

  5. T-Cell Warriors—Equipped to Kill Cancer Cells | Center for Cancer Research

    Cancer.gov

    When the body recognizes tumor cells as foreign, a natural immune response arises to attack them. Unfortunately, tumors have ways to evade immune surveillance systems and antitumor responses are often too weak to defeat the disease. Rather than relying on the body’s natural response, scientists can now manipulate a patient’s own immune cells so that they latch on to tumor

  6. Information Superhighway: The Role of Librarians, Information Scientists, and Intermediaries. Proceedings of the International Essen Symposium (17th, Essen, Germany, October 24-27, 1994).

    ERIC Educational Resources Information Center

    Helal, Ahmed H., Ed.; Weiss, Joachim W., Ed.

    The emphasis of the symposium was the Internet, or information superhighway, and the provision of information services to end users. Many internationally recognized librarians shared their experiences and expressed their ideas on new developments and possibilities related to the information superhighway. The 34 papers presented at the symposium…

  7. Steltzer Receives 2013 Sulzman Award for Excellence in Education and Mentoring: Response

    NASA Astrophysics Data System (ADS)

    Steltzer, Heidi

    2014-07-01

    I am honored to receive the AGU Sulzman Award and am especially honored to be the first recipient of this award established in memory of Elizabeth Sulzman. At the 2013 Fall Meeting, I learned about the thought and work that went into establishing this award and want to thank all who contributed to its establishment. Awards that recognize outstanding female scientists are needed.

  8. Summer of Innovation Kick Off

    NASA Image and Video Library

    2010-06-09

    A group of Jet Propulsion Laboratory (JPL) engineers are recognized during the kick off of NASA's Summer of Innovation program at JPL in Pasadena, Calif., Thursday, June 10, 2010. Through the program, NASA will engage thousands of middle school students and teachers in stimulating math and science-based education programs with the goal of increasing the number of future scientists, mathematicians, and engineers. Photo Credit: (NASA/Bill Ingalls)

  9. AACR Honors Louis Staudt with Princess Takamatsu Memorial Lectureship | Center for Cancer Research

    Cancer.gov

    The American Association for Cancer Research has awarded Louis M. Staudt, Co-Chief of CCR’s Lymphoid Malignancies Branch, its Princess Takamatsu Memorial Lectureship. The lectureship recognizes scientists whose work has had or may have a far-reaching impact on the detection, diagnosis, treatment or prevention of cancer and who embody the dedication of the princess to multinational collaborations. Learn more...  

  10. Climate Communication from a Science Perspective

    NASA Astrophysics Data System (ADS)

    Somerville, R. C.

    2012-12-01

    Today, the world faces crucial choices in deciding what to do about climate change. Wise policy can be usefully informed by sound science. Scientists who are both climate experts and skilled communicators can provide valuable input into this policy process. They can help the public, media and policymakers learn what science has discovered about climate change. Scientists as a group are widely admired throughout the world. They can often use their prestige as well as their technical knowledge to advantage in publicizing and illuminating the findings of climate science. However, most scientists are unaware of the main obstacles to effective communication, such as the distrust that arises when the scientist and the audience do not have a shared worldview and shared cultural values. Many climate scientists also fail to realize that the jargon they use in their work is a significant barrier to communication, and that their messages require skilled translation into the everyday language that people understand. Scientists need to recognize that lecturing is almost always poor communication. Speaking in a television interview or a Congressional hearing is completely unlike teaching a class of graduate students. The people whom one is trying to reach are rarely hungry for pure scientific information. Instead, they want to know how climate change will affect them and what can be done about it. Communicating climate science resembles skiing or speaking a foreign language: it is a skill that can be learned, but beginners are well advised to take lessons from expert instructors. Becoming adept at climate communication requires study and practice. Effective professional training in climate communication is available for those scientists who have the time and the willingness to improve as communicators.

  11. CGAT: a model for immersive personalized training in computational genomics

    PubMed Central

    Sims, David; Ponting, Chris P.

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. PMID:25981124

  12. Research Projects, Technical Reports and Publications

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1996-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing and High Performance Networks. During this report period, Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of University of Toronto, Canada and Assistant Professor Andrew Sohn of New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996.
The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists and three open discussion sessions. There were approximately fifty participants. A proceedings volume is being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.

  13. Determination of Perceptions of the Teacher Candidates Studying in the Computer and Instructional Technology Department towards Human-Computer Interaction and Related Basic Concepts

    ERIC Educational Resources Information Center

    Kiyici, Mubin

    2011-01-01

    HCI is a field whose popularity has grown with the spread of computers and the internet; with contributions from scientists across disciplines, it is gradually producing more user-friendly software and hardware. Teacher candidates studying at the computer and instructional technologies department…

  14. Continuing professional development crediting system for specialists in laboratory medicine within 28 EFLM national societies.

    PubMed

    Topic, Elizabeta; Beletic, Andjelo; Zima, Tomas

    2013-01-01

    Continuing professional development (CPD) with a corresponding crediting system is recognized as essential for laboratory medicine specialists to provide optimal service for patients. This article presents results of a survey evaluating current CPD crediting practice among members of the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM). A questionnaire was forwarded to the presidents/national representatives of all EFLM members, with an invitation to provide information about CPD programmes and crediting policies, as well as feedback on individual CPD categories, through scoring their relevance. Complete or partial answers were received from 28 of 38 members. In 23 countries, CPD programmes exist and earn credits, with 19 of them offering access to non-medical scientists. CPD activities are evaluated in all participating countries, regardless of the existence of an official CPD programme. Among participating members with mandatory specialists' licensing (22/28), CPD is a prerequisite for relicensing in 13 countries. The main categories recognized as CPD are continuing education (24 countries), article/book authorship (17/14 countries) and distance learning (14 countries). The highest median score of relevance (20) is allocated to professional training, editor/authorship and official activities in professional organizations, with the first category showing the least variation among scores. The majority of EFLM members have developed CPD programmes, regularly evaluated and accompanied by crediting systems. Programmes differ in accessibility for non-medical scientists and impact on relicensing eligibility. Continuing education, authorship and e-learning are mainly recognized as CPD activities, although professional training is appreciated as the most important individual CPD category.

  15. Continuing professional development crediting system for specialists in laboratory medicine within 28 EFLM national societies

    PubMed Central

    Topic, Elizabeta; Beletic, Andjelo; Zima, Tomas

    2013-01-01

    Introduction: Continuing professional development (CPD) with a corresponding crediting system is recognized as essential for laboratory medicine specialists to provide optimal service for patients. This article presents results of a survey evaluating current CPD crediting practice among members of the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM). Materials and methods: A questionnaire was forwarded to the presidents/national representatives of all EFLM members, with an invitation to provide information about CPD programmes and crediting policies, as well as feedback on individual CPD categories, through scoring their relevance. Results: Complete or partial answers were received from 28 of 38 members. In 23 countries, CPD programmes exist and earn credits, with 19 of them offering access to non-medical scientists. CPD activities are evaluated in all participating countries, regardless of the existence of an official CPD programme. Among participating members with mandatory specialists' licensing (22/28), CPD is a prerequisite for relicensing in 13 countries. The main categories recognized as CPD are continuing education (24 countries), article/book authorship (17/14 countries) and distance learning (14 countries). The highest median score of relevance (20) is allocated to professional training, editor/authorship and official activities in professional organizations, with the first category showing the least variation among scores. Conclusions: The majority of EFLM members have developed CPD programmes, regularly evaluated and accompanied by crediting systems. Programmes differ in accessibility for non-medical scientists and impact on relicensing eligibility. Continuing education, authorship and e-learning are mainly recognized as CPD activities, although professional training is appreciated as the most important individual CPD category. PMID:24266304

  16. Students as Virtual Scientists: An Exploration of Students' and Teachers' Perceived Realness of a Remote Electron Microscopy Investigation

    ERIC Educational Resources Information Center

    Childers, Gina; Jones, M. Gail

    2015-01-01

    Remote access technologies enable students to investigate science by utilizing scientific tools and communicating in real-time with scientists and researchers with only a computer and an Internet connection. Very little is known about student perceptions of how real remote investigations are and how immersed the students are in the experience.…

  17. Resident research associateships, postdoctoral research awards 1989: opportunities for research at the U.S. Geological Survey, U.S. Department of the Interior

    USGS Publications Warehouse


    1989-01-01

    The scientists of the U.S. Geological Survey are engaged in a wide range of geologic, geophysical, geochemical, hydrologic, and cartographic programs, including the application of computer science to them. These programs offer exciting possibilities for scientific achievement and professional growth to young scientists through participation as Research Associates.

  18. Biography Today: Profiles of People of Interest to Young Readers. Scientists & Inventors Series, Volume 5.

    ERIC Educational Resources Information Center

    Abbey, Cherie D., Ed.

    This book, a special volume focusing on computer-related scientists and inventors, provides 12 biographical profiles of interest to readers ages 9 and above. The Biography Today series was created to appeal to young readers in a format they can enjoy reading and readily understand. Each entry provides at least one picture of the individual…

  19. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  20. Integrating citizen-science data with movement models to estimate the size of a migratory golden eagle population

    Treesearch

    Andrew J. Dennhardt; Adam E. Duerr; David Brandes; Todd E. Katzner

    2015-01-01

    Estimating population size is fundamental to conservation and management. Population size is typically estimated using survey data, computer models, or both. Some of the most extensive and often least expensive survey data are those collected by citizen-scientists. A challenge to citizen-scientists is that the vagility of many organisms can complicate data collection....

  1. Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H. (Editor)

    2000-01-01

    The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.

  2. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE’s application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware’s contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  3. A data mining technique for discovering distinct patterns of hand signs: implications in user training and computer interface design.

    PubMed

    Ye, Nong; Li, Xiangyang; Farley, Toni

    2003-01-15

    Hand signs are considered one of the important ways to enter information into computers for certain tasks. Computers receive sensor data of hand signs for recognition. When using hand signs as computer inputs, we need to (1) train computer users in the sign language so that their hand signs can be easily recognized by computers, and (2) design the computer interface to avoid the use of confusing signs, improving user input performance and user satisfaction. For user training and computer interface design, it is important to know which signs can be easily recognized by computers and which signs computers cannot distinguish. This paper presents a data mining technique to discover distinct patterns of hand signs from sensor data. Based on these patterns, we derive a group of signs that are indistinguishable by computers. Such information can in turn assist in user training and computer interface design.

  4. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or asteroid. Scientists seek to understand the origin and evolution of our solar system by studying the moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in the summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe, for the electronic imaging and computer science community, applications of various image processing and computational algorithms for performing satellite searches. Furthermore, we hope that this communication will enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared for performing the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.

  5. A short course on measure and probability theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in them, showed the need for a better understanding of the theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts that were discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
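    The polynomial chaos theory these notes build toward can be illustrated with a minimal sketch (an illustration, not code from the lecture notes): a function of a standard normal variable is projected onto the probabilists' Hermite polynomials He_n, with expectations computed by Gauss-Hermite quadrature.

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    # Gauss-HermiteE quadrature: nodes/weights for the weight exp(-x^2/2),
    # so E[g(X)] for X ~ N(0,1) is sum(w * g(nodes)) / sqrt(2*pi).
    nodes, weights = He.hermegauss(10)
    norm = np.sqrt(2.0 * np.pi)

    def pc_coeffs(f, order):
        """Polynomial chaos coefficients c_n = E[f(X) He_n(X)] / n!,
        using that E[He_n(X)^2] = n! for the probabilists' Hermite basis."""
        coeffs = []
        for n in range(order + 1):
            basis = He.hermeval(nodes, [0] * n + [1])  # He_n at the nodes
            cn = np.sum(weights * f(nodes) * basis) / norm / math.factorial(n)
            coeffs.append(cn)
        return np.array(coeffs)

    # Expand f(x) = x^2.  Since He_2(x) = x^2 - 1, we expect
    # x^2 = He_0(x) + He_2(x), i.e. coefficients [1, 0, 1, 0, 0].
    c = pc_coeffs(lambda x: x**2, 4)
    ```

    The same projection applies to any square-integrable function of a Gaussian input, which is what makes the expansion useful for uncertainty quantification.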

  6. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    ERIC Educational Resources Information Center

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science, it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  7. Creating a Pipeline for African American Computing Science Faculty: An Innovative Faculty/Research Mentoring Program Model

    ERIC Educational Resources Information Center

    Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.

    2014-01-01

    African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…

  8. ICASE

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.

  9. Is there a glass ceiling for highly cited scientists at the top of research universities?

    PubMed

    Ioannidis, John P A

    2010-12-01

    University leaders aim to protect, shape, and promote the missions of their institutions. I evaluated whether top highly cited scientists are likely to occupy these positions. Of the current leaders of 96 U.S. high research activity universities, only 6 presidents or chancellors were found among the 4009 U.S. scientists listed in the ISIHighlyCited.com database. Of the current leaders of 77 UK universities, only 2 vice-chancellors were found among the 483 UK scientists listed in the same database. In a sample of 100 top-cited clinical medicine scientists and 100 top-cited biology and biochemistry scientists, only 1 and 1, respectively, had served at any time as president of a university. Among the leaders of 25 U.S. universities with the highest citation volumes, only 12 had doctoral degrees in life, natural, physical or computer sciences, and 5 of these 12 had a Hirsch citation index m < 1.0. The participation of highly cited scientists in the top leadership of universities is limited. This could have consequences for the research and overall mission of universities.

  10. New computer system simplifies programming of mathematical equations

    NASA Technical Reports Server (NTRS)

    Reinfelds, J.; Seitz, R. N.; Wood, L. H.

    1966-01-01

    Automatic Mathematical Translator /AMSTRAN/ permits scientists or engineers to enter mathematical equations in their natural mathematical format and to obtain an immediate graphical display of the solution. This automatic-programming, on-line, multiterminal computer system allows experienced programmers to solve nonroutine problems.

  11. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 14: An analysis of the technical communications practices reported by Israeli and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Barclay, Rebecca O.; Pinelli, Thomas E.; Elazar, David; Kennedy, John M.

    1991-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two pilot studies were conducted that investigated the technical communications practices of Israeli and U.S. aerospace engineers and scientists. Both studies had the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their views about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line databases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was mailed to randomly selected U.S. aerospace engineers and scientists who are working in cryogenics, adaptive walls, and magnetic suspension. A slightly modified version was sent to Israeli aerospace engineers and scientists working at Israel Aircraft Industries, LTD. Responses of the Israeli and U.S. aerospace engineers and scientists to selected questions are presented in this paper.

  12. Science& Technology Review June 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, D

    This month's issue has the following articles: (1) Livermore's Three-Pronged Strategy for High-Performance Computing, Commentary by Dona Crawford; (2) Riding the Waves of Supercomputing Technology--Livermore's Computation Directorate is exploiting multiple technologies to ensure high-performance, cost-effective computing; (3) Chromosome 19 and Lawrence Livermore Form a Long-Lasting Bond--Lawrence Livermore biomedical scientists have played an important role in the Human Genome Project through their long-term research on chromosome 19; (4) A New Way to Measure the Mass of Stars--For the first time, scientists have determined the mass of a star in isolation from other celestial bodies; and (5) Flexibly Fueled Storage Tank Brings Hydrogen-Powered Cars Closer to Reality--Livermore's cryogenic hydrogen fuel storage tank for passenger cars of the future can accommodate three forms of hydrogen fuel separately or in combination.

  13. Triangle Computer Science Distinguished Lecture Series

    DTIC Science & Technology

    2018-01-30

    …the great objects of scientific inquiry - the cell, the brain, the market - as well as in the models developed by scientists over the centuries for studying them. …in principle, secure system operation can be achieved. Massive-Scale Streaming Analytics: David Bader, Georgia Institute of Technology (telecast from

  14. Amplify scientific discovery with artificial intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gil, Yolanda; Greaves, Mark T.; Hendler, James

    Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.

  15. The Computer Simulation of Liquids by Molecular Dynamics.

    ERIC Educational Resources Information Center

    Smith, W.

    1987-01-01

    Proposes a mathematical computer model for the behavior of liquids using the classical dynamic principles of Sir Isaac Newton and the molecular dynamics method invented by other scientists. Concludes that other applications will be successful using supercomputers to go beyond simple Newtonian physics. (CW)
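    The molecular dynamics method summarized above integrates Newton's equations of motion numerically. A minimal sketch (a toy one-particle harmonic "bond," not a liquid simulation, and not the article's own code) using the velocity Verlet integrator that is standard in molecular dynamics:

    ```python
    import numpy as np

    def velocity_verlet(x, v, force, dt, steps, m=1.0):
        """Integrate m*x'' = F(x) with the velocity Verlet scheme:
        positions advance with the old force, velocities with the
        average of old and new forces."""
        f = force(x)
        traj = [x]
        for _ in range(steps):
            x = x + v * dt + 0.5 * (f / m) * dt**2
            f_new = force(x)
            v = v + 0.5 * (f + f_new) / m * dt
            f = f_new
            traj.append(x)
        return np.array(traj), v

    # Harmonic restoring force F = -k*x; total energy should be
    # (nearly) conserved, a key property of this integrator.
    k = 1.0
    traj, v_end = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x,
                                  dt=0.01, steps=1000)
    ```

    For a liquid, the same loop runs over many particles with pairwise forces (e.g., Lennard-Jones), which is where the supercomputers mentioned in the abstract come in.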

  16. Carbon Smackdown: Visualizing Clean Energy (LBNL Summer Lecture Series)

    ScienceCinema

    Meza, Juan [LBNL Computational Research Division]

    2017-12-09

    The final Carbon Smackdown match took place Aug. 9, 2010. Juan Meza of the Computational Research Division revealed how scientists use computer visualizations to accelerate climate research and discussed the development of next-generation clean energy technologies such as wind turbines and solar cells.

  17. Interfacing the Experimenter to the Computer: Languages for Psychologists

    ERIC Educational Resources Information Center

    Wood, Ronald W.; And Others

    1975-01-01

    An examination and comparison of the computer languages that behavioral scientists are most likely to use: SCAT, INTERACT, SKED, OS/8 Fortran IV, RT11/Fortran, RSX-11M, Data General's Real-Time Disk Operating System and its Fortran, and interpretive languages. (EH)

  18. Programming Digital Stories and How-to Animations

    ERIC Educational Resources Information Center

    Hansen, Alexandria Killian; Iveland, Ashley; Harlow, Danielle Boyd; Dwyer, Hilary; Franklin, Diana

    2015-01-01

    As science teachers continue preparing for implementation of the "Next Generation Science Standards," one recommendation is to use computer programming as a promising context to efficiently integrate science and engineering. In this article, an interdisciplinary team of educational researchers and computer scientists describe how to use…

  19. EASI: An electronic assistant for scientific investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schur, A.; Feller, D.; DeVaney, M.

    1991-09-01

    Although many automated tools support the productivity of professionals (engineers, managers, architects, secretaries, etc.), none specifically address the needs of the scientific researcher. The scientist's needs are complex and the primary activities are cognitive rather than physical. The individual scientist collects and manipulates large data sets, integrates, synthesizes, generates, and records information. The means to access and manipulate information are a critical determinant of the performance of the system as a whole. One hindrance in this process is the scientist's computer environment, which has changed little in the last two decades. Extensive time and effort is demanded from the scientist to learn to use the computer system. This paper describes how chemists' activities and interactions with information were abstracted into a common paradigm that meets the critical requirement of facilitating information access and retrieval. This paradigm was embodied in EASI, a working prototype that increased the productivity of the individual scientific researcher. 4 refs., 2 figs., 1 tab.

  20. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Mathematics (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and approaches used in the past, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development.
This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  1. Theoretical integration in motivational science: System justification as one of many "autonomous motivational structures".

    PubMed

    Kay, Aaron C; Jost, John T

    2014-04-01

    Recognizing that there is a multiplicity of motives - and that the accessibility and strength of each one varies chronically and temporarily - is essential if motivational scientists are to achieve genuine theoretical and empirical integration. We agree that system justification is a case of nonconscious goal pursuit and discuss implications of the fact that it conflicts with many other psychological goals.

  2. A resolution recognizing the importance of cancer research and the contributions of scientists, clinicians, and patient advocates across the United States who are dedicated to finding a cure for cancer, and designating May 2014 as "National Cancer Research Month".

    THOMAS, 113th Congress

    Sen. Feinstein, Dianne [D-CA]

    2014-05-14

    Senate - 05/21/2014 Resolution agreed to in Senate without amendment and with a preamble by Unanimous Consent.

  3. A resolution recognizing the importance of cancer research and the contributions made by scientists and clinicians across the United States who are dedicated to finding a cure for cancer, and designating May 2011, as "National Cancer Research Month".

    THOMAS, 112th Congress

    Sen. Feinstein, Dianne [D-CA]

    2011-05-05

    Senate - 05/26/2011 Resolution agreed to in Senate without amendment and with a preamble by Unanimous Consent.

  4. An Introspective Critique of Past, Present, and Future USGS Decision Support

    NASA Astrophysics Data System (ADS)

    Neff, B. P.; Pavlick, M.

    2017-12-01

    In response to increasing scrutiny of publicly funded science, the Water Mission Area of USGS is shifting its approach for informing decisions that affect the country. Historically, USGS has focused on providing sound science on cutting edge, societally relevant issues with the expectation that decision makers will take action on this information. In practice, scientists often do not understand or focus on the needs of decision makers and decision makers often cannot or do not utilize information produced by scientists. The Water Mission Area of USGS has recognized that it can better serve the taxpayer by delivering information more relevant to decision making in a form more conducive to its use. To this end, the Water Mission Area of USGS is seeking greater integration with the decision making process to better inform what information it produces. In addition, recognizing that the transfer of scientific knowledge to decision making is fundamentally a social process, USGS is embracing the use of social science to better inform how it delivers scientific information and facilitates its use. This study utilizes qualitative methods to document the evolution of decision support at USGS and provide a rationale for a shift in direction. Challenges to implementation are identified and collaborative opportunities to improve decision making are discussed.

  5. Research experiences and mentoring practices in selected east Asian graduate programs: predictors of research productivity among doctoral students in molecular biology.

    PubMed

    Ynalvez, Ruby; Garza-Gongora, Claudia; Ynalvez, Marcus Antonius; Hara, Noriko

    2014-01-01

    Although doctoral mentors recognize the benefits of providing quality advisement and close guidance, the benefits of sharing project management responsibilities with mentees are still not well recognized. We observed that mentees who have the opportunity to co-manage projects generate more written output. Here we examine the link between research productivity, doctoral mentoring practices (DMP), and doctoral research experiences (DRE) of mentees in programs in the non-West. Inspired by previous findings that early career productivity is a strong predictor of later productivity, we examine the research productivity of 210 molecular biology doctoral students in selected programs in Japan, Singapore, and Taiwan. Using principal component (PC) analysis, we derive two sets of PCs: one set from 15 DMP and another set from 16 DRE items. We model research productivity using Poisson and negative-binomial regression models with these sets as predictors. Our findings suggest a need to re-think extant practices and to allocate resources toward professional career development in training future scientists. We contend that doctoral science training must not only be an occasion for future scientists to learn scientific and technical skills, but it must also be the opportunity to experience, to acquire, and to hone research management skills. © 2014 The International Union of Biochemistry and Molecular Biology.
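    The modeling strategy described in this abstract, a Poisson regression of publication counts on principal-component scores, can be sketched with simulated data (the data, coefficients, and `poisson_irls` helper below are illustrative assumptions, not the authors' code or results):

    ```python
    import numpy as np

    def poisson_irls(X, y, iters=25):
        """Fit a Poisson regression with log link by Fisher scoring
        (iteratively reweighted least squares):
        beta <- beta + (X' diag(mu) X)^{-1} X'(y - mu), mu = exp(X beta)."""
        X = np.column_stack([np.ones(len(y)), X])  # prepend intercept
        beta = np.zeros(X.shape[1])
        for _ in range(iters):
            mu = np.exp(X @ beta)
            beta = beta + np.linalg.solve(X.T @ (mu[:, None] * X),
                                          X.T @ (y - mu))
        return beta

    # Simulated mentees: one PC score per student, publication counts
    # drawn as Poisson with log-rate 0.2 + 0.5 * pc (hypothetical values).
    rng = np.random.default_rng(0)
    pc = rng.normal(size=500)
    y = rng.poisson(np.exp(0.2 + 0.5 * pc))
    beta = poisson_irls(pc, y)  # should recover roughly [0.2, 0.5]
    ```

    A negative-binomial model, as the authors also use, adds an overdispersion parameter on top of this Poisson mean structure; the fitting loop is analogous.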

  6. Dr. Robert H. Goddard

    NASA Image and Video Library

    2010-01-04

    Dr. Robert Hutchings Goddard (1882-1945). Dr. Goddard has been recognized as the father of American rocketry and as one of the pioneers in the theoretical exploration of space. Robert Hutchings Goddard, born in Worcester, Massachusetts, on October 5, 1882, was a theoretical scientist as well as a practical engineer. His dream was the conquest of the upper atmosphere and ultimately space through the use of rocket propulsion. Dr. Goddard died in 1945, but was probably as responsible for the dawning of the Space Age as the Wrights were for the beginning of the Air Age. Yet his work attracted little serious attention during his lifetime. However, when the United States began to prepare for the conquest of space in the 1950's, American rocket scientists began to recognize the debt owed to the New England professor. They discovered that it was virtually impossible to construct a rocket or launch a satellite without acknowledging the work of Dr. Goddard. More than 200 patents, many of which were issued after his death, covered this great legacy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  8. Mathematical representations in science: a cognitive-historical case history.

    PubMed

    Tweney, Ryan D

    2009-10-01

    The important role of mathematical representations in scientific thinking has received little attention from cognitive scientists. This study argues that neglect of this issue is unwarranted, given existing cognitive theories and laws, together with promising results from the cognitive historical analysis of several important scientists. In particular, while the mathematical wizardry of James Clerk Maxwell differed dramatically from the experimental approaches favored by Michael Faraday, Maxwell himself recognized Faraday as "in reality a mathematician of a very high order," and his own work as in some respects a re-representation of Faraday's field theory in analytic terms. The implications of the similarities and differences between the two figures open new perspectives on the cognitive role of mathematics as a learned mode of representation in science. Copyright © 2009 Cognitive Science Society, Inc.

  9. Mitchell Receives 2013 Ronald Greeley Early Career Award in Planetary Science: Citation

    NASA Astrophysics Data System (ADS)

    McKinnon, William B.

    2014-07-01

    The Greeley Early Career Award is named for pioneering planetary scientist Ronald Greeley. Ron was involved in nearly every major planetary mission from the 1970s until his death and was extraordinarily active in service to the planetary science community. Ron's greatest legacies, however, are those he mentored through the decades, and it is young scientists whose work and promise we seek to recognize. This year's Greeley award winner is Jonathan L. Mitchell, an assistant professor at the University of California, Los Angeles (UCLA). Jonathan received his Ph.D. from the University of Chicago, and after a postdoc at the Institute for Advanced Studies in Princeton, he joined the UCLA faculty, where he holds a joint appointment in Earth and space sciences and in atmospheric sciences.

  10. A History of the Liberal Arts Computer Science Consortium and Its Model Curricula

    ERIC Educational Resources Information Center

    Bruce, Kim B.; Cupper, Robert D.; Scot Drysdale, Robert L.

    2010-01-01

    With the support of a grant from the Sloan Foundation, nine computer scientists from liberal arts colleges came together in October, 1984 to form the Liberal Arts Computer Science Consortium (LACS) and to create a model curriculum appropriate for liberal arts colleges. Over the years the membership has grown and changed, but the focus has remained…

  11. Computers in Education: Realizing the Potential. Report of a Research Conference, Pittsburgh, Pennsylvania, November 20-24, 1982.

    ERIC Educational Resources Information Center

    Lesgold, Alan M., Ed.; Reif, Frederick, Ed.

    The full proceedings are provided here of a conference of 40 teachers, educational researchers, and scientists from both the public and private sectors that centered on the future of computers in education and the research required to realize the computer's educational potential. A summary of the research issues considered and suggested means for…

  12. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  13. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron; Shank, James; Ernst, Michael

    Under this SciDAC-2 grant the project’s goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  14. Visually impaired researchers get their hands on quantum chemistry: application to a computational study on the isomerization of a sterol.

    PubMed

    Lounnas, Valère; Wedler, Henry B; Newman, Timothy; Schaftenaar, Gijs; Harrison, Jason G; Nepomuceno, Gabriella; Pemberton, Ryan; Tantillo, Dean J; Vriend, Gert

    2014-11-01

    In molecular sciences, articles tend to revolve around 2D representations of 3D molecules, and sighted scientists often resort to 3D virtual reality software to study these molecules in detail. Blind and visually impaired (BVI) molecular scientists have access to a series of audio devices that can help them read the text in articles and work with computers. Reading articles published in this journal, though, is nearly impossible for them because they need to generate mental 3D images of molecules, but the article-reading software cannot do that for them. We have previously designed AsteriX, a web server that fully automatically decomposes articles, detects 2D plots of low molecular weight molecules, removes meta data and annotations from these plots, and converts them into 3D atomic coordinates. AsteriX-BVI goes one step further and converts the 3D representation into a 3D printable, haptic-enhanced format that includes Braille annotations. These Braille-annotated physical 3D models allow BVI scientists to generate a complete mental model of the molecule. AsteriX-BVI uses Molden to convert the meta data of quantum chemistry experiments into BVI-friendly formats so that the entire line of scientific information that sighted people take for granted (from published articles, via printed results of computational chemistry experiments, to 3D models) is now available to BVI scientists too. The possibilities offered by AsteriX-BVI are illustrated by a project on the isomerization of a sterol, executed by the blind co-author of this article (HBW).

  15. Visually impaired researchers get their hands on quantum chemistry: application to a computational study on the isomerization of a sterol

    NASA Astrophysics Data System (ADS)

    Lounnas, Valère; Wedler, Henry B.; Newman, Timothy; Schaftenaar, Gijs; Harrison, Jason G.; Nepomuceno, Gabriella; Pemberton, Ryan; Tantillo, Dean J.; Vriend, Gert

    2014-11-01

    In molecular sciences, articles tend to revolve around 2D representations of 3D molecules, and sighted scientists often resort to 3D virtual reality software to study these molecules in detail. Blind and visually impaired (BVI) molecular scientists have access to a series of audio devices that can help them read the text in articles and work with computers. Reading articles published in this journal, though, is nearly impossible for them because they need to generate mental 3D images of molecules, but the article-reading software cannot do that for them. We have previously designed AsteriX, a web server that fully automatically decomposes articles, detects 2D plots of low molecular weight molecules, removes meta data and annotations from these plots, and converts them into 3D atomic coordinates. AsteriX-BVI goes one step further and converts the 3D representation into a 3D printable, haptic-enhanced format that includes Braille annotations. These Braille-annotated physical 3D models allow BVI scientists to generate a complete mental model of the molecule. AsteriX-BVI uses Molden to convert the meta data of quantum chemistry experiments into BVI friendly formats so that the entire line of scientific information that sighted people take for granted—from published articles, via printed results of computational chemistry experiments, to 3D models—is now available to BVI scientists too. The possibilities offered by AsteriX-BVI are illustrated by a project on the isomerization of a sterol, executed by the blind co-author of this article (HBW).

  16. CGAT: a model for immersive personalized training in computational genomics.

    PubMed

    Sims, David; Ponting, Chris P; Heger, Andreas

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. © The Author 2015. Published by Oxford University Press.

  17. The Man computer Interactive Data Access System: 25 Years of Interactive Processing.

    NASA Astrophysics Data System (ADS)

    Lazzara, Matthew A.; Benson, John M.; Fox, Robert J.; Laitsch, Denise J.; Rueden, Joseph P.; Santek, David A.; Wade, Delores M.; Whittaker, Thomas M.; Young, J. T.

    1999-02-01

    12 October 1998 marked the 25th anniversary of the Man computer Interactive Data Access System (McIDAS). On that date in 1973, McIDAS was first used operationally by scientists as a tool for data analysis. Over the last 25 years, McIDAS has undergone numerous architectural changes in an effort to keep pace with changing technology. In its early years, significant technological breakthroughs were required to achieve the functionality needed by atmospheric scientists. Today McIDAS is challenged by new Internet-based approaches to data access and data display. The history and impact of McIDAS, along with some of the lessons learned, are presented here.

  18. Applications of genetic programming in cancer research.

    PubMed

    Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M

    2009-02-01

    The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied for analysis of molecular data to classify cancer subtypes and characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact in cancer research and treatment in the near future.
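
    The selective pressure and recombination loop that evolutionary algorithms borrow from natural selection can be sketched in a few lines. The following is an illustrative toy, not the genetic-programming systems the article reviews: tournament selection, one-point crossover, and bit-flip mutation over bit strings, applied to the standard one-max benchmark (all names and parameter values are invented for this sketch).

```python
import random

def evolve(fitness, length=20, pop_size=50, generations=100, rate=0.05):
    """Toy evolutionary algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            # selective pressure: the fitter of two random individuals survives
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            # bit-flip mutation with per-bit probability `rate`
            child = [bit ^ (random.random() < rate) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# one-max problem: fitness counts 1-bits; the optimum is the all-ones string
random.seed(1)
best = evolve(fitness=sum)
```

    Real genetic programming evolves expression trees rather than bit strings, but the selection-crossover-mutation cycle is the same.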

  19. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  20. Credibility and advocacy in conservation science.

    PubMed

    Horton, Cristi C; Peterson, Tarla Rai; Banerjee, Paulami; Peterson, Markus J

    2016-02-01

    Conservation policy sits at the nexus of natural science and politics. On the one hand, conservation scientists strive to maintain scientific credibility by emphasizing that their research findings are the result of disinterested observations of reality. On the other hand, conservation scientists are committed to conservation even if they do not advocate a particular policy. The professional conservation literature offers guidance on negotiating the relationship between scientific objectivity and political advocacy without damaging conservation science's credibility. The value of this guidance, however, may be restricted by limited recognition of credibility's multidimensionality and emergent nature: it emerges through perceptions of expertise, goodwill, and trustworthiness. We used content analysis of the literature to determine how credibility is framed in conservation science as it relates to apparent contradictions between science and advocacy. Credibility typically was framed as a static entity lacking dimensionality. Authors identified expertise or trustworthiness as important, but rarely mentioned goodwill. They usually did not identify expertise, goodwill, or trustworthiness as dimensions of credibility or recognize interactions among these 3 dimensions of credibility. This oversimplification may limit the ability of conservation scientists to contribute to biodiversity conservation. Accounting for the emergent quality and multidimensionality of credibility should enable conservation scientists to advance biodiversity conservation more effectively. © 2015 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  1. University-NGO connections for earthquake and tsunami risk reduction: lessons learned in West Sumatra

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Dewi, P. R.

    2013-12-01

    Scientists have information that is critical to policy and public education, yet lack field staff of their own to put this into practice. NGOs have field staff as well as connections with policymakers and the community, yet lack a direct connection to the latest scientific research. Scientists face pressure to obtain grants and publish; NGOs face pressure to deliver programs to as many people as possible. Lacking institutional incentives that recognize efforts to bridge the science-practice gap, it is often out of personal convictions that scientists seek to share their results with NGOs, and NGO practitioners seek to deepen their own scientific knowledge. Such individual efforts are impactful; however, more can be achieved with institutional commitments to closer collaboration. Science communication is dialogue, not a one-way transfer of knowledge from science to practice. On the university side, listening to our NGO partners has inspired faculty, staff, and students, identified new areas of fundamental scientific research inspired by practical use, and helped prioritize and clarify the scientific information that is most useful for disaster-risk-reduction practice. On the NGO side, connections to scientists have informed the content of public education and policy advocacy programs and clarified technical information; this new understanding has been incorporated in advocacy and community engagement programs.

  2. Methods for Creating and Animating a Computer Model Depicting the Structure and Function of the Sarcoplasmic Reticulum Calcium ATPase Enzyme.

    ERIC Educational Resources Information Center

    Chen, Alice Y.; McKee, Nancy

    1999-01-01

    Describes the developmental process used to visualize the calcium ATPase enzyme of the sarcoplasmic reticulum which involves evaluating scientific information, consulting scientists, model making, storyboarding, and creating and editing in a computer medium. (Author/CCM)

  3. Inquiring Minds

    Science.gov Websites

    Fermilab website resources on high-performance computing, grid computing, networking, and mass storage help decipher the language of high-energy physics. Virtual Ask-a-Scientist: read transcripts from past online chat sessions. Fermi National Accelerator Laboratory.

  4. Computing Logarithms by Hand

    ERIC Educational Resources Information Center

    Reed, Cameron

    2016-01-01

    How can old-fashioned tables of logarithms be computed without technology? Today, of course, no practicing mathematician, scientist, or engineer would actually use logarithms to carry out a calculation, let alone worry about deriving them from scratch. But high school students may be curious about the process. This article develops a…
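
    One classical pencil-and-paper technique for this task is the radix (digit-by-digit) method: squaring a number doubles its logarithm, so each squaring reveals one binary digit of the mantissa. A minimal sketch under that assumption (the function name and parameters are illustrative, not taken from the article):

```python
def log10_by_hand(x, bits=30):
    """Approximate log10(x) by repeated squaring; no math library needed."""
    # characteristic: bring x into [1, 10), counting powers of ten
    characteristic = 0
    while x >= 10:
        x /= 10
        characteristic += 1
    while x < 1:
        x *= 10
        characteristic -= 1
    # mantissa: squaring doubles the log; crossing 10 emits a binary 1
    mantissa, bit_value = 0.0, 0.5
    for _ in range(bits):
        x *= x
        if x >= 10:
            x /= 10
            mantissa += bit_value
        bit_value /= 2
    return characteristic + mantissa

print(log10_by_hand(2.0))   # close to 0.30103
```

    Done by hand with one squaring per digit, this is exactly the kind of from-scratch table construction the article describes for curious students.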

  5. Participatory Design of Human-Centered Cyberinfrastructure (Invited)

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gates, A. Q.

    2010-12-01

    Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized -- but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental science and geoscience. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points: all issues that must be confronted by every interdisciplinary CI project.
We will introduce visual and semantic-based tools that can enable the collaborative research design process and illustrate their application in designing and developing useful end-to-end data solutions for scientists. Lastly, we will outline areas of future investigation within CyberShARE that we believe have the potential for high impact.

  6. A guide to understanding social science research for natural scientists.

    PubMed

    Moon, Katie; Blackman, Deborah

    2014-10-01

    Natural scientists are increasingly interested in social research because they recognize that conservation problems are commonly social problems. Interpreting social research, however, requires at least a basic understanding of the philosophical principles and theoretical assumptions of the discipline, which are embedded in the design of social research. Natural scientists who engage in social science but are unfamiliar with these principles and assumptions can misinterpret their results. We developed a guide to assist natural scientists in understanding the philosophical basis of social science to support the meaningful interpretation of social research outcomes. The 3 fundamental elements of research are ontology, what exists in the human world that researchers can acquire knowledge about; epistemology, how knowledge is created; and philosophical perspective, the philosophical orientation of the researcher that guides her or his action. Many elements of the guide also apply to the natural sciences. Natural scientists can use the guide to assist them in interpreting social science research to determine how the ontological position of the researcher can influence the nature of the research; how the epistemological position can be used to support the legitimacy of different types of knowledge; and how philosophical perspective can shape the researcher's choice of methods and affect interpretation, communication, and application of results. The use of this guide can also support and promote the effective integration of the natural and social sciences to generate more insightful and relevant conservation research outcomes. © 2014 Society for Conservation Biology.

  7. Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugent, Peter E.; Simonson, J. Michael

    2011-10-24

    This report is based on the Department of Energy (DOE) Workshop on “Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery” that was held at the Bethesda Marriott in Maryland on October 24-25, 2011. The workshop brought together leading researchers from the Basic Energy Sciences (BES) facilities and Advanced Scientific Computing Research (ASCR). The workshop was co-sponsored by these two Offices to identify opportunities and needs for data analysis, ownership, storage, mining, provenance and data transfer at light sources, neutron sources, microscopy centers and other facilities. Their charge was to identify current and anticipated issues in the acquisition, analysis, communication and storage of experimental data that could impact the progress of scientific discovery, ascertain what knowledge, methods and tools are needed to mitigate present and projected shortcomings and to create the foundation for information exchanges and collaboration between ASCR and BES supported researchers and facilities. The workshop was organized in the context of the impending data tsunami that will be produced by DOE’s BES facilities. Current facilities, like SLAC National Accelerator Laboratory’s Linac Coherent Light Source, can produce up to 18 terabytes (TB) per day, while upgraded detectors at Lawrence Berkeley National Laboratory’s Advanced Light Source will generate ~10TB per hour. The expectation is that these rates will increase by over an order of magnitude in the coming decade. The urgency to develop new strategies and methods in order to stay ahead of this deluge and extract the most science from these facilities was recognized by all. The four focus areas addressed in this workshop were: Workflow Management - Experiment to Science: Identifying and managing the data path from experiment to publication.
Theory and Algorithms: Recognizing the need for new tools for computation at scale, supporting large data sets and realistic theoretical models. Visualization and Analysis: Supporting near-real-time feedback for experiment optimization and new ways to extract and communicate critical information from large data sets. Data Processing and Management: Outlining needs in computational and communication approaches and infrastructure needed to handle unprecedented data volume and information content. It should be noted that almost all participants recognized that there were unlikely to be any turn-key solutions available due to the unique, diverse nature of the BES community, where research at adjacent beamlines at a given light source facility often span everything from biology to materials science to chemistry using scattering, imaging and/or spectroscopy. However, it was also noted that advances supported by other programs in data research, methodologies, and tool development could be implemented on reasonable time scales with modest effort. Adapting available standard file formats, robust workflows, and in-situ analysis tools for user facility needs could pay long-term dividends. Workshop participants assessed current requirements as well as future challenges and made the following recommendations in order to achieve the ultimate goal of enabling transformative science in current and future BES facilities: Theory and analysis components should be integrated seamlessly within experimental workflow. Develop new algorithms for data analysis based on common data formats and toolsets. Move analysis closer to experiment. Move the analysis closer to the experiment to enable real-time (in-situ) streaming capabilities, live visualization of the experiment and an increase of the overall experimental efficiency. Match data management access and capabilities with advancements in detectors and sources. 
    Remove bottlenecks, provide interoperability across different facilities/beamlines and apply forefront mathematical techniques to more efficiently extract science from the experiments. This workshop report examines and reviews the status of several BES facilities and highlights the successes and shortcomings of the current data and communication pathways for scientific discovery. It then ascertains what methods and tools are needed to mitigate present and projected data bottlenecks to science over the next 10 years. The goal of this report is to create the foundation for information exchanges and collaborations among ASCR and BES supported researchers, the BES scientific user facilities, and ASCR computing and networking facilities. To jumpstart these activities, there was a strong desire to see a joint effort between ASCR and BES along the lines of the highly successful Scientific Discovery through Advanced Computing (SciDAC) program in which integrated teams of engineers, scientists and computer scientists were engaged to tackle a complete end-to-end workflow solution at one or more beamlines, to ascertain what challenges will need to be addressed in order to handle future increases in data.

  8. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
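
    The core of such automatic counting and sizing can be illustrated with a toy routine, a hypothetical sketch rather than the 1976 system itself: threshold the grayscale image into a binary mask, then flood-fill connected bright regions so each particle is counted once and its pixel area recorded.

```python
def count_particles(image, threshold):
    """Return the pixel sizes of connected bright regions in a 2D image.

    `image` is a list of lists of grayscale values; pixels at or above
    `threshold` are treated as particle material (4-connectivity).
    """
    rows, cols = len(image), len(image[0])
    mask = [[pix >= threshold for pix in row] for row in image]
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # iterative (depth-first) flood fill of one particle
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

# tiny synthetic "micrograph": two bright particles on a dark background
micrograph = [
    [0, 9, 9, 0],
    [0, 9, 0, 0],
    [0, 0, 0, 8],
]
sizes = count_particles(micrograph, threshold=5)
```

    A size histogram built from the returned list is the distribution the article compares against manual counts.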

  9. Proteomics, lipidomics, metabolomics: a mass spectrometry tutorial from a computer scientist's point of view.

    PubMed

    Smith, Rob; Mathis, Andrew D; Ventura, Dan; Prince, John T

    2014-01-01

    For decades, mass spectrometry data has been analyzed to investigate a wide array of research interests, including disease diagnostics, biological and chemical theory, genomics, and drug development. Progress towards solving any of these disparate problems depends upon overcoming the common challenge of interpreting the large data sets generated. Despite interim successes, many data interpretation problems in mass spectrometry are still challenging. Further, though these challenges are inherently interdisciplinary in nature, the significant domain-specific knowledge gap between disciplines makes interdisciplinary contributions difficult. This paper provides an introduction to the burgeoning field of computational mass spectrometry. We illustrate key concepts, vocabulary, and open problems in MS-omics, as well as provide invaluable resources such as open data sets and key search terms and references. This paper will facilitate contributions from mathematicians, computer scientists, and statisticians to MS-omics that will fundamentally improve results over existing approaches and inform novel algorithmic solutions to open problems.

  10. Nature apps: Waiting for the revolution.

    PubMed

    Jepson, Paul; Ladle, Richard J

    2015-12-01

    Apps are small task-orientated programs with the potential to integrate the computational and sensing capacities of smartphones with the power of cloud computing, social networking, and crowdsourcing. They have the potential to transform how humans interact with nature, cause a step change in the quantity and resolution of biodiversity data, democratize access to environmental knowledge, and reinvigorate ways of enjoying nature. To assess the extent to which this potential is being exploited in relation to nature, we conducted an automated search of the Google Play Store using 96 nature-related terms. This returned data on ~36 304 apps, of which ~6301 were nature-themed. We found that few of these exploit the full range of capabilities inherent in the technology and/or have successfully captured the public imagination. Such breakthroughs will only be achieved by increasing the frequency and quality of collaboration between environmental scientists, information engineers, computer scientists, and interested publics.

  11. Computer Programming Languages and Expertise Needed by Practicing Engineers.

    ERIC Educational Resources Information Center

    Doelling, Irvin

    1980-01-01

    Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…

  12. PREFACE: First International Workshop and Summer School on Plasma Physics

    NASA Astrophysics Data System (ADS)

    Benova, Evgenia; Zhelyazkov, Ivan; Atanassov, Vladimir

    2006-07-01

    The First International Workshop and Summer School on Plasma Physics (IWSSPP'05), organized by the Faculty of Physics, University of Sofia and the Foundation `Theoretical and Computational Physics and Astrophysics', was dedicated to the World Year of Physics 2005 and held in Kiten, Bulgaria, on the Black Sea Coast, from 8-12 June 2005. The aim of the workshop was to bring together scientists from various branches of plasma physics in order to ensure an interdisciplinary exchange of views and initiate possible collaborations. Another important task was to stimulate the creation and support of a new generation of young scientists for the further development of plasma physics fundamentals and applications. This volume of Journal of Physics: Conference Series includes 31 papers (invited lectures, contributed talks and posters) devoted to various branches of plasma physics, among them fusion research, kinetics and transport phenomena in gas discharge plasmas, MHD waves and instabilities in the solar atmosphere, dc and microwave discharge modelling, plasma diagnostics, cross sections and rate constants of elementary processes, material processing, plasma chemistry and technology. Some of them have been presented by internationally known and recognized specialists in their fields; others are Masters or PhD students' first steps in science. In both cases, we believe they will stimulate readers' interest. We would like to thank the members of both the International Advisory Committee and the Local Organizing Committee. We greatly appreciate the financial support from the sponsors: the Department for Language Teaching and International Students at Sofia University, the Dr Ivan Bogorov publishing house, and the Artgraph2 publishing house. We would like to express our gratitude to the invited lecturers who were willing to pay the participation fee.
In this way, in addition to the intellectual support they provided by means of their excellent lectures, they also supported the school financially.

  13. A Service-based Approach to Connect Seismological Infrastructures: Current Efforts at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Ahern, Tim; Trabant, Chad

    2014-05-01

    As part of the COOPEUS initiative to build infrastructure that connects European and US research infrastructures, IRIS has advocated for the development of federated services based upon internationally recognized standards using web services. By deploying International Federation of Digital Seismograph Networks (FDSN)-endorsed web services at multiple data centers in the US and Europe, we have shown that integration within the seismological domain can be realized. Deploying identical methods to invoke the web services at multiple centers significantly eases the way a scientist can access seismic data (time series, metadata, and earthquake catalogs) from distributed federated centers. IRIS has developed a federator that helps a user identify where seismic data from global seismic networks can be accessed. The web-services-based federator builds the appropriate URLs and returns them to client software running on the scientist's own computer. These URLs are then used to pull data directly from the distributed centers in a peer-based fashion. IRIS is also involved in deploying web services across horizontal domains. As part of the US National Science Foundation's (NSF) EarthCube effort, an IRIS-led EarthCube Building Blocks project is underway. When completed, this project will aid in the discovery, access, and usability of data across multiple geoscience domains. This presentation will summarize current IRIS efforts in building vertical integration infrastructure within seismology, working closely with 5 centers in Europe and 2 centers in the US, as well as first steps toward horizontal integration of data from 14 different domains in the US, in Europe, and around the world.
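The URL-building step such a federator performs can be sketched as follows. The endpoint path follows the published FDSN web service convention (fdsnws-dataselect), but the host list and parameter values here are illustrative assumptions, not the federator's actual output.

```python
from urllib.parse import urlencode

# Hypothetical federated data-center hosts (assumptions for illustration).
CENTERS = ["http://service.iris.edu", "http://eida.example.org"]

def dataselect_urls(net, sta, loc, cha, start, end):
    """Build one FDSN-style dataselect query URL per federated center."""
    params = urlencode({
        "net": net, "sta": sta, "loc": loc, "cha": cha,
        "starttime": start, "endtime": end,
    })
    return [f"{host}/fdsnws/dataselect/1/query?{params}" for host in CENTERS]

urls = dataselect_urls("IU", "ANMO", "00", "BHZ",
                       "2014-01-01T00:00:00", "2014-01-02T00:00:00")
```

A client would then fetch each URL directly from the corresponding center, which is the peer-based access pattern described above.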

  14. The APECS Virtual Poster Session: a virtual platform for science communication and discussion

    NASA Astrophysics Data System (ADS)

    Renner, A.; Jochum, K.; Jullion, L.; Pavlov, A.; Liggett, D.; Fugmann, G.; Baeseman, J. L.; Apecs Virtual Poster Session Working Group, T.

    2011-12-01

    The Virtual Poster Session (VPS) of the Association of Polar Early Career Scientists (APECS) was developed by early career scientists as an online tool for communicating and discussing science and research beyond the four walls of a conference venue. Poster sessions are often the backbone of a conference, where early career scientists in particular get a chance to communicate their research and discuss ideas, data, and scientific problems with their peers and senior scientists. There, they can hone their 'elevator pitch', discussion skills, and presentation skills. APECS has taken the poster session one step further and created the VPS: the same idea, but independent of conferences, travel, and location. All that is needed is a computer with internet access. Instead of letting their posters collect dust on the computer's hard drive, scientists can now upload them to the APECS website. There, others have the continuous opportunity to comment, give feedback, and discuss the work. Currently, about 200 posters contributed by authors and co-authors from 34 countries are accessible. Since January 2010, researchers have been able to discuss their posters with a broad international audience, including fellow researchers, community members, potential colleagues and collaborators, policy makers, and educators, during monthly conference calls via an internet platform. Recordings of the calls are available online afterwards. Calls so far have included topical sessions on, e.g., marine biology, glaciology, or the social sciences, and interdisciplinary calls on Arctic sciences or polar research activities in a specific country, e.g., India or Romania. They have attracted audiences of scientists at all career stages and from all continents, with on average about 15 people participating per call.
Online tools like the VPS open up new ways for creating collaborations and new research ideas and sharing different methodologies for future projects, pushing aside the boundaries of countries and nations, conferences, offices, and disciplines, and provide early career scientists with easily accessible training opportunities for their communication and outreach skills, independent of their location and funding situation.

  15. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    NASA Astrophysics Data System (ADS)

    Lisetti, Christine Lætitia; Nasoz, Fatma

    2004-12-01

    We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users' emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
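The supervised-learning step can be illustrated with a deliberately tiny nearest-neighbor sketch. The feature vectors (galvanic skin response, heart rate, temperature) and labels below are invented for illustration, and a real system would normalize features so that no single signal dominates the distance metric.

```python
import math

# Invented training samples (not the study's data):
# (galvanic skin response, heart rate, temperature) -> emotion label
TRAIN = [
    ((0.80, 95, 36.2), "fear"),
    ((0.30, 70, 36.8), "sadness"),
    ((0.90, 110, 36.5), "anger"),
    ((0.50, 85, 36.9), "amusement"),
]

def classify(sample):
    """1-nearest-neighbor: label a new signal vector by its closest training sample."""
    return min(TRAIN, key=lambda t: math.dist(t[0], sample))[1]

label = classify((0.85, 100, 36.4))  # closest to the "fear" training sample
```

The paper's three supervised algorithms are more sophisticated than 1-NN, but they share this structure: learn a mapping from labeled physiological feature vectors, then generalize it to new collections of signals.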

  16. The dynamics of Brazilian protozoology over the past century.

    PubMed

    Elias, M Carolina; Floeter-Winter, Lucile M; Mena-Chalco, Jesus P

    2016-01-01

    Brazilian scientists have been contributing to the protozoology field for more than 100 years with important discoveries of new species such as Trypanosoma cruzi and Leishmania spp. In this work, we used a Brazilian thesis database (Coordination for the Improvement of Higher Education Personnel) covering the period from 1987-2011 to identify researchers who contributed substantially to protozoology. We selected 248 advisors by filtering to obtain researchers who supervised at least 10 theses. Based on a computational analysis of the thesis databases, we found students who were supervised by these scientists. A computational procedure was developed to determine the advisors' scientific ancestors using the Lattes Platform. These analyses provided a list of 1,997 researchers who were inspected through Lattes CV examination and allowed the identification of the pioneers of Brazilian protozoology. Moreover, we investigated the areas in which researchers who earned PhDs in protozoology are now working. We found that 68.4% of them are still in protozoology, while 16.7% have migrated to other fields. We observed that support for protozoology by national or international agencies is clearly correlated with the increase of scientists in the field. Finally, we described the academic genealogy of Brazilian protozoology by formalising the "forest" of Brazilian scientists involved in the study of protozoa and their vectors over the past century.
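The genealogy construction the authors describe amounts to building a forest from advisor-student edges: researchers with no recorded advisor are the roots ("pioneers"), and each root's academic descendants are found by traversal. A minimal sketch with hypothetical names, not the study's data:

```python
from collections import defaultdict

# Toy advisor -> student edges, as might be extracted from a thesis database.
EDGES = [("A", "B"), ("A", "C"), ("B", "D"), ("E", "F")]

def genealogy_forest(edges):
    """Map each root (person with no recorded advisor) to all descendants."""
    children = defaultdict(list)
    has_advisor = set()
    people = set()
    for adv, stu in edges:
        children[adv].append(stu)
        has_advisor.add(stu)
        people.update((adv, stu))
    roots = sorted(people - has_advisor)

    def descendants(p):
        out = []
        for c in children[p]:
            out.append(c)
            out.extend(descendants(c))
        return out

    return {r: descendants(r) for r in roots}

forest = genealogy_forest(EDGES)  # two trees rooted at "A" and "E"
```

With real data the edge list would come from the thesis database and Lattes CVs, and cycles or duplicate records would need cleaning before traversal.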

  17. The dynamics of Brazilian protozoology over the past century

    PubMed Central

    Elias, M Carolina; Floeter-Winter, Lucile M; Mena-Chalco, Jesus P

    2016-01-01

    Brazilian scientists have been contributing to the protozoology field for more than 100 years with important discoveries of new species such as Trypanosoma cruzi and Leishmania spp. In this work, we used a Brazilian thesis database (Coordination for the Improvement of Higher Education Personnel) covering the period from 1987-2011 to identify researchers who contributed substantially to protozoology. We selected 248 advisors by filtering to obtain researchers who supervised at least 10 theses. Based on a computational analysis of the thesis databases, we found students who were supervised by these scientists. A computational procedure was developed to determine the advisors’ scientific ancestors using the Lattes Platform. These analyses provided a list of 1,997 researchers who were inspected through Lattes CV examination and allowed the identification of the pioneers of Brazilian protozoology. Moreover, we investigated the areas in which researchers who earned PhDs in protozoology are now working. We found that 68.4% of them are still in protozoology, while 16.7% have migrated to other fields. We observed that support for protozoology by national or international agencies is clearly correlated with the increase of scientists in the field. Finally, we described the academic genealogy of Brazilian protozoology by formalising the “forest” of Brazilian scientists involved in the study of protozoa and their vectors over the past century. PMID:26814646

  18. Climate Feedback: Bringing the Scientific Community to Provide Direct Feedback on the Credibility of Climate Media Coverage

    NASA Astrophysics Data System (ADS)

    Vincent, E. M.; Matlock, T.; Westerling, A. L.

    2015-12-01

    While most scientists recognize climate change as a major societal and environmental issue, social and political will to tackle the problem is still lacking. One of the biggest obstacles is inaccurate reporting, or even outright misinformation, in climate change coverage, which confuses the general public on the issue. In today's era of instant access to information, what we read online usually falls outside our field of expertise, and it is a real challenge to evaluate what is credible. The emerging technology of web annotation could be a game changer, as it allows knowledgeable individuals to attach notes to any piece of text on a webpage and to share them with readers, who see the annotations in context, like comments on a PDF. Here we present the Climate Feedback initiative, which is bringing together a community of climate scientists who collectively evaluate the scientific accuracy of influential climate change media coverage. Scientists annotate articles sentence by sentence and assess whether they are consistent with scientific knowledge, allowing readers to see where and why the coverage is, or is not, based on science. Scientists also summarize the essence of their critical commentary in the form of a simple article-level overall credibility rating that quickly informs readers about the credibility of the entire piece. Web annotation allows readers to 'hear' directly from the experts and to sense the consensus in a personal way, as one can literally see how many scientists agree with a given statement.
It also allows a broad population of scientists, notably early career scientists, to interact with the media. In this talk, we will present results on the impact annotations have on readers, regarding their evaluation of the trustworthiness of the information they read, and on journalists, regarding their reception of scientists' comments. Several dozen scientists have contributed to this effort to date, and the system offers potential to scale up because it relies on a crowdsourced process in which each scientist makes only small contributions that are aggregated together. The project aims to build a network of scientists with varied expertise and to organize their efforts at a global scale to efficiently peer-review major news coverage on climate.
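The article-level rating described above can be sketched as a simple aggregation of per-annotation scores. The -2 to +2 scale and the scores below are assumptions for illustration, not the initiative's actual scheme.

```python
# Hypothetical per-annotation accuracy scores from individual scientists,
# on an assumed -2 (very inaccurate) to +2 (very accurate) scale.
def overall_credibility(scores):
    """Aggregate per-statement scores into one article-level rating (the mean)."""
    return sum(scores) / len(scores)

rating = overall_credibility([2, 1, -1, 2, 1])  # five reviewers' scores -> 1.0
```

Because each scientist contributes only a few small scored annotations, aggregation like this is what lets the crowdsourced process scale.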

  19. Human Exploration Ethnography of the Haughton-Mars Project, 1998-1999

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Swanson, Keith (Technical Monitor)

    1999-01-01

    During the past two field seasons, July 1998 and 1999, we have conducted research about the field practices of scientists and engineers at Haughton Crater on Devon Island in the Canadian Arctic, with the objective of determining how people will live and work on Mars. This broad investigation of field life and work practice, part of the Haughton-Mars Project led by Pascal Lee, spans social and cognitive anthropology, psychology, and computer science. Our approach involves systematic observation and description of activities, places, and concepts, constituting an ethnography of field science at Haughton. Our focus is on human behaviors: what people do, where, when, with whom, and why. By locating behavior in time and place, in contrast with a purely functional or "task oriented" description of work, we find patterns constituting the choreography of interaction between people, their habitat, and their tools. As such, we view the exploration process in terms of a total system comprising a social organization, facilities, terrain/climate, personal identities, artifacts, and computer tools. Because we are computer scientists seeking to develop new kinds of tools for living and working on Mars, we focus on the existing representational tools (such as documents and measuring devices), learning and improvisation (such as use of the internet or informal assistance), and prototype computational systems brought to the field. Our research is based on partnership, by which field scientists and engineers actively contribute to our findings, just as we participate in their work and life.

  20. 26 CFR 1.1031(j)-1 - Exchanges of multiple properties.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... ($4000) exceeds the fair market value of automobile B ($2950) by that amount. (iii) K recognizes gain on...), or $1050. (iv) The total amount of gain recognized by K in the exchange is the sum of the gains... requires a property-by-property comparison for computing the gain recognized and basis of property received...

  1. 26 CFR 1.1031(j)-1 - Exchanges of multiple properties.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... ($4000) exceeds the fair market value of automobile B ($2950) by that amount. (iii) K recognizes gain on...), or $1050. (iv) The total amount of gain recognized by K in the exchange is the sum of the gains... requires a property-by-property comparison for computing the gain recognized and basis of property received...

  2. 26 CFR 1.1031(j)-1 - Exchanges of multiple properties.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... ($4000) exceeds the fair market value of automobile B ($2950) by that amount. (iii) K recognizes gain on...), or $1050. (iv) The total amount of gain recognized by K in the exchange is the sum of the gains... requires a property-by-property comparison for computing the gain recognized and basis of property received...

  3. 26 CFR 1.1031(j)-1 - Exchanges of multiple properties.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ($4000) exceeds the fair market value of automobile B ($2950) by that amount. (iii) K recognizes gain on...), or $1050. (iv) The total amount of gain recognized by K in the exchange is the sum of the gains... property-by-property comparison for computing the gain recognized and basis of property received in a like...

  4. 26 CFR 1.1031(j)-1 - Exchanges of multiple properties.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... ($4000) exceeds the fair market value of automobile B ($2950) by that amount. (iii) K recognizes gain on...), or $1050. (iv) The total amount of gain recognized by K in the exchange is the sum of the gains... requires a property-by-property comparison for computing the gain recognized and basis of property received...
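The excerpts above apply, exchange group by exchange group, the general like-kind-exchange principle that the gain recognized is limited to the lesser of the gain realized and the non-like-kind consideration ("boot") received. A hedged sketch of that lesser-of rule, with illustrative numbers rather than the regulation's own example:

```python
# Hedged sketch of the general lesser-of rule the regulation applies per
# exchange group; the dollar amounts below are illustrative assumptions.
def recognized_gain(realized_gain, boot_received):
    """Recognized gain: the lesser of realized gain and boot, never below zero."""
    return min(max(realized_gain, 0), max(boot_received, 0))

gain = recognized_gain(2000, 1050)  # realized 2000, boot 1050 -> recognize 1050
```

The regulation's property-by-property comparison then sums these per-group amounts to get the total gain recognized in the exchange.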

  5. Dynamic Collaboration Infrastructure for Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.

    2016-12-01

    Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that offers water scientists the ability to access modeling and data infrastructure in support of data-intensive modeling and analysis. It supports the sharing of and collaboration around "resources", which are social objects defined to include both data and models in a structured, standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud-based computation for the execution of hydrologic models and the analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure accessible from environments like HydroShare are increasing. Storage can range from one's local PC to campus or organizational storage to storage in the cloud. Computing can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast array of data and computing resources without having to learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services needed to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may need to process a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges, a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure enabling the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure."
In this presentation we discuss the results of this proof-of-concept prototype which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.

  6. Celebrating Professor Britton Chance (1913-2010), a founding father of redox sciences.

    PubMed

    Ohnishi, Tomoko; Zweier, Jay L

    2011-12-01

    The renowned scientist and redox pioneer Dr. Britton Chance closed his 97 years of legendary life on November 16, 2010. He was the Eldridge Reeves Johnson emeritus professor of biophysics, physical chemistry, and radiologic physics at the University of Pennsylvania. He achieved fame as a prominent biophysicist and developer of highly innovative biomedical instrumentation. His scientific career stretched over almost a century and included many scientific and engineering breakthroughs. The advances that he and his colleagues achieved led to great strides in our understanding of biology and disease. He was among the first scientists to recognize the importance of free radicals and reactive oxygen species in mitochondrial metabolism and cells, and to map pathways of redox biology and signaling. Dr. Chance served as a pioneer and inspiration to generations of researchers in the fields of redox biochemistry, metabolism, and disease. He will be missed by all of us in the research community but will live on through his monumental scientific accomplishments, the novel instrumentation he developed, and the many scientists whom he trained and influenced.

  7. Martha Wollstein: A pioneer American female clinician-scientist.

    PubMed

    Abrams, Jeanne; Wright, James R

    2018-01-01

    Martha Wollstein was not only the first fully specialized pediatric perinatal pathologist practicing exclusively in a North American children's hospital; she also blazed another path as a very early pioneer female clinician-scientist. Wollstein provided patient care at Babies Hospital of New York City from 1891 until her retirement in 1935, while simultaneously working for many years as a basic scientist at the prestigious Rockefeller Institute for Medical Research. During her career, Wollstein published over 65 papers, many frequently cited, on a wide range of topics including pediatric and infectious diseases. Wollstein was a rare female presence in the field of pathology in an era when relatively few women became doctors in any medical specialty. She was born into an affluent Jewish American family in New York City in 1868 and graduated from the Women's Medical College in 1889. This paper explores her family support and her ethnic and religious background, which helped facilitate her professional success. In her time, she was recognized internationally for her research and respected for her medical and scientific skills; unfortunately, her important career has today been largely forgotten.

  8. Global Cooperation in the Science of Sun-Earth Connection

    NASA Technical Reports Server (NTRS)

    Gopalswamy, Natchimuthuk; Davila, Joseph

    2011-01-01

    The international space science community had recognized the importance of space weather more than a decade ago, which resulted in a number of international collaborative activities such as the International Space Weather Initiative (ISWI), the Climate and Weather of the Sun Earth System (CAWSES) by SCOSTEP and the International Living with a Star (ILWS) program. These programs have brought scientists together to tackle the scientific issues related to short and long term variability of the Sun and the consequences in the heliosphere. The ISWI program is a continuation of the successful International Heliophysical Year (IHY) 2007 program in focusing on science, observatory deployment, and outreach. The IHY/ISWI observatory deployment has not only filled voids in data coverage, but also inducted young scientists from developing countries into the scientific community. The ISWI schools and UN workshops are the primary venues for interaction and information exchange among scientists from developing and developed countries that lead to collaborative efforts in space weather. This paper presents a summary of ISWI activities that promote space weather science via complementary approaches in international scientific collaborations, capacity building, and public outreach.

  9. On ``Carrington, Schwabe, and the Gold Medal''

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-06-01

    I note with interest the article by Cliver [2005] about the early solar investigations of Heinrich Schwabe and Richard Carrington and offer some further insights into Schwabe's work and its reception at the time. Schwabe commenced his observations in 1826 with a small telescope he had bought some years earlier. For more than 40 years, he observed the Sun and made meteorological notes. In his 1843 essay, he noted a sunspot cycle of about 10 years, but his result aroused little interest among contemporary astronomers. Research at the time was focused on the physics of the planets, the Moon, and other topics. Schwabe had published data in the well-known Astronomische Nachrichten, but not until Alexander von Humboldt republished it in his Kosmos, volume 3 (1851), did the data begin to be recognized and accepted by Schwabe's fellow scientists. Humboldt's Kosmos was a publication of considerable prestige, and it had a wide circulation among scientists and the educated public. Schwabe's work became familiar to other scientists including Carrington, Angelo Secchi, and Gustav Spörer and, as noted by Cliver, earned him the gold medal of the Royal Astronomical Society.

  10. The case for policy-relevant conservation science.

    PubMed

    Rose, David C

    2015-06-01

    Drawing on the "evidence-based" (Sutherland et al. 2013) versus "evidence-informed" debate (Adams & Sandbrook 2013), which has become prominent in conservation science, I argue that science can be influential if it holds a dual reference (Lentsch & Weingart 2011) that contributes to the needs of policy makers whilst maintaining technical rigor. In line with such a strategy, conservation scientists are increasingly recognizing the usefulness of constructing narratives through which to enhance the influence of their evidence (Leslie et al. 2013; Lawton & Rudd 2014). Yet telling stories alone is rarely enough to influence policy; instead, these narratives must be policy relevant. To ensure that evidence is persuasive alongside other factors in a complex policy-making process, conservation scientists could follow 2 steps: reframe within salient political contexts and engage more productively in boundary work, which is defined as the ways in which scientists "construct, negotiate, and defend the boundary between science and policy" (Owens et al. 2006:640). These will both improve the chances of evidence-informed conservation policy. © 2015 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of Society for Conservation Biology.

  11. Integrating High-Throughput Parallel Processing Framework and Storage Area Network Concepts Into a Prototype Interactive Scientific Visualization Environment for Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Smuga-Otto, M. J.; Garcia, R. K.; Knuteson, R. O.; Martin, G. D.; Flynn, B. M.; Hackel, D.

    2006-12-01

    The University of Wisconsin-Madison Space Science and Engineering Center (UW-SSEC) is developing tools to help scientists realize the potential of high spectral resolution instruments for atmospheric science. Upcoming satellite spectrometers like the Cross-track Infrared Sounder (CrIS), experimental instruments like the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and proposed instruments like the Hyperspectral Environmental Suite (HES) within the GOES-R project will present a challenge in the form of the overwhelmingly large amounts of continuously generated data. Current and near-future workstations will have neither the storage space nor computational capacity to cope with raw spectral data spanning more than a few minutes of observations from these instruments. Schemes exist for processing raw data from hyperspectral instruments currently in testing, that involve distributed computation across clusters. Data, which for an instrument like GIFTS can amount to over 1.5 Terabytes per day, is carefully managed on Storage Area Networks (SANs), with attention paid to proper maintenance of associated metadata. The UW-SSEC is preparing a demonstration integrating these back-end capabilities as part of a larger visualization framework, to assist scientists in developing new products from high spectral data, sourcing data volumes they could not otherwise manage. This demonstration focuses on managing storage so that only the data specifically needed for the desired product are pulled from the SAN, and on running computationally expensive intermediate processing on a back-end cluster, with the final product being sent to a visualization system on the scientist's workstation. Where possible, existing software and solutions are used to reduce cost of development. 
The heart of the computing component is the GIFTS Information Processing System (GIPS), developed at the UW-SSEC to allow distribution of processing tasks such as conversion of raw GIFTS interferograms into calibrated radiance spectra, and retrieval of temperature and water vapor content atmospheric profiles from these spectra. The hope is that by demonstrating the capabilities afforded by a composite system like the one described here, scientists can be convinced to contribute further algorithms in support of this model of computing and visualization.

  12. The effect of freezing on reactions with environmental impact.

    PubMed

    O'Concubhair, Ruairí; Sodeau, John R

    2013-11-19

    The knowledge that the freezing process can accelerate certain chemical reactions has been available since the 1960s, particularly in relation to the food industry. However, investigations into such effects on environmentally relevant reactions have only been carried out since the late 1980s. Some 20 years later, the field has matured and scientists have conducted research into various important processes such as the oxidation of nitrite ions to nitrates, sulfites to sulfates, and elemental mercury to inorganic mercury. Field observations mainly carried out in the polar regions have driven this work. For example, researchers have found that both ozone and mercury are removed from the troposphere completely (and almost instantaneously) at the time of Arctic polar sunrise. The monitoring activities suggested that both the phenomena were caused by involvement of bromine (and possibly iodine) chemistry. Scientists investigating the production of interhalide products (bromine and iodine producing interhalides) in frozen aqueous solutions have found that these reactions result in both rate accelerations and unexpected products. Furthermore, these scientists did this research with environmentally relevant concentrations of reagents, thereby suggesting that these reactions could occur in the polar regions. The conversion of elemental mercury to more oxidized forms has also shown that the acceleration of reactions can occur when environmentally relevant concentrations of Hg(0) and oxidants are frozen together in aqueous solutions. These observations, coupled with previous investigations into the effect of freezing on environmental reactions, lead us to conclude that this type of chemistry could potentially play a significant role in the chemical processing of a wide variety of inorganic components in polar regions. More recently, researchers have recognized the implications of these complementary field and laboratory findings toward human health and climate change. 
In this Account, we focus on the chemical and physical mechanisms that may promote novel chemistry and rate accelerations when water-ice is present. Future prospects will likely concentrate, once again, on the low-temperature chemistry of organic compounds, such as the humic acids, which are known cryospheric contaminants. Furthermore, data on the kinetics and thermodynamics of all types of reactions promoted by the freezing process would greatly assist in determining their implications for environmental computer models.

  13. Update on the Culicoides sonorensis transcriptome project: a peek into the molecular biology of the midge

    USDA-ARS's Scientific Manuscript database

    Next Generation Sequencing is transforming the way scientists collect and measure an organism’s genetic background and gene dynamics, while bioinformatics and super-computing are merging to facilitate parallel sample computation and interpretation at unprecedented speeds. Analyzing the complete gene...

  14. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…

  15. RESEARCH STRATEGIES FOR THE APPLICATION OF THE TECHNIQUES OF COMPUTATIONAL BIOLOGICAL CHEMISTRY TO ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    On October 25 and 26, 1984, the U.S. EPA sponsored a workshop to consider the potential applications of the techniques of computational biological chemistry to problems in environmental health. Eleven extramural scientists from the various related disciplines and a similar number...

  16. Debugging Geographers: Teaching Programming to Non-Computer Scientists

    ERIC Educational Resources Information Center

    Muller, Catherine L.; Kidd, Chris

    2014-01-01

    The steep learning curve associated with computer programming can be a daunting prospect, particularly for those not well aligned with this way of logical thinking. However, programming is a skill that is becoming increasingly important. Geography graduates entering careers in atmospheric science are one example of a particularly diverse group who…

  17. Using Computers for Research into Social Relations.

    ERIC Educational Resources Information Center

    Holden, George W.

    1988-01-01

    Discusses computer-presented social situations (CPSS), i.e., microcomputer-based simulations developed to provide a new methodological tool for social scientists interested in the study of social relations. Two CPSSs are described: DaySim, used to help identify types of parenting; and DateSim, used to study interpersonal attraction. (21…

  18. Brains--Computers--Machines: Neural Engineering in Science Classrooms

    ERIC Educational Resources Information Center

    Chudler, Eric H.; Bergsman, Kristen Clapper

    2016-01-01

    Neural engineering is an emerging field of high relevance to students, teachers, and the general public. This feature presents online resources that educators and scientists can use to introduce students to neural engineering and to integrate core ideas from the life sciences, physical sciences, social sciences, computer science, and engineering…

  19. Computer Science Professionals and Greek Library Science

    ERIC Educational Resources Information Center

    Dendrinos, Markos N.

    2008-01-01

    This paper attempts to present the current state of computer science penetration into librarianship in terms of both workplace and education issues. The shift from material libraries into digital libraries is mirrored in the corresponding shift from librarians into information scientists. New library data and metadata, as well as new automated…

  20. Describing the What and Why of Students' Difficulties in Boolean Logic

    ERIC Educational Resources Information Center

    Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig

    2012-01-01

    The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…
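The English-to-Boolean translation task the study probes can be illustrated with a small, hypothetical specification (the rule and function below are ours, not drawn from the interviews):

```python
# Hypothetical specification: "grant access if the user is an admin, or is
# a member and the resource is public."
def grant_access(is_admin: bool, is_member: bool, is_public: bool) -> bool:
    # A precedence pitfall students often hit: "and" binds tighter than "or",
    # so the parentheses below are redundant but make the intent explicit.
    return is_admin or (is_member and is_public)

# Exhaustively check the expression against the English specification
# with a truth table over all eight input combinations.
for a in (False, True):
    for m in (False, True):
        for p in (False, True):
            assert grant_access(a, m, p) == (a or (m and p))
```

Misreading the specification as `(is_admin or is_member) and is_public` is exactly the kind of translation error such interviews can surface.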

  1. Novel 3-D Computer Model Can Help Predict Pathogens’ Roles in Cancer | Poster

    Cancer.gov

    To understand how bacterial and viral infections contribute to human cancers, four NCI at Frederick scientists turned not to the lab bench, but to a computer. The team has created the world’s first—and currently, only—3-D computational approach for studying interactions between pathogen proteins and human proteins based on a molecular adaptation known as interface mimicry.

  2. Social and Personal Factors in Semantic Infusion Projects

    NASA Astrophysics Data System (ADS)

    West, P.; Fox, P. A.; McGuinness, D. L.

    2009-12-01

    As part of our semantic data framework activities across multiple, diverse disciplines, we required the involvement of domain scientists, computer scientists, software engineers, data managers, and often, social scientists. This involvement from a cross-section of disciplines turns out to be a social exercise as much as a technical and methodical activity. Each member of the team is used to different modes of working, expectations, vocabularies, levels of participation, and incentive and reward systems. We will examine the part that both roles and personal responsibilities play in the development of semantic infusion projects, and how an iterative development cycle can contribute to the successful completion of such a project.

  3. The Ocean 180 Video Challenge: An Innovative Outreach Strategy for Connecting Scientists to Classrooms

    NASA Astrophysics Data System (ADS)

    Tankersley, R. A.; Windsor, J. G.; Briceno, K. V.

    2016-02-01

    Recognizing the need for scientists to engage and communicate more effectively with the public, the Florida Center for Ocean Sciences Education Excellence (COSEE Florida) created an opportunity to connect the two through film. The Ocean 180 Video Challenge taps into the competitive spirit of scientists and encourages them to submit short, 3-minute video abstracts summarizing the important findings of recent peer-reviewed papers and highlighting the relevance, meaning, and implications of the research to persons outside their discipline. Although the videos are initially screened and evaluated by a team of science and communication experts, the winners (from a field of ten finalists) are selected by middle school students in classrooms all over the world. Since its inception in 2013, Ocean 180 has grown in popularity, with more than 38,000 middle school students from 1,637 classrooms in 21 countries participating as judges. Results of a Draw-a-Scientist Test administered during the 2015 competition indicate Ocean 180 is a successful intervention that has a positive impact on students' views of science, including their perception of and attitudes toward scientists and science careers. Thus, our presentation will discuss how video competitions can serve as effective outreach strategies for encouraging scientists to share new discoveries and their enthusiasm for science with K-12 students. We will also highlight the outcomes and lessons learned from the 2014 and 2015 competitions, including (1) strategies for recruiting teachers and students to participate as judges, (2) approaches used by educators to align the content of videos with state and national science standards, and (3) ways contest videos can be integrated into science training and professional development programs, including workshops focusing on effective video storytelling techniques.

  4. Astronomy, New Instrumentation, and the News Media

    NASA Technical Reports Server (NTRS)

    Maran, Stephen P.

    2001-01-01

    The early work of Bob Tull, who invented a photoelectric spectral scanner, comprised a crucial phase in the development of astronomical instrumentation. The relationship between the academic astronomy/astrophysics community and journalists has been in flux since the early 1960s. Scientists should recognize that they rely on the press to disseminate scientific information. Members of the public and policy makers are interested in the pursuits of scientific research for which taxes and other public monies are used.

  5. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems: techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking: advances in the performance of computing and networking continue to have a major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning, and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs, and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  6. The monitoring system for vibratory disturbance detection in microgravity environment aboard the international space station

    NASA Technical Reports Server (NTRS)

    Laster, Rachel M.

    2004-01-01

    Scientists in the Office of Life and Microgravity Sciences and Applications within the Microgravity Research Division oversee studies of important physical, chemical, and biological processes in the microgravity environment. Research is conducted in microgravity because of the beneficial results it yields for experiments. When research is done in normal gravity, scientists are limited to results that are affected by the gravity of Earth. Microgravity provides an environment where solids, liquids, and gases can be observed in a natural state of free fall and where many different variables are eliminated. One challenge that NASA faces is that space flight opportunities need to be used effectively and efficiently in order to ensure that some of the most scientifically promising research is conducted. Different vibratory sources are continually active aboard the International Space Station (ISS). Some of the vibratory sources include crew exercise, experiment setup, machinery startup (life support fans, pumps, freezer/compressor, centrifuge), thruster firings, and some unknown events. The Space Acceleration Measurement System (SAMS), which serves as the hardware and is carefully positioned aboard the ISS, along with the Microgravity Environment Monitoring System (MEMS), which serves as the software and is located at NASA Glenn, are used to detect these vibratory sources aboard the ISS and recognize them as disturbances. The various vibratory disturbances can sometimes be harmful to the scientists' research projects. Some vibratory disturbances are recognized by the MEMS database and some are not. The unknown events that occur aboard the International Space Station are the ones of major concern. To better aid the research experiments, the unknown events are identified and verified as such. Features of the new patterns, such as frequency, acceleration level, and the time and date of recognition, are stored in an Excel database.
My task is to carefully synthesize the frequency and acceleration patterns of unknown events within the Excel database into a new file to determine whether the information received represents a real vibratory source. Once an event is confirmed as a vibratory source, further analysis is carried out. The resulting information is used to retrain the MEMS to recognize these events as known patterns. These vibratory disturbances are constantly monitored to observe whether they have any effect on the microgravity environment that research experiments are exposed to. If a disturbance has little or no effect on the experiments, then research is continued. However, if a disturbance is harmful to an experiment, scientists act accordingly by either minimizing the source or terminating the research, so that neither NASA's time nor its money is wasted.
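The matching step described above might be sketched roughly as follows; the pattern names, feature values, and tolerance are invented for illustration, not taken from the actual SAMS/MEMS implementation:

```python
import math

# Hypothetical signatures of known disturbances: (dominant frequency in Hz,
# typical acceleration level in g). Values are illustrative only.
KNOWN_PATTERNS = {
    "crew exercise": (2.5, 0.002),
    "life support fan": (60.0, 0.0005),
    "thruster firing": (0.1, 0.01),
}

def classify_event(freq_hz, accel_g, rel_tol=0.1):
    """Return the closest known pattern, or None for an unknown event."""
    best, best_dist = None, float("inf")
    for name, (f, a) in KNOWN_PATTERNS.items():
        # Relative distance, so the very different scales of Hz and g
        # contribute comparably.
        dist = math.hypot((freq_hz - f) / f, (accel_g - a) / a)
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= rel_tol else None

print(classify_event(60.5, 0.0005))  # close to the fan signature
print(classify_event(500.0, 0.1))    # matches nothing: an unknown event
```

Events that come back as `None` would then be the candidates for further analysis and eventual retraining of the recognizer.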

  7. Of possible cheminformatics futures.

    PubMed

    Oprea, Tudor I; Taboureau, Olivier; Bologa, Cristian G

    2012-01-01

    For over a decade, cheminformatics has contributed to a wide array of scientific tasks, from analytical chemistry and biochemistry to pharmacology and drug discovery; and although its contributions to decision making are recognized, the challenge is how it can contribute to the faster development of novel, better products. Here we address the future of cheminformatics with a primary focus on innovation. Cheminformatics developers often need to choose between "mainstream" (i.e., accepted, expected) and novel, leading-edge tools, with an increasing trend toward open science. Possible futures for cheminformatics include the worst case scenario (lack of funding, no creative usage) as well as the best case scenario (complete integration, from systems biology to virtual physiology). As "-omics" technologies advance and computer hardware improves, compounds will be profiled not only at the molecular level but also in terms of genetic and clinical effects. Among potentially novel tools, we anticipate machine learning models based on free text processing, increased performance in environmental cheminformatics, significant decision-making support, as well as the emergence of robot scientists conducting automated drug discovery research. Furthermore, cheminformatics is anticipated to expand the frontiers of knowledge and evolve in an open-ended, extensible manner, allowing us to explore multiple research scenarios in order to avoid an epistemological "local information minimum trap".

  8. Biomolecular electronics in the twenty-first century.

    PubMed

    Phadke, R S

    2001-01-01

    A relentless decrease in the size of silicon-based microelectronics devices is posing problems. The most important among these are the limitations imposed by quantum-size effects and the instabilities introduced by thermal fluctuations. These foreseeable problems inherent in present-day systems have prompted scientists to look for alternative options. Advances in the understanding of natural systems such as photosynthetic apparatuses, together with genetic engineering, have enabled attention to be focused on the use of biomolecules. Biomolecules have the advantages of functionality and specificity. The invention of scanning tunneling microscopy and atomic force microscopy has opened up the possibility of addressing and manipulating individual atoms and molecules. Realization of the power of self-assembly principles has opened a novel approach for designing and assembling molecular structures with desired intricate architectures. The utility of molecules such as DNA as three-dimensional, high-density memory elements, and their capability for molecular computing, have been fully recognized but not yet realized. More time and effort are necessary before devices that can transcend existing ones become easily available. An overview of the current trends that are envisaged to give rich dividends in the next millennium is discussed.

  9. Summary of workshop 'Theory Meets Industry'—the impact of ab initio solid state calculations on industrial materials research

    NASA Astrophysics Data System (ADS)

    Wimmer, E.

    2008-02-01

    A workshop, 'Theory Meets Industry', was held on 12-14 June 2007 in Vienna, Austria, attended by a well-balanced mix of academic and industrial scientists from America, Europe, and Japan. The focus was on advances in ab initio solid state calculations and their practical use in industry. The theoretical papers addressed three dominant themes, namely (i) more accurate total energies and electronic excitations, (ii) more complex systems, and (iii) more diverse and accurate materials properties. Hybrid functionals give some improvements in energies, but encounter difficulties for metallic systems. Quantum Monte Carlo methods are progressing, but no clear breakthrough is on the horizon. Progress in order-N methods is steady, as is the case for efficient methods for exploring complex energy hypersurfaces and large numbers of structural configurations. The industrial applications were dominated by materials issues in energy conversion systems, the quest for hydrogen storage materials, improvements of the electronic and optical properties of microelectronic and display materials, and the simulation of reactions on heterogeneous catalysts. The workshop is clear testimony that ab initio computations have become an industrial practice with increasingly recognized impact.

  10. Decision tree and ensemble learning algorithms with their applications in bioinformatics.

    PubMed

    Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping

    2011-01-01

    Machine learning approaches have wide applications in bioinformatics, and decision tree is one of the successful approaches applied in this field. In this chapter, we briefly review decision tree and related ensemble algorithms and show the successful applications of such approaches on solving biological problems. We hope that by learning the algorithms of decision trees and ensemble classifiers, biologists can get the basic ideas of how machine learning algorithms work. On the other hand, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get better ideas of which bioinformatics topics they may work on in their future research directions. We aim to provide a platform to bridge the gap between biologists and computer scientists.
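As a toy illustration of the two ideas the chapter reviews (our sketch, not code from the chapter), a one-level decision tree and a majority-vote ensemble can be written in a few lines:

```python
from collections import Counter

def stump(feature_index, threshold):
    """A one-level decision tree: predict class 1 if the feature exceeds the threshold."""
    return lambda x: 1 if x[feature_index] > threshold else 0

def ensemble_predict(stumps, x):
    """Majority vote across the ensemble, as in bagging or random forests."""
    votes = Counter(s(x) for s in stumps)
    return votes.most_common(1)[0][0]

# Toy 'expression profiles': each stump checks one feature against 0.5.
stumps = [stump(i, 0.5) for i in range(3)]
print(ensemble_predict(stumps, [0.9, 0.8, 0.1]))  # two of three stumps vote 1 -> 1
print(ensemble_predict(stumps, [0.2, 0.8, 0.1]))  # two of three stumps vote 0 -> 0
```

Real ensemble methods additionally learn the thresholds from data and train each tree on a resampled subset, but the voting mechanism is the same.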

  11. International Conferences and Young Scientists Schools on Computational Information Technologies for Environmental Sciences (CITES) as a professional growth instrument

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Lykosov, V. N.; Genina, E. Yu; Gordova, Yu E.

    2017-11-01

    The paper describes the regular CITES events, consisting of a young scientists' school and an international conference, as a tool for training and professional growth. The events address the most pressing issues in the application of information-computational technologies to the environmental sciences and in young scientists' training, narrowing the gap between university graduates' skills and current challenges. The viability of this approach to organizing CITES is demonstrated by the fact that a single event organized in 2001 turned into a series: quite a few young participants have successfully defended their PhD theses, and a number of researchers have become Doctors of Science over these years. Young researchers from Russia and other countries show undiminished interest in these events.

  12. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...

  13. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...

  14. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...

  15. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...

  16. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...

  17. Multiuser Collaboration with Networked Mobile Devices

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Tai, Ann T.; Deng, Yong M.; Becks, Paul G.

    2006-01-01

    In this paper we describe a multiuser collaboration infrastructure that enables multiple mission scientists to remotely and collaboratively interact with visualization and planning software, using wireless networked personal digital assistants (PDAs) and other mobile devices. During ground operations of planetary rover and lander missions, scientists need to meet daily to review downlinked data and plan science activities. For example, scientists use the Science Activity Planner (SAP) in the Mars Exploration Rover (MER) mission to visualize downlinked data and plan rover activities during the science meetings [1]. Computer displays are projected onto large screens in the meeting room to enable the scientists to view and discuss downlinked images and data displayed by SAP and other software applications. However, only one person can interact with the software applications because input to the computer is limited to a single mouse and keyboard. As a result, the scientists have to verbally express their intentions, such as selecting a target at a particular location on the Mars terrain image, to that person in order to interact with the applications. This constrains communication and limits the returns of science planning. Furthermore, ground operations for Mars missions are fundamentally constrained by the short turnaround time for science and engineering teams to process and analyze data, plan the next uplink, generate command sequences, and transmit the uplink to the vehicle [2]. Therefore, improving ground operations is crucial to the success of Mars missions. The multiuser collaboration infrastructure enables users to control software applications remotely and collaboratively using mobile devices. 
The infrastructure includes (1) human-computer interaction techniques to provide natural, fast, and accurate input; (2) a communications protocol to ensure reliable and efficient coordination of the input devices and host computers; (3) an application-independent middleware that maintains the states, sessions, and interactions of individual users of the software applications; and (4) an application programming interface to enable tight integration of applications and the middleware. The infrastructure can support any software application running under Windows or Unix. The resulting technologies are applicable not only to NASA mission operations but also to other situations such as design reviews, brainstorming sessions, and business meetings, which can benefit from having the participants concurrently interact with software applications (e.g., presentation applications and CAD design tools) to illustrate their ideas and provide input.

  18. Bill Would Extend Efforts Against Harmful Algal Blooms and Hypoxia

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    Legislation to deal with the problems of harmful algal blooms and hypoxia in U.S. waters needs to recognize the growing national scope and economic effects of these phenomena, improve monitoring capabilities, and target remedies for them. It should also emphasize research and management in the Great Lakes and other fresh water bodies, as well as in U.S. coastal waters. This, according to a panel of scientists who testified at a 13 March hearing of the Science Subcommittee on Environment, Technology, and Standards of the U.S. House of Representatives. Those testifying said the two phenomena are causing enormous, negative ecological and economic impacts. Donald Scavia, a senior scientist with the National Ocean Service of the National Oceanic and Atmospheric Administration, said, "Harmful algal blooms and hypoxia are now among the most pressing environmental issues facing coastal states."

  19. Global Cooperation in the Science of Space Weather

    NASA Technical Reports Server (NTRS)

    Gopalswamy, Nat

    2011-01-01

    The international space science community recognized the importance of space weather more than a decade ago, which resulted in a number of international collaborative activities such as the Climate and Weather of the Sun-Earth System (CAWSES) program by SCOSTEP and the International Space Weather Initiative (ISWI). The ISWI program is a continuation of the successful International Heliophysical Year (IHY) program. These programs have brought scientists together to tackle the scientific issues behind space weather. In addition to the vast array of space instruments, ground-based instruments have been deployed, which have not only filled voids in data coverage but also inducted young scientists from developing countries into the scientific community. This paper presents a summary of CAWSES and ISWI activities that promote space weather science via complementary approaches in international scientific collaboration, capacity building, and public outreach.

  20. Software Carpentry and the Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

    Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: (1) write programs for people, not computers; (2) automate repetitive tasks; (3) use the computer to record history; (4) make incremental changes; (5) use version control; (6) don't repeat yourself (or others); (7) plan for mistakes; (8) optimize software only after it works; (9) document design and purpose, not mechanics; (10) collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to improve scientists' productivity and the reliability of their software.
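A few of the listed practices can be shown in miniature; the function and numbers below are an invented hydrology-flavored example, not Software Carpentry material:

```python
def discharge_volume(flow_rates_m3s, interval_s):
    """Total discharge volume (m^3) from evenly spaced flow readings (m^3/s)."""
    # 'Plan for mistakes': fail loudly on invalid input rather than
    # silently returning a meaningless volume.
    if interval_s <= 0:
        raise ValueError("interval_s must be positive")
    # 'Write programs for people, not computers': one well-named helper
    # replaces copy-pasted summation loops (don't repeat yourself).
    return sum(flow_rates_m3s) * interval_s

# 'Plan for mistakes', again: encode expectations as a test that runs every time.
assert discharge_volume([1.0, 2.0, 3.0], 60) == 360.0
```

Practices (3) and (5), recording history and version control, live outside the code itself, in tools such as Git.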

  1. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology.

    PubMed

    Bañares, Miguel A; Haase, Andrea; Tran, Lang; Lobaskin, Vladimir; Oberdörster, Günter; Rallo, Robert; Leszczynski, Jerzy; Hoet, Peter; Korenstein, Rafi; Hardy, Barry; Puzyn, Tomasz

    2017-09-01

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high quality experimental data and relevant assays which will be the basis to identify relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco) toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  2. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.
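One tactic along these lines (a minimal sketch of ours, not an example from the paper) is to seed stochastic steps explicitly and record a checksum of the output, so an independent re-run can be verified:

```python
import hashlib
import random

def run_analysis(seed=42):
    """A stand-in analysis whose stochastic step is made deterministic."""
    rng = random.Random(seed)            # isolated, explicitly seeded generator
    sample = [rng.random() for _ in range(5)]
    return sum(sample) / len(sample)

result = run_analysis()
# Publishing this checksum alongside the code lets others confirm a re-run
# reproduced the same result.
checksum = hashlib.sha256(repr(result).encode()).hexdigest()[:12]
assert run_analysis() == result          # same seed -> same result -> same checksum
```

Seeding addresses only one source of irreproducibility; packaging, installation, and environment differences need the other strategies the paper surveys.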

  3. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bañares, Miguel A.; Haase, Andrea; Tran, Lang

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high quality experimental data and relevant assays which will be the basis to identify relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco) toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  4. Stereotyping in Relation to the Gender Gap in Participation in Computing.

    ERIC Educational Resources Information Center

    Siann, Gerda; And Others

    1988-01-01

    A questionnaire completed by 928 postsecondary students asked subjects to rate one of two computer scientists on 16 personal attributes. Aside from gender of the ratee, questionnaires were identical. Results indicate that on eight attributes the female was rated significantly more positively than the male. Implications are discussed. (Author/CH)

  5. Constructing Contracts: Making Discrete Mathematics Relevant to Beginning Programmers

    ERIC Educational Resources Information Center

    Gegg-Harrison, Timothy S.

    2005-01-01

    Although computer scientists understand the importance of discrete mathematics to the foundations of their field, computer science (CS) students do not always see the relevance. Thus, it is important to find a way to show students its relevance. The concept of program correctness is generally taught as an activity independent of the programming…

  6. Communication for Scientists and Engineers: A "Computer Model" in the Basic Course.

    ERIC Educational Resources Information Center

    Haynes, W. Lance

    Successful speech should rest not on prepared notes and outlines but on genuine oral discourse based on "data" fed into the "software" in the computer which already exists within each person. Writing cannot speak for itself, nor can it continually adjust itself to accommodate diverse response. Moreover, no matter how skillfully…

  7. Identification of Factors That Affect Software Complexity.

    ERIC Educational Resources Information Center

    Kaiser, Javaid

    A survey of computer scientists was conducted to identify factors that affect software complexity. A total of 160 items were selected from the literature to include in a questionnaire sent to 425 individuals who were employees of computer-related businesses in Lawrence and Kansas City. The items were grouped into nine categories called system…

  8. Synthetic Biology: Knowledge Accessed by Everyone (Open Sources)

    ERIC Educational Resources Information Center

    Sánchez Reyes, Patricia Margarita

    2016-01-01

    Using the principles of biology, along with engineering and with the help of computers, scientists manage to copy DNA sequences from nature and use them to create new organisms. DNA is created through engineering and computer science, managing to create life inside a laboratory. We cannot dismiss the role that synthetic biology could play in…

  9. The Multiple Pendulum Problem via Maple[R

    ERIC Educational Resources Information Center

    Salisbury, K. L.; Knight, D. G.

    2002-01-01

    The way in which computer algebra systems, such as Maple, have made the study of physical problems of some considerable complexity accessible to mathematicians and scientists with modest computational skills is illustrated by solving the multiple pendulum problem. A solution is obtained for four pendulums with no restriction on the size of the…

  10. Computers and the Future of Skill Demand. Educational Research and Innovation Series

    ERIC Educational Resources Information Center

    Elliott, Stuart W.

    2017-01-01

    Computer scientists are working on reproducing all human skills using artificial intelligence, machine learning and robotics. Unsurprisingly then, many people worry that these advances will dramatically change work skills in the years ahead and perhaps leave many workers unemployable. This report develops a new approach to understanding these…

  11. Symposium on Parallel Computational Methods for Large-scale Structural Analysis and Design, 2nd, Norfolk, VA, US

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)

    1993-01-01

    Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.

  12. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    NASA Astrophysics Data System (ADS)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together these Grid Computing technologies and infrastructure have formed the LSC DataGrid: a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work remains, however, to scale current analyses, and recent lessons learned need to be integrated into the next generation of Grid middleware.

  13. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design

    PubMed Central

    Alford, Rebecca F.; Dolan, Erin L.

    2017-01-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology. PMID:29216185

  14. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    PubMed

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  15. Computer Model Predicts the Movement of Dust

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A new computer model of the atmosphere can now actually pinpoint where global dust events come from, and can project where they're going. The model may help scientists better evaluate the impact of dust on human health, climate, ocean carbon cycles, ecosystems, and atmospheric chemistry. Also, by seeing where dust originates and where it blows, people with respiratory problems can get advance warning of approaching dust clouds. 'The model is physically more realistic than previous ones,' said Mian Chin, a co-author of the study and an Earth and atmospheric scientist at Georgia Tech and the Goddard Space Flight Center (GSFC) in Greenbelt, Md. 'It is able to reproduce the short-term day-to-day variations and long-term inter-annual variations of dust concentrations and distributions that are measured from field experiments and observed from satellites.' The above images show both aerosols measured from space (left) and the movement of aerosols predicted by computer model for the same date (right). For more information, read New Computer Model Tracks and Predicts Paths Of Earth's Dust. Images courtesy of Paul Giroux, Georgia Tech/NASA Goddard Space Flight Center.

  16. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. 
This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  17. Automating CapCom: Pragmatic Operations and Technology Research for Human Exploration of Mars

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, NASA and the scientific community used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. More recently, computer scientists and human factors specialists have followed geologists and biologists into the field, learning how science is actually done on expeditions in extreme environments. Research stations have been constructed by the Mars Society in the Arctic and American southwest, providing facilities for hundreds of researchers to investigate how small crews might live and work on Mars. Combining these interests (science, operations, and technology) in Mars analog field expeditions provides tremendous synergy and authenticity to speculations about Mars missions. By relating historical analyses of Apollo and field science, engineers are creating experimental prototypes that provide significant new capabilities, such as a computer system that automates some of the functions of Apollo's CapCom. Thus, analog studies have created a community of practice, a new collaboration between scientists and engineers, so that technology begins with real human needs and works incrementally toward the challenges of the human exploration of Mars.

  18. 2017 Outstanding Contributions to ISCB Award: Fran Lewitter.

    PubMed

    Fogg, Christiana N; Kovats, Diane E; Berger, Bonnie

    2017-01-01

    The Outstanding Contributions to the International Society for Computational Biology (ISCB) Award was launched in 2015 to recognize individuals who have made lasting and valuable contributions to the Society through their leadership, service, and educational work, or a combination of these areas. Fran Lewitter is the 2017 winner of the Outstanding Contributions to ISCB Award and will be recognized at the 2017 Intelligent Systems for Molecular Biology (ISMB)/European Conference on Computational Biology meeting in Prague, Czech Republic, held July 21-25, 2017.

  19. 2017 Outstanding Contributions to ISCB Award: Fran Lewitter

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The Outstanding Contributions to the International Society for Computational Biology (ISCB) Award was launched in 2015 to recognize individuals who have made lasting and valuable contributions to the Society through their leadership, service, and educational work, or a combination of these areas. Fran Lewitter is the 2017 winner of the Outstanding Contributions to ISCB Award and will be recognized at the 2017 Intelligent Systems for Molecular Biology (ISMB)/European Conference on Computational Biology meeting in Prague, Czech Republic, held July 21-25, 2017. PMID:28713545

  20. Computer Program Recognizes Patterns in Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
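    The clock-bank logic described above is simple enough to sketch in a few lines. Here the trained ANN is abstracted away behind a `detect(now)` call invoked each time the network recognizes waveform W; class and method names are illustrative, and the article itself targets a hardware realization (FPGA or ASIC) rather than Python:

    ```python
    import itertools

    class NWithinT:
        """Bank of N countdown clocks, as described in the abstract:
        the k-th recognition restarts clock (k mod N); if the clock being
        restarted has not yet run down to zero, N recognitions of the
        waveform have occurred within the time window T."""

        def __init__(self, n, t):
            self.n = n                       # required number of instances
            self.t = t                       # time window T
            self.clocks = [None] * n         # start time of each running clock
            self.slots = itertools.cycle(range(n))

        def detect(self, now):
            """Record one recognition at time `now`; return True when the
            clock being reused has not yet expired (i.e., the alarm fires)."""
            i = next(self.slots)             # cycle through clocks 1..N
            started = self.clocks[i]
            self.clocks[i] = now             # restart: count down from T again
            return started is not None and (now - started) < self.t
    ```

    For example, with `NWithinT(3, 10)`, recognitions at times 0, 1, 2, 3 fire the alarm on the fourth call (clock 1, started at time 0, is reused before its 10-unit countdown expires), while recognitions spaced more than T apart never do.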

  1. A systematic identification and analysis of scientists on Twitter.

    PubMed

    Ke, Qing; Ahn, Yong-Yeol; Sugimoto, Cassidy R

    2017-01-01

    Metrics derived from Twitter and other social media, often referred to as altmetrics, are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown. For instance, if altmetric activities are generated mainly by scientists, does it really capture broader social impacts of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter. Our method can identify scientists across many disciplines, without relying on external bibliographic data, and be easily adapted to identify other stakeholder groups in science. We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists. We find that Twitter has been employed by scholars across the disciplinary spectrum, with an over-representation of social and computer and information scientists; under-representation of mathematical, physical, and life scientists; and a better representation of women compared to scholarly publishing. Analysis of the sharing of URLs reveals a distinct imprint of scholarly sites, yet only a small fraction of shared URLs are science-related. We find an assortative mixing with respect to disciplines in the networks between scientists, suggesting the maintenance of disciplinary walls in social media. Our work contributes to the literature both methodologically and conceptually: we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics.
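    As an illustration only (this is not the authors' classifier, which goes well beyond keyword matching), a rule-based pass over self-reported profile bios is one ingredient a list-independent identification pipeline might start from:

    ```python
    import re

    # Toy occupation lexicon; the term list is hypothetical.
    SCIENTIST_TERMS = re.compile(
        r"\b(scientist|researcher|professor|phd candidate|postdoc)\b",
        re.IGNORECASE,
    )

    def looks_like_scientist(bio: str) -> bool:
        """Flag a Twitter bio that self-describes a scientific occupation."""
        return bool(SCIENTIST_TERMS.search(bio))
    ```

    A real pipeline would combine such signals with network features and manual validation, since bios are noisy and self-description is inconsistent across disciplines.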

  2. A systematic identification and analysis of scientists on Twitter

    PubMed Central

    Ke, Qing; Ahn, Yong-Yeol; Sugimoto, Cassidy R.

    2017-01-01

    Metrics derived from Twitter and other social media—often referred to as altmetrics—are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown. For instance, if altmetric activities are generated mainly by scientists, does it really capture broader social impacts of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter. Our method can identify scientists across many disciplines, without relying on external bibliographic data, and be easily adapted to identify other stakeholder groups in science. We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists. We find that Twitter has been employed by scholars across the disciplinary spectrum, with an over-representation of social and computer and information scientists; under-representation of mathematical, physical, and life scientists; and a better representation of women compared to scholarly publishing. Analysis of the sharing of URLs reveals a distinct imprint of scholarly sites, yet only a small fraction of shared URLs are science-related. We find an assortative mixing with respect to disciplines in the networks between scientists, suggesting the maintenance of disciplinary walls in social media. Our work contributes to the literature both methodologically and conceptually—we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics. PMID:28399145

  3. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE PAGES

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...

    2015-02-19

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.
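    The cost-performance framing in the study reduces to a simple figure of merit: measured throughput divided by hourly price. A minimal sketch with made-up numbers (instance names, throughputs, and prices are hypothetical, not the paper's measurements):

    ```python
    def perf_per_dollar(gflops: float, usd_per_hour: float) -> float:
        """Benchmark throughput obtained per dollar of instance time."""
        return gflops / usd_per_hour

    # Hypothetical HPL-style results for two instance types.
    instances = {
        "type_a": perf_per_dollar(800.0, 2.0),  # 400 GFLOPS per dollar-hour
        "type_b": perf_per_dollar(500.0, 1.0),  # 500 GFLOPS per dollar-hour
    }
    best = max(instances, key=instances.get)    # "type_b": cheaper per GFLOPS
    ```

    The faster instance is not automatically the better buy; normalizing by price is what lets raw benchmarks inform the public/private/hybrid deployment decision the paper targets.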

  4. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.

  5. 2008 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    2009-12-07

    The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly - from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. No question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November.
We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.

  6. Network Analysis of Beliefs About the Scientific Enterprise: A comparison of scientists, middle school science teachers and eighth-grade science students

    NASA Astrophysics Data System (ADS)

    Peters-Burton, Erin; Baynard, Liz R.

    2013-11-01

    An understanding of the scientific enterprise is useful because citizens need to make systematic, rational decisions about projects involving scientific endeavors and technology, and a clearer understanding of scientific epistemology is beneficial because it could encourage more public engagement with science. The purpose of this study was to capture beliefs for three groups, scientists, secondary science teachers, and eighth-grade science students, about the ways scientific knowledge is generated and validated. Open-ended questions were framed by formal scientific epistemology and dimensions of epistemology recognized in the field of educational psychology. The resulting statements were placed in a card sort and mapped in a network analysis to communicate interconnections among ideas. Maps analyzed with multidimensional scaling revealed robust connections among students and scientists but not among teachers. Student and teacher maps illustrated the strongest connections among ideas about experiments, while scientist maps presented more descriptive and well-rounded ideas about the scientific enterprise. The students' map was robust in the number of ideas but lacked a hierarchical organization of ideas. The teachers' map displayed an alignment with the learning standards of the state, but not a broader view of science. The scientists' map displayed a hierarchy of ideas, with elaboration of equally valued statements connected to several foundational statements. Network analysis can be helpful in advancing the study of views of the nature of science because of the technique's ability to capture verbatim statements from participants and to display the strength of connections among the statements.

  7. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  8. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies (code patterns, transformation scripts, and compiler plugins) to provide the scientist with an environment for quickly implementing code transformations that suit their needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.

  9. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large, complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using, and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  10. Quantifying the extent of North American mammal extinction relative to the pre-anthropogenic baseline.

    PubMed

    Carrasco, Marc A; Barnosky, Anthony D; Graham, Russell W

    2009-12-16

    Earth has experienced five major extinction events in the past 450 million years. Many scientists suggest we are now witnessing a sixth, driven by human impacts. However, it has been difficult to quantify the real extent of the current extinction episode, either for a given taxonomic group at the continental scale or for the worldwide biota, largely because comparisons of pre-anthropogenic and anthropogenic biodiversity baselines have been unavailable. Here, we compute those baselines for mammals of temperate North America, using a sampling-standardized rich fossil record to reconstruct species-area relationships for a series of time slices ranging from 30 million to 500 years ago. We show that shortly after humans first arrived in North America, mammalian diversity dropped to at least 15%-42% below the "normal" diversity baseline that had existed for millions of years. While the Holocene reduction in North American mammal diversity has long been recognized qualitatively, our results provide a quantitative measure that clarifies how significant the diversity reduction actually was. If mass extinctions are defined as loss of at least 75% of species on a global scale, our data suggest that North American mammals had already progressed one-fifth to more than halfway (depending on biogeographic province) towards that benchmark, even before industrialized society began to affect them. Data currently are not available to make similar quantitative estimates for other continents, but qualitative declines in Holocene mammal diversity are also widely recognized in South America, Eurasia, and Australia. Extending our methodology to mammals in these areas, as well as to other taxa where possible, would provide a reasonable way to assess the magnitude of global extinction, the biodiversity impact of extinctions of currently threatened species, and the efficacy of conservation efforts into the future.

  11. A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Davis, M. H.

    1989-01-01

    A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.

  12. Global Land Information System (GLIS)

    USGS Publications Warehouse

    ,

    1992-01-01

    The Global Land Information System (GLIS) is an interactive computer system developed by the U.S. Geological Survey (USGS) for scientists seeking sources of information about the Earth's land surfaces. GLIS contains "metadata," that is, descriptive information about data sets. Through GLIS, scientists can evaluate data sets, determine their availability, and place online requests for products. GLIS is more, however, than a mere list of products. It offers online samples of earth science data that may be ordered through the system.

  13. UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies

    NASA Astrophysics Data System (ADS)

    Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.

    2007-12-01

    Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources, including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource; the community could therefore benefit from a thorough retooling of current data processing procedures and infrastructure. Emerging technologies such as OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA-Echo, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, in which web services are increasingly designed to serve computer-to-computer communication without human interaction, and complex analyses can be carried out over distributed computer resources interconnected via cyberinfrastructure. The UNH Earth System Data Collaborative is designed to use these emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data, ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services.
While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend a considerable amount of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as parallelization of model codes. Our modeling infrastructure is designed to take care of the bulk of the tasks common to complex earth system models, such as I/O handling, computational domain and time management, and parallel execution of modeling tasks. The modeling infrastructure allows scientists to focus on the numerical implementation of the physical processes on single computational objects (typically grid cells), while the framework takes care of preprocessing input data, establishing the data exchange between computational objects, and executing the science code. In our presentation, we will discuss the key concepts of our modeling infrastructure. We will demonstrate integration of our modeling framework with data services offered by the UNH Earth System Data Collaborative via web interfaces. We will lay out the road map for turning our prototype modeling environment into a true community framework for a wide range of earth system scientists and environmental managers.
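The framework/science-code separation this record describes can be sketched as follows: the framework owns the domain and time loop, while the scientist supplies only a per-cell update function. The class and the decay physics below are illustrative placeholders, not the UNH framework's actual API.

```python
class GridModel:
    """Minimal sketch of a framework that owns domain and time management,
    so the scientist writes only per-cell science code."""

    def __init__(self, cells, science_fn):
        self.state = dict(cells)      # cell id -> state value
        self.science_fn = science_fn  # user-supplied per-cell update

    def step(self):
        # Framework concern: apply the science code to every cell.
        self.state = {cid: self.science_fn(cid, val)
                      for cid, val in self.state.items()}

    def run(self, n_steps):
        for _ in range(n_steps):
            self.step()
        return self.state

# Scientist's contribution: physics for a single cell (here, simple decay).
def decay(cell_id, value):
    return value * 0.5

final = GridModel({"c1": 8.0, "c2": 4.0}, decay).run(3)
print(final)  # {'c1': 1.0, 'c2': 0.5}
```

In a real framework the `step` loop is where I/O, inter-cell data exchange, and parallel scheduling would be handled transparently.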

  14. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  15. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  16. ArrayBridge: Interweaving declarative array processing with high-performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
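The in-situ access idea behind ArrayBridge, reading a slice of an on-disk array without loading or converting the whole file, can be shown in miniature with plain binary files. This is a conceptual stand-in using the standard library, not ArrayBridge's HDF5-based mechanism.

```python
import os
import struct
import tempfile

def write_array(path, values):
    # Store a flat array of float64 values in native byte order.
    with open(path, "wb") as f:
        f.write(struct.pack(f"{len(values)}d", *values))

def read_slice(path, start, stop):
    """Read values[start:stop] directly from the file by seeking to the
    right byte offset -- the essence of in-situ access: no bulk load."""
    itemsize = struct.calcsize("d")
    with open(path, "rb") as f:
        f.seek(start * itemsize)
        raw = f.read((stop - start) * itemsize)
    return list(struct.unpack(f"{stop - start}d", raw))

path = os.path.join(tempfile.mkdtemp(), "array.bin")
write_array(path, [float(i) for i in range(100)])
chunk = read_slice(path, 10, 14)
print(chunk)  # [10.0, 11.0, 12.0, 13.0]
```

A real array view layer adds chunking, multi-dimensional indexing, and metadata on top of exactly this seek-and-read pattern.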

  17. Automatic speech recognition and training for severely dysarthric users of assistive technology: the STARDUST project.

    PubMed

    Parker, Mark; Cunningham, Stuart; Enderby, Pam; Hawley, Mark; Green, Phil

    2006-01-01

    The STARDUST project developed robust computer speech recognizers for use by eight people with severe dysarthria and concomitant physical disability to access assistive technologies. Speaker-independent computer speech recognizers trained on normal speech are of limited functional use to those with severe dysarthria, owing to limited and inconsistent proximity to "normal" articulatory patterns. Severe dysarthric output may also be characterized by a small set of distinguishable phonetic tokens, making the acoustic differentiation of target words difficult. Speaker-dependent computer speech recognition using hidden Markov models was achieved by identifying robust phonetic elements within the individual speaker's output patterns. A new system of speech training using computer-generated visual and auditory feedback reduced the inconsistent production of key phonetic tokens over time.
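At the core of HMM-based recognition like that described above is Viterbi decoding: finding the most likely hidden state sequence for an observed acoustic sequence. The toy two-phoneme model below (states, transition and emission probabilities) is entirely hypothetical, chosen only to make the algorithm concrete.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence.
    Raw probabilities are fine for a tiny example; real systems use logs."""
    # V[t][s] = (best probability of reaching s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = (prob, prev)
    # Backtrack from the best final state.
    best = max(states, key=lambda s: V[-1][s][0])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        best = V[t][best][1]
        path.append(best)
    return path[::-1]

# Hypothetical two-phoneme model distinguishing tokens "ah" and "ee".
states = ["A", "E"]
start = {"A": 0.5, "E": 0.5}
trans = {"A": {"A": 0.7, "E": 0.3}, "E": {"A": 0.3, "E": 0.7}}
emit = {"A": {"ah": 0.9, "ee": 0.1}, "E": {"ah": 0.1, "ee": 0.9}}
decoded = viterbi(["ah", "ah", "ee"], states, start, trans, emit)
print(decoded)  # ['A', 'A', 'E']
```

Speaker-dependent training, as in STARDUST, amounts to estimating the transition and emission tables from that one speaker's recordings rather than from a population average.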

  18. A smart sensor architecture based on emergent computation in an array of outer-totalistic cells

    NASA Astrophysics Data System (ADS)

    Dogaru, Radu; Dogaru, Ioana; Glesner, Manfred

    2005-06-01

    A novel smart-sensor architecture is proposed, capable of segmenting and recognizing characters in a monochrome image. It provides a list of ASCII codes representing the characters recognized in the monochrome visual field, and it can operate as an aid for blind users or in industrial applications. A bio-inspired cellular model with simple linear neurons was found best suited to the nontrivial task of cropping isolated compact objects such as handwritten digits or characters. By attaching a simple outer-totalistic cell to each pixel sensor, emergent computation in the resulting cellular automata lattice provides a straightforward and compact solution to the otherwise computationally intensive problem of character segmentation. A simple and robust recognition algorithm is built into a compact sequential controller accessing the array of cells, so that the integrated device can directly provide a list of codes of the recognized characters. Preliminary simulation tests indicate good performance and robustness to various distortions of the visual field.
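The segmentation task the cellular lattice solves, cropping each isolated compact object out of a binary image, is equivalent in software to connected-component labeling with bounding boxes. The sketch below does this sequentially with a flood fill; it is a functional stand-in for the emergent cellular computation, not a model of the hardware.

```python
def crop_objects(image):
    """Label 4-connected foreground components in a binary image and
    return one bounding box (top, left, bottom, right) per component."""
    h, w = len(image), len(image[0])
    seen, boxes = set(), []
    for y in range(h):
        for x in range(w):
            if image[y][x] and (y, x) not in seen:
                stack, box = [(y, x)], [y, x, y, x]
                seen.add((y, x))
                while stack:  # flood-fill one component
                    cy, cx = stack.pop()
                    box = [min(box[0], cy), min(box[1], cx),
                           max(box[2], cy), max(box[3], cx)]
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                boxes.append(tuple(box))
    return boxes

img = [[1, 1, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 0, 1]]
boxes = crop_objects(img)
print(boxes)  # [(0, 0, 1, 1), (1, 3, 2, 3)]
```

In the proposed architecture this work is done in parallel by the outer-totalistic cells themselves, with each cell consulting only its immediate neighbors.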

  19. A short history of nitroglycerine and nitric oxide in pharmacology and physiology.

    PubMed

    Marsh, N; Marsh, A

    2000-04-01

    1. Nitroglycerine (NG) was discovered in 1847 by Ascanio Sobrero in Turin, following work with Theophile-Jules Pelouze. Sobrero first noted the 'violent headache' produced by minute quantities of NG on the tongue. 2. Constantin Hering, in 1849, tested NG in healthy volunteers, observing that headache was caused with 'such precision'. Hering pursued NG ('glonoine') as a homeopathic remedy for headache, believing that its use fell within the doctrine of 'like cures like'. 3. Alfred Nobel joined Pelouze in 1851 and recognized the potential of NG. He began manufacturing NG in Sweden, overcoming handling problems with his patent detonator. Nobel suffered acutely from angina and was later to refuse NG as a treatment. 4. During the mid-19th century, scientists in Britain took an interest in the newly discovered amyl nitrite, recognized as a powerful vasodilator. Lauder Brunton, the father of modern pharmacology, used the compound to relieve angina in 1867, noting the pharmacological resistance to repeated doses. 5. William Murrell first used NG for angina in 1876, although NG entered the British Pharmacopoeia as a remedy for hypertension. William Martindale, the pharmaceutical chemist, prepared '...a more stable and portable preparation': 1/100th of a grain in chocolate. 6. In the early 20th century, scientists worked on in vitro actions of nitrate-containing compounds, although little progress was made towards understanding the cellular mode of action. 7. The NG industry flourished from 1900, exposing workers to high levels of organic nitrites; the phenomenon of nitrate tolerance was recognized by the onset of 'Monday disease' and that of nitrate withdrawal/overcompensation by 'Sunday Heart Attacks'. 8. Ferid Murad discovered the release of nitric oxide (NO) from NG and its action on vascular smooth muscle (in 1977). 
Robert Furchgott and John Zawadski recognized the importance of the endothelium in acetylcholine-induced vasorelaxation (in 1980) and Louis Ignarro and Salvador Moncada identified endothelial-derived relaxing factor (EDRF) as NO (in 1987). 9. Glycerol trinitrate remains the treatment of choice for relieving angina; other organic esters and inorganic nitrates are also used, but the rapid action of NG and its established efficacy make it the mainstay of angina pectoris relief.

  20. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increases access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" comprises the geoscience researchers who work separate from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster et al.] The long-tail metaphor points to the method for bridging the gap: the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster et al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into Cloud Computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP].
Several recent examples show progress in applying SaaS to geosciences: - NASA's Earth Data Coherent Web has a goal to improve science user experience using Web Services (e.g. W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW]. - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS). - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web service based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS. [UAF] Tools for accessing SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability using web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV, LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in the recent year toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g. defining data schemas that allow for data fusion, addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web, will address this large and currently underserved user community.
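The "scores of clients" interoperability claim rests on standardized request interfaces: any client that can assemble an OGC WMS GetMap URL can pull a map image from any conforming server. The sketch below builds such a request; the endpoint and layer name are placeholders, not a real service.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size=(256, 256)):
    """Assemble an OGC WMS 1.3.0 GetMap request URL from its
    standard key-value parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # min/max lat-lon corners
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

# Hypothetical sea-surface-temperature layer on a placeholder server.
url = wms_getmap_url("https://example.org/wms", "sst", (-90, -180, 90, 180))
print(url)
```

Because the parameter names and semantics are fixed by the WMS specification, a long-tail scientist's browser, a GIS desktop client, and a batch script can all consume the same server with no custom integration code.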

Top