Sample records for computational science projects

  1. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    ERIC Educational Resources Information Center

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-01-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning,…

  2. Group Projects and the Computer Science Curriculum

    ERIC Educational Resources Information Center

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  3. Snatching Defeat from the Jaws of Victory: When Good Projects Go Bad. Girls and Computer Science.

    ERIC Educational Resources Information Center

    Sanders, Jo

    In week-long sessions in the summers of 1997, 1998, and 1999, the 6APT (Summer Institute in Computer Science for Advanced Placement Teachers) project taught 240 high school teachers of Advanced Placement Computer Science (APCS) about gender equity in computers. Teachers were then followed through 2000. Results indicated that while teachers did…

  4. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    ERIC Educational Resources Information Center

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  5. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This report covers the Project's third year of implementation in 2007-2008, the final year, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., in an effort to promote research and research training programs in computational science-scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DOE) and the nation. The breadth and depth of computational science-scientific visualization, and the magnitude of resources available, are enormous and permit a wide variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these resources.

  6. An Integrated Framework for Improved Computer Science Education: Strategies, Implementations, and Results

    ERIC Educational Resources Information Center

    Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen

    2007-01-01

    This paper describes the Reinventing Computer Science Curriculum Project at the University of Nebraska-Lincoln. Motivated by rapid and significant changes in the information technology and computing areas, high diversity in student aptitudes, and high dropout rates, the project designed and implemented an integrated instructional/research…

  7. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  8. Summary of Research 1997, Department of Computer Science.

    DTIC Science & Technology

    1999-01-01

    ...contains summaries of research projects in the Department of Computer Science. A list of recent publications is also included which consists of conference...parallel programming. Recently, in a joint research project between NPS and the Russian Academy of Sciences Systems Programming Institute in Moscow…

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review for the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  10. Institutional computing (IC) information session

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.

  11. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  12. The Computer Science Technical Report (CS-TR) Project: A Pioneering Digital Library Project Viewed from a Library Perspective.

    ERIC Educational Resources Information Center

    Anderson, Greg; And Others

    1996-01-01

    Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries which pioneered multiinstitutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)

  13. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  14. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    ERIC Educational Resources Information Center

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  15. [Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  16. The influence of a game-making project on male and female learners' attitudes to computing

    NASA Astrophysics Data System (ADS)

    Robertson, Judy

    2013-03-01

    There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a game-making project. It provides evidence from 225 pre-test and post-test questionnaires on how learners' attitudes to computing changed during the project, as well as qualitative reflections from the class teachers on how the project affected their learners. Results indicate that girls did not enjoy the experience as much as boys, and that in fact, the project may make pupils less inclined to study computing in the future. This has important implications for future efforts to engage young people in computing.

  17. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.

  18. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Background: Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results: The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions: Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792
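
    As a concrete illustration of the booked-versus-used comparison described in the two records above, the short Python sketch below computes a per-job efficiency metric (CPU core-hours actually consumed divided by core-hours booked) and averages it by project type. It is a minimal, hypothetical example: the JobRecord fields, the project-type labels, and the sample numbers are assumptions for illustration, not the schema, code, or data used in the UPPMAX study.

      # Hypothetical sketch: efficiency = used core-hours / booked core-hours per job,
      # averaged by project type. Field names and sample values are illustrative only.
      from dataclasses import dataclass
      from statistics import mean

      @dataclass
      class JobRecord:
          project_type: str      # e.g. "NGS" or "non-NGS" (assumed labels)
          cores_booked: int      # cores reserved for the job
          wall_hours: float      # elapsed wall-clock hours
          cpu_hours_used: float  # CPU core-hours actually consumed

      def efficiency(job: JobRecord) -> float:
          """Fraction of booked core-hours that were actually used."""
          booked = job.cores_booked * job.wall_hours
          return job.cpu_hours_used / booked if booked > 0 else 0.0

      def mean_efficiency_by_type(jobs: list[JobRecord]) -> dict[str, float]:
          groups: dict[str, list[float]] = {}
          for job in jobs:
              groups.setdefault(job.project_type, []).append(efficiency(job))
          return {ptype: mean(vals) for ptype, vals in groups.items()}

      if __name__ == "__main__":
          sample_jobs = [
              JobRecord("NGS", cores_booked=16, wall_hours=10, cpu_hours_used=40),
              JobRecord("NGS", cores_booked=8, wall_hours=4, cpu_hours_used=28),
              JobRecord("non-NGS", cores_booked=64, wall_hours=12, cpu_hours_used=700),
          ]
          print(mean_efficiency_by_type(sample_jobs))

    In the study's terms, lower values of such a metric correspond to jobs that book more resources than they actually use.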

  19. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  20. Computer Science in High School Graduation Requirements. ECS Education Trends

    ERIC Educational Resources Information Center

    Zinth, Jennifer Dounay

    2015-01-01

    Computer science and coding skills are widely recognized as a valuable asset in the current and projected job market. The Bureau of Labor Statistics projects 37.5 percent growth from 2012 to 2022 in the "computer systems design and related services" industry--from 1,620,300 jobs in 2012 to an estimated 2,229,000 jobs in 2022. Yet some…

  1. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  2. Progress on the FabrIc for Frontier Experiments project at Fermilab

    DOE PAGES

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...

    2015-12-23

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, flexible data transfer client, and access to opportunistic resources on the Open Science Grid. Hence, the progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  3. Computer Science and Telecommunications Board summary of activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, M.S.

    1992-03-27

    The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.

  4. Storage and network bandwidth requirements through the year 2000 for the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen

    1996-01-01

    The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.

  5. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Exascale ARRA projects--Magellan final report; Advanced Networking update; Status from Computer Science COV; Early Career technical talks; Summary of Applied Math and Computer Science Workshops; ASCR's new SBIR..., Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the...

  6. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    ERIC Educational Resources Information Center

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  7. Tutor Training in Computer Science: Tutor Opinions and Student Results.

    ERIC Educational Resources Information Center

    Carbone, Angela; Mitchell, Ian

    Edproj, a project team of faculty from the departments of computer science, software development, and education at Monash University (Australia), investigated the quality of teaching and student learning and understanding in the computer science and software development departments. Edproj's research led to the development of a training program to…

  8. Computers as Media for Communication: Learning and Development in a Whole Earth Context.

    ERIC Educational Resources Information Center

    Levin, James A.

    Educationally successful electronic network activities involving microcomputers and long-distance networks include a student newswire, joint social science projects, and joint science projects. A newswire activity, such as "The Computer Chronicles," can provide a wide range of audiences for writing, a functional environment for reading, and a…

  9. Computer Technology-Integrated Projects Should Not Supplant Craft Projects in Science Education

    ERIC Educational Resources Information Center

    Klopp, Tabatha J.; Rule, Audrey C.; Schneider, Jean Suchsland; Boody, Robert M.

    2014-01-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy…

  10. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  11. Alliance for Computational Science Collaboration, HBCU Partnership at Alabama A&M University Final Performance Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Z.T.

    2001-11-15

    The objective of this project was to conduct high-performance computing research and teaching at AAMU, and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways to attract top-quality minority students. Under the support of DOE, through research, summer internship, collaboration, and scholarship programs, AAMU has successfully provided research and educational opportunities to minority students in fields related to computational science.

  12. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  13. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore, NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the Langley Science Directorate needs to be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that integration. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  14. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  15. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    PubMed Central

    Caudill, Lester; Hill, April; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics. PMID:20810953

  16. Impact of Interdisciplinary Undergraduate Research in mathematics and biology on the development of a new course integrating five STEM disciplines.

    PubMed

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics.

  17. Laboratory directed research and development program FY 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-03-01

    This report compiles the annual reports of Laboratory Directed Research and Development projects supported by the Berkeley Lab. Projects are arranged under the following topical sections: (1) Accelerator and fusion research division; (2) Chemical sciences division; (3) Computing Sciences; (4) Earth sciences division; (5) Environmental energy technologies division; (6) Life sciences division; (7) Materials sciences division; (8) Nuclear science division; (9) Physics division; (10) Structural biology division; and (11) Cross-divisional. A total of 66 projects are summarized.

  18. Tri-P-LETS: Changing the Face of High School Computer Science

    ERIC Educational Resources Information Center

    Sherrell, Linda; Malasri, Kriangsiri; Mills, David; Thomas, Allen; Greer, James

    2012-01-01

    From 2004-2007, the University of Memphis carried out the NSF-funded Tri-P-LETS (Three P Learning Environment for Teachers and Students) project to improve local high-school computer science curricula. The project reached a total of 58 classrooms in eleven high schools emphasizing problem solving skills, programming concepts as opposed to syntax,…

  19. Teaching Mixed-Mode: A Case Study in Remote Delivery of Computer Science in Africa

    ERIC Educational Resources Information Center

    Howell, Sheila; Harris, Michael; Wilkinson, Simon; Zuluaga, Catherine; Voutier, Paul

    2004-01-01

    In February 2003, RMIT University in Melbourne, Australia, commenced delivery of a Computer Science diploma and degree programme using mixed mode delivery to 250 university students in sub-Saharan Africa, through a World Bank funded project designed for the African Virtual University (AVU). The project is a unique experience made possible by…

  20. Building place-based collaborations to develop high school students' groundwater systems knowledge and decision-making capacity

    NASA Astrophysics Data System (ADS)

    Podrasky, A.; Covitt, B. A.; Woessner, W.

    2017-12-01

    The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+ Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data- based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data- based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among University scientists, researchers and educators, high school teachers and agency and industry scientists and engineers. In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data- based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real world issues such as groundwater contamination. These characteristics include that science education experiences are real, responsive/accessible and rigorous.

  1. Project MASTER, 1987-88. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Hammack, Floyd

    Project MASTER completed its 3-year funding cycle in 1987-88. The project aimed at providing enhanced science instruction to 575 Spanish-speaking limited-English-proficient students in 5 elementary schools. Project MASTER offered classes in English as a Second Language (ESL), mathematics, science, and computer skills with a hands-on, integrated…

  2. Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  3. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and previously used approaches, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  4. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  5. A Review of Resources for Evaluating K-12 Computer Science Education Programs

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Hartikainen, Elina

    2004-01-01

    Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…

  6. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed, together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy are also explained. The review concludes with a summary of future teaching and research directions.

  7. 33 CFR 385.26 - Project Implementation Reports.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... available science; (iii) Comply with all applicable Federal, State, and Tribal laws; (iv) Contain sufficient... boundary of regional computer models or projects whose effects cannot be captured in regional computer...

  8. 33 CFR 385.26 - Project Implementation Reports.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... available science; (iii) Comply with all applicable Federal, State, and Tribal laws; (iv) Contain sufficient... boundary of regional computer models or projects whose effects cannot be captured in regional computer...

  9. 33 CFR 385.26 - Project Implementation Reports.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available science; (iii) Comply with all applicable Federal, State, and Tribal laws; (iv) Contain sufficient... boundary of regional computer models or projects whose effects cannot be captured in regional computer...

  10. 33 CFR 385.26 - Project Implementation Reports.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available science; (iii) Comply with all applicable Federal, State, and Tribal laws; (iv) Contain sufficient... boundary of regional computer models or projects whose effects cannot be captured in regional computer...

  11. The Effect of a Computer Program Designed with Constructivist Principles for College Non-Science Majors on Understanding of Photosynthesis and Cellular Respiration

    ERIC Educational Resources Information Center

    Wielard, Valerie Michelle

    2013-01-01

    The primary objective of this project was to learn what effect a computer program would have on academic achievement and attitude toward science of college students enrolled in a biology class for non-science majors. It became apparent that the instructor also had an effect on attitudes toward science. The researcher designed a computer program,…

  12. Teaching Bioinformatics in Concert

    PubMed Central

    Goodman, Anya L.; Dekhtyar, Alex

    2014-01-01

    Can biology students without programming skills solve problems that require computational solutions? They can if they learn to cooperate effectively with computer science students. The goal of the in-concert teaching approach is to introduce biology students to computational thinking by engaging them in collaborative projects structured around the software development process. Our approach emphasizes development of interdisciplinary communication and collaboration skills for both life science and computer science students. PMID:25411792

  13. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the imminent architectural aspects of Grid Computing for the e-Science community, covering scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed internet, safe authentication, and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute-intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems, and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it helps realize the dream of a global village for the benefit of the e-Science community across the globe.

  14. Assessing Motivations and Use of Online Citizen Science Astronomy Projects

    NASA Astrophysics Data System (ADS)

    Nona Bakerman, Maya; Buxner, Sanlyn; Bracey, Georgia; Gugliucci, Nicole

    2018-01-01

    The exponential proliferation of astronomy data has resulted in the need to develop new ways to analyze data. Recent efforts to engage the public in the discussion of the importance of science have led to projects that are aimed at letting them have hands-on experiences. Citizen science in astronomy, which has followed the model of citizen science in other scientific fields, has increased in the number and type of projects in the last few years and poses captivating ways to engage the public in science. The primary focus of this study was citizen science users' motivations and activities related to engaging in astronomy citizen science projects. We report on participants' interview responses related to their motivations, length and frequency of engagement, and reasons for leaving the project. From May to October 2014, 32 adults were interviewed to assess their motivations and experiences with citizen science. In particular, we looked at if and how motivations changed for those who have engaged in the projects, in order to develop support for and understand participants of citizen science. The predominant reasons participants took part in citizen science were: interest, helping, learning or teaching, and being part of science. Everyone interviewed demonstrated an intrinsic motivation to do citizen science projects. Participants' reasons for ending their engagement on any given day were: having to do other things, physical effects of the computer, a scheduled event that ended, limited attention span or tiredness, and computer or program issues. A small fraction of the participants also indicated experiencing negative feedback. Of the participants who no longer took part in citizen science projects, some indicated that receiving negative feedback was their primary reason and others reported the program to be frustrating. Our work is helping us to understand participants who engage in online citizen science projects so that researchers can better design projects to meet their needs and develop support materials and incentives to encourage more participation.

  15. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  16. Understanding and Improving Blind Students' Access to Visual Information in Computer Science Education

    NASA Astrophysics Data System (ADS)

    Baker, Catherine M.

    Teaching people with disabilities tech skills empowers them to create solutions to problems they encounter and prepares them for careers. However, computer science is typically taught in a highly visual manner which can present barriers for people who are blind. The goal of this dissertation is to understand and decrease those barriers. The first projects I present looked at the barriers that blind students face. I first present the results of my survey and interviews with blind students with degrees in computer science or related fields. This work highlighted the many barriers that these blind students faced. I then followed-up on one of the barriers mentioned, access to technology, by doing a preliminary accessibility evaluation of six popular integrated development environments (IDEs) and code editors. I found that half were unusable and all had some inaccessible portions. As access to visual information is a barrier in computer science education, I present three projects I have done to decrease this barrier. The first project is Tactile Graphics with a Voice (TGV). This project investigated an alternative to Braille labels for those who do not know Braille and showed that TGV was a potential alternative. The next project was StructJumper, which created a modified abstract syntax tree that blind programmers could use to navigate through code with their screen reader. The evaluation showed that users could navigate more quickly and easily determine the relationships of lines of code when they were using StructJumper compared to when they were not. Finally, I present a tool for dynamic graphs (the type with nodes and edges) which had two different modes for handling focus changes when moving between graphs. I found that the modes support different approaches for exploring the graphs and therefore preferences are mixed based on the user's preferred approach. However, both modes had similar accuracy in completing the tasks. These projects are a first step towards the goal of making computer science education more accessible to blind students. By identifying the barriers that exist and creating solutions to overcome them, we can support increasing the number of blind students in computer science.

  17. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  18. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  19. Alliance for Computational Science Collaboration: HBCU Partnership at Alabama A&M University Continuing High Performance Computing Research and Education at AAMU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Xiaoqing; Deng, Z. T.

    2009-11-10

    This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU has successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational science related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Lab (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for an electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU created great impact for minority students. As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, respectively, and presentations of these papers were made at those conferences. It is critical to continue these research and education activities.

  20. Student Science Training Program in Mathematics, Physics and Computer Science. Final Report to the National Science Foundation. Artificial Intelligence Memo No. 393.

    ERIC Educational Resources Information Center

    Abelson, Harold; diSessa, Andy

    During the summer of 1976, the MIT Artificial Intelligence Laboratory sponsored a Student Science Training Program in Mathematics, Physics, and Computer Science for high-ability secondary school students. This report describes, in some detail, the style of the program, the curriculum, and the projects the students undertook. It is hoped that this…

  1. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
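
    A minimal sketch (not from the USGS project itself) of the kind of task-parallel workload a Beowulf cluster handles well: each MPI rank processes its own share of raster tiles and rank 0 gathers the results. It assumes the mpi4py package and a working MPI installation; the tile analysis is a stand-in.

    ```python
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    n_tiles = 64                              # hypothetical number of raster tiles
    my_tiles = range(rank, n_tiles, size)     # round-robin distribution of work

    # Stand-in for a real per-tile analysis (e.g., mean elevation of a DEM tile).
    local_results = [float(np.random.default_rng(t).random(1000).mean())
                     for t in my_tiles]

    all_results = comm.gather(local_results, root=0)
    if rank == 0:
        flat = [r for chunk in all_results for r in chunk]
        print(f"processed {len(flat)} tiles on {size} ranks")
    ```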

  2. Developing the Next Generation of Science Data System Engineers

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Behnke, Jeanne; Durachka, Christopher D.

    2016-01-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path generally begins with undergraduates from disciplines such as Computer Engineering or the physical sciences serving on a development team, where they can work in depth on existing Goddard data systems or with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architecture experts, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multidiscipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  3. Developing the Next Generation of Science Data System Engineers

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Durachka, C. D.; Behnke, J.

    2015-12-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path generally begins with undergraduates from disciplines such as Computer Engineering or the physical sciences serving on a development team, where they can work in depth on existing Goddard data systems or with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architecture experts, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multi-discipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  4. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact on various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  5. The FIFE Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Box, D.; Boyd, J.; Di Benedetto, V.

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access, and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  6. Bringing Computational Thinking into the High School Science and Math Classroom

    NASA Astrophysics Data System (ADS)

    Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; Northwestern University CT-STEM Project; Northwestern University Center for Talent Development

    2013-01-01

    Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.
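
    A small, hypothetical example of the kind of computational-thinking exercise such lesson plans target (not taken from the CT-STEM materials): turning Kepler's third law into an algorithm and automating it for many planets.

    ```python
    import math

    AU = 1.496e11          # metres
    YEAR = 3.156e7         # seconds
    G_M_SUN = 1.327e20     # gravitational parameter of the Sun, m^3/s^2

    def orbital_period_years(semi_major_axis_au: float) -> float:
        """Period from Kepler's third law: T = 2*pi*sqrt(a^3 / GM)."""
        a = semi_major_axis_au * AU
        return 2 * math.pi * math.sqrt(a**3 / G_M_SUN) / YEAR

    for name, a_au in [("Mercury", 0.387), ("Earth", 1.0), ("Jupiter", 5.203)]:
        print(f"{name}: {orbital_period_years(a_au):.2f} years")
    ```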

  7. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Liao, Wei-keng

    Computational science applications have been described as having one of seven motifs (the "seven dwarfs"), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant of failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to help new users learn to use the library.
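
    For illustration only, here is the kind of multi-dimensional data model that netCDF (mentioned above) supports, using standard netCDF4-python calls; this is not the Damsel API, whose interface is documented at the project site.

    ```python
    from netCDF4 import Dataset
    import numpy as np

    with Dataset("example.nc", "w", format="NETCDF4") as nc:
        nc.createDimension("time", None)        # unlimited (record) dimension
        nc.createDimension("lat", 90)
        nc.createDimension("lon", 180)
        temp = nc.createVariable("temperature", "f4", ("time", "lat", "lon"))
        temp.units = "K"
        # write one time slice of a synthetic temperature field
        temp[0, :, :] = 280.0 + 10.0 * np.random.random((90, 180))
    ```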

  8. Monogamy relations of quantum entanglement for partially coherently superposed states

    NASA Astrophysics Data System (ADS)

    Shi, Xian

    2017-12-01

    Project partially supported by the National Key Research and Development Program of China (Grant No. 2016YFB1000902), the National Natural Science Foundation of China (Grant Nos. 61232015, 61472412, and 61621003), the Beijing Science and Technology Project (2016), the Tsinghua-Tencent-AMSS Joint Project (2016), and the Key Laboratory of Mathematics Mechanization Project: Quantum Computing and Quantum Information Processing.

  9. Computing and the social organization of academic work

    NASA Astrophysics Data System (ADS)

    Shields, Mark A.; Graves, William; Nyce, James M.

    1992-12-01

    This article discusses the academic computing movement during the 1980s. We focus on the Faculty Workstations Project at Brown University, where major computing initiatives were undertaken during the 1980s. Six departments are compared: chemistry, cognitive and linguistic sciences, geology, music, neural science, and sociology. We discuss the theoretical implications of our study for conceptualizing the relationship of computing to academic work.

  10. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission or national emergencies, or on broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  11. Computer Courseware Evaluations. A Series of Reports Compiled by the Clearinghouse Computer Technology Project.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton.

    This report reviews Apple computer courseware in business education, library skills, mathematics, science, special education, and word processing based on the curricular requirements of Alberta, Canada. It provides detailed evaluations of 23 authorized titles in business education (2), mathematics (20), and science (1); 3 of the math titles are…

  12. A Cross-Cultural Study of the Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Middle School Students' Science Knowledge and Argumentation Skills

    ERIC Educational Resources Information Center

    Hsu, P.-S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2016-01-01

    The purpose of this mixed-methods study was to explore how seventh graders in a suburban school in the United States and sixth graders in an urban school in Taiwan developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application (GOCAA). A total of 42…

  13. Connectionist Models and Parallelism in High Level Vision.

    DTIC Science & Technology

    1985-01-01

    Feldman, Jerome A. (Grant N00014-82-K-0193). Excerpts from the record: Computer science is just beginning to look seriously at parallel computation; it may turn out that… The program includes intermediate-level networks that compute more complex joints and ones that compute parallelograms in the image.

  14. Computation, Mathematics and Logistics Department Report for Fiscal Year 1978.

    DTIC Science & Technology

    1980-03-01

    Excerpts from the record: …storage technology. A reference library on these and related areas is now composed of two thousand documents. The most comprehensive tool available… at DTNSRDC on the CDC 6000 Computer System for a variety of applications including Navy logistics, library science, ocean science, and contract management… (Library Science) track technical documents on advanced ship design; University of Virginia at Charlottesville (Ocean Science) monitor research projects for…

  15. Computers in Bilingual Education: Project CIBE. Evaluation Section Report. OREA Reports.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Alvarez, Rosalyn

    This project provided 360 students at South Bronx High School (New York) with instruction in English as a Second Language (ESL); Native Language Arts (NLA); the bilingual content area subjects of mathematics, science, and social studies; and computer literacy. The goal of the project was to provide instructional and support services to…

  16. Computers in Bilingual Education, Project CIBE, 1987-88. Evaluation Section Report. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Alvarez, Rosalyn

    Computers in Bilingual Education (Project CIBE) was fully implemented at South Bronx High School in its fourth year of federal funding. During the 1987-88 school year, students received computer-assisted and classroom instruction in English as a Second Language (ESL), native language arts (NLA), social studies, mathematics, science, computer…

  17. Students Develop Real-World Web and Pervasive Computing Systems.

    ERIC Educational Resources Information Center

    Tappert, Charles C.

    In the academic year 2001-2002, Pace University (New York) Computer Science and Information Systems (CSIS) students developed real-world Web and pervasive computing systems for actual customers. This paper describes the general use of team projects in CSIS at Pace University, the real-world projects from this academic year, the benefits of…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  19. From Requirements to Code: Issues and Learning in IS Students' Systems Development Projects

    ERIC Educational Resources Information Center

    Scott, Elsje

    2008-01-01

    The Computing Curricula (2005) place Information Systems (IS) at the intersection of exact sciences (e.g. General Systems Theory), technology (e.g. Computer Science), and behavioral sciences (e.g. Sociology). This presents particular challenges for teaching and learning, as future IS professionals need to be equipped with a wide range of…

  20. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting, and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data, and collaboration is essential.
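
    As a generic illustration of XML metadata for a simulation run, in the spirit of the XML data representation and metadata harvesting mentioned above, the sketch below builds a tiny record with Python's standard library; the element names and values are hypothetical, not the eMinerals schema.

    ```python
    import xml.etree.ElementTree as ET

    run = ET.Element("simulation_run")
    ET.SubElement(run, "code").text = "example_code"     # placeholder code name
    ET.SubElement(run, "parameter", name="temperature", units="K").text = "300"
    ET.SubElement(run, "output_file").text = "results/run_0001.xml"

    print(ET.tostring(run, encoding="unicode"))
    ```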

  1. A Professional Development Project for Improving the Use of Information and Communication Technologies in Science Teaching

    ERIC Educational Resources Information Center

    Lavonen, Jari; Juuti, Kalle; Aksela, Maija; Meisalo, Veijo

    2006-01-01

    This article describes a professional development project aiming to develop practical approaches for the integration of information and communication technologies (ICT) into science education. Altogether, 13 two-day face-to-face seminars and numerous computer network conferences were held during a three-year period. The goals for the project were…

  2. Balancing Formative and Summative Science Assessment Practices: Year One of the GenScope Assessment Project.

    ERIC Educational Resources Information Center

    Hickey, Daniel T.; Kruger, Ann Cale; Fredrick, Laura D.; Schafer, Nancy Jo; Kindfield, Ann C. H.

    This paper describes the GenScope Assessment Project, a project that is exploring ways of using multimedia computers to teach complex science content, refining sociocultural views of assessment and motivation, and considering different ways of reconciling the differences between these newer views and prior behavioral and cognitive views. The…

  3. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization, and analysis--coupled with the activities necessary to engage vendors in addressing DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  4. Bioinformatics in high school biology curricula: a study of state science standards.

    PubMed

    Wefer, Stephen H; Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students.

  5. Bioinformatics in High School Biology Curricula: A Study of State Science Standards

    PubMed Central

    Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students. PMID:18316818

  6. Using Pedagogical Tools to Help Hispanics be Successful in Computer Science

    NASA Astrophysics Data System (ADS)

    Irish, Rodger

    Irish, Rodger. Using Pedagogical Tools to Help Hispanics Be Successful in Computer Science. Master of Science (MS), July 2017, 68 pp., 4 tables, 2 figures, 48 references. Computer science (CS) jobs are a growing field that pays a living wage, but Hispanics are underrepresented in the field. This project gives an overview of several factors contributing to this problem and then explores possible solutions, examining how a combination of tools (teaching methods) can create the best possible outcome. The author's belief is that this approach can help Hispanic students succeed and fill needed jobs in the CS field. The project then tests this hypothesis, discussing the tools used to measure progress in both the affective and the cognitive domains, how the decision to run a Computer Club was reached, and the results of the research. The conclusion summarizes the results and outlines future research that remains to be done.

  7. Infrastructure for Training and Partnerships: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating between UNIX and NT file systems and scaling to 6.7 TB; 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software; 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media, in addition to a 622 Mb/s Internet2 connection; 4) network switches capable of 100 Mbps to 128 desktop workstations; 5) the Portable Batch System (PBS) computational task scheduler; and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  8. Collaborative Visualization Project: shared-technology learning environments for science learning

    NASA Astrophysics Data System (ADS)

    Pea, Roy D.; Gomez, Louis M.

    1993-01-01

    Project-enhanced science learning (PESL) provides students with opportunities for `cognitive apprenticeships' in authentic scientific inquiry using computers for data collection and analysis. Student teams work on projects with teacher guidance to develop and apply their understanding of science concepts and skills. We are applying advanced computing and communications technologies to augment and transform PESL at-a-distance (beyond the boundaries of the individual school), which today is limited to asynchronous, text-only networking and unsuitable for collaborative science learning involving shared access to multimedia resources such as data, graphs, tables, pictures, and audio-video communication. Our work creates user technology (a Collaborative Science Workbench providing PESL design support and shared synchronous document views, program, and data access; a Science Learning Resource Directory for easy access to resources including two-way video links to collaborators, mentors, museum exhibits, and media-rich resources such as scientific visualization graphics) and refines enabling technologies (audiovisual and shared-data telephony, networking) for this PESL niche. We characterize participation scenarios for using these resources and we discuss national networked access to science education expertise.

  9. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  10. Virtual University of Applied Sciences--German Flagship Project in the Field of E-Learning in Higher Education.

    ERIC Educational Resources Information Center

    Granow, Rolf; Bischoff, Michael

    In 1997, the German Federal Ministry of Education and Research started an initiative to promote e-learning in Germany by installing an extensive research program. The Virtual University of Applied Sciences in Engineering, Computer Science and Economic Engineering is the most prominent and best-funded of the more than 100 projects in the field…

  11. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Shawn A.

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code that are relevant to the convergence of simulated quantities with increasing Monte Carlo particle counts.
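
    A minimal sketch (not Mercury itself) of the convergence behaviour under study: the statistical error of a Monte Carlo tally shrinks roughly as 1/sqrt(N) as the particle count N increases. Here the tally is the probability that an exponentially distributed free path is shorter than one mean free path, whose exact value is 1 - e^{-1}.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_value = 1.0 - np.exp(-1.0)   # P(path < 1 mean free path) for Exp(1)

    for n_particles in (10**3, 10**4, 10**5, 10**6):
        paths = rng.exponential(scale=1.0, size=n_particles)
        estimate = np.mean(paths < 1.0)
        print(f"N={n_particles:>8}: estimate={estimate:.5f}, "
              f"error={abs(estimate - true_value):.5f}")
    ```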

  12. European Scientific Notes. Volume 35, Number 5,

    DTIC Science & Technology

    1981-05-31

    Excerpts from the record (text garbled in the source): Mr. Y.S. Wu, Information Systems, ESN 35-5 (1981), Computer Science… Levrat himself is a fascinating… who took his doctorate at the University of… The Computer Science Department reports on a project in computer graphics; text processing by computer has… for purposes of teaching and research… Studies of offshore winds and lighter support structures, …water batteries, alkaline batteries, lead-acid systems, and metal/air batteries will be carried out before…

  13. 78 FR 49781 - Notice of Intent To Seek Approval To Establish an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-15

    ... computer and information science and engineering. Awardees will be required to submit annual project... through the use of automated collection techniques or other forms of information technology. DATES...: Title of Collection: Computer and Information Science and Engineering Reporting Requirements. OMB Number...

  14. The Southern Forest Futures Project: summary report

    Treesearch

    David N. Wear; John G. Greis

    2012-01-01

    The Southern Forest Futures Project provides a science-based “futuring” analysis of the forests of the 13 States of the Southeastern United States. With findings organized in a set of scenarios and using a combination of computer models and science synthesis, the authors of the Southern Forest Futures Project examine a variety of possible futures that could shape...

  15. Final Report of the Computer Assisted Learning Test Project. Report No. 19.

    ERIC Educational Resources Information Center

    Van der Drift, K. D.; And Others

    A pilot project was conducted to gain information to advise the Board of Directors at the University of Leyden as to the feasibility of using a computerized system to aid in instructional programs in the social sciences, law, medicine, arts, mathematics, and natural sciences at a low cost. The pilot project is divided into four parts which are…

  16. Status Report: Mathematics Curriculum-Development Projects Today

    ERIC Educational Resources Information Center

    Arithmetic Teacher, 1972

    1972-01-01

    Brief reports on the Cambridge Conference on School Mathematics, Comprehensive School Mathematics Program, Computer-Assisted Instruction Projects at Stanford, Individually Prescribed Instruction Project, The Madison Project, Mathematics/Science Learning System, MINNEMAST, and School Mathematics Study Group. (MM)

  17. Promoting Interests in Atmospheric Science at a Liberal Arts Institution

    NASA Astrophysics Data System (ADS)

    Roussev, S.; Sherengos, P. M.; Limpasuvan, V.; Xue, M.

    2007-12-01

    Coastal Carolina University (CCU) students in Computer Science participated in a project to set up an operational weather forecast for the local community. The project involved the construction of two computing clusters and the automation of daily forecasting. Funded by NSF-MRI, two high-performance clusters were successfully established to run the University of Oklahoma's Advanced Regional Prediction System (ARPS). Daily weather predictions are made over South Carolina and North Carolina at 3-km horizontal resolution (roughly 1.9 miles) using initial and boundary condition data provided by UNIDATA. At this high resolution, the model is cloud-resolving, thus providing a detailed picture of heavy thunderstorms and precipitation. Forecast results are displayed on CCU's website (https://marc.coastal.edu/HPC) to complement observations at the National Weather Service in Wilmington, N.C. Present efforts include providing forecasts at 1-km resolution (or finer), comparisons with other models such as the Weather Research and Forecasting (WRF) model, and the examination of local phenomena (such as waterspouts and tornadoes). Through these activities the students learn about shell scripting, cluster operating systems, and web design. More importantly, students are introduced to Atmospheric Science, the processes involved in making weather forecasts, and the interpretation of their forecasts. Simulations generated by the forecasts will be integrated into the contents of CCU courses such as Fluid Dynamics, Atmospheric Sciences, Atmospheric Physics, and Remote Sensing. Operated jointly by the departments of Applied Physics and Computer Science, the clusters are expected to be used by CCU faculty and students for future research and inquiry-based projects in Computer Science, Applied Physics, and Marine Science.
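
    A hedged sketch of the kind of daily-forecast automation described above: fetch the day's initial-condition file, then launch the model run. The URL, file names, and run command are placeholders, not the actual CCU/ARPS setup.

    ```python
    import datetime
    import subprocess
    import urllib.request
    from pathlib import Path

    today = datetime.date.today().strftime("%Y%m%d")
    ic_url = f"https://example.edu/unidata/gfs_{today}_00z.grib2"   # placeholder URL
    ic_file = Path("input") / f"gfs_{today}_00z.grib2"

    ic_file.parent.mkdir(exist_ok=True)
    urllib.request.urlretrieve(ic_url, ic_file)                # download initial conditions
    subprocess.run(["./run_arps.sh", str(ic_file), today],     # hypothetical launch script
                   check=True)
    ```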

  18. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    This report introduces publications that present the results of a project that aimed to design a computational framework enabling computational experimentation at scale while supporting the model of "submit locally, compute globally". The project focused on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during runs.

  19. 78 FR 42976 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Heterogeneous...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... Computer Science and Engineering, Seoul, REPUBLIC OF KOREA; Missouri University of Science and Technology, Rolla, MO; Industrial Technology Research Institute of Taiwan, Chutung, Hsinchu, TAIWAN, Northeastern... activity of the group research project. Membership in this group research project remains open, and HSA...

  20. Project Solo; Newsletter Number Seven.

    ERIC Educational Resources Information Center

    Pittsburgh Univ., PA. Project Solo.

    The current curriculum modules under development at Project Solo are listed. The modules are grouped under the subject matter that they are designed to teach--algebra II, biology, calculus, chemistry, computer science, 12th grade math, physics, social science. Special programs written for use on the Hewlett-Packard Plotter are listed that may be…

  1. Student Sensemaking with Science Diagrams in a Computer-Based Setting

    ERIC Educational Resources Information Center

    Furberg, Anniken; Kluge, Anders; Ludvigsen, Sten

    2013-01-01

    This paper reports on a study of students' conceptual sensemaking with science diagrams within a computer-based learning environment aimed at supporting collaborative learning. Through the microanalysis of students' interactions in a project about energy and heat transfer, we demonstrate "how" representations become productive social and cognitive…

  2. Computer Networking Strategies for Building Collaboration among Science Educators.

    ERIC Educational Resources Information Center

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  3. The grand challenge of managing the petascale facility.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R. J.; Mathematics and Computer Science

    2007-02-28

    This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure. The appendices of the report document current and projected DOE computation facilities, science trends, and technology trends, whose combined impact can affect the manageability and stewardship of DOE's petascale facilities. This report is not meant to be all-inclusive. Rather, the facilities, science projects, and research topics presented are to be considered examples to clarify a point.

  4. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  5. Lattice QCD Application Development within the US DOE Exascale Computing Project

    NASA Astrophysics Data System (ADS)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  6. Lattice QCD Application Development within the US DOE Exascale Computing Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard; Christ, Norman; DeTar, Carleton

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  7. Year One of Project Pulse: Pupils Using Laptops in Science and English. A Final Report. Technical Report No. 26.

    ERIC Educational Resources Information Center

    McMillan, Katie; Honey, Margaret

    A year-long study was conducted with a class of 25 eighth graders, their English and science teachers, and the school computer supervisor at a school in Roselle (New Jersey). The structure and goals of the project, called PULSE, for Pupils Using Laptops in Science and English, are described. Research questions focused on the development of…

  8. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a big role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Laboratory, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).

  9. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  10. Environment and health: Probes and sensors for environment digital control

    NASA Astrophysics Data System (ADS)

    Schettini, Chiara

    2014-05-01

    The idea of studying the environment using new technologies (NT) came from a MIUR (Ministry of Education of the Italian Government) notice that allocated funds for the realization of innovative school science projects. The "Environment and Health" project uses probes and sensors for digital monitoring of the environment (water, air, and soil). The working group was composed of four science teachers from Liceo Statale G. Mazzini, under the coordination of teacher Chiara Schettini. The Didactic Section of Naples City of Sciences helped the teachers develop the project and organized a refresher course for them on the use of digital control sensors. The project connects environment and technology because the study of natural phenomena and the analysis of chemical-physical parameters give students and teachers skills for studying the environment based on computer-assisted data processing. During the practical phase of the project, samples of air, water, and soil were gathered in different contexts. Sample analysis was done in the school's scientific laboratory with digitally controlled sensors. The data were processed with specific software, and the results were published in a booklet and in a computer database. During the first year, the project involved six school classes (students aged 14-15 years), under the coordination of their science teachers. The project aims are: 1) making students more aware of environmental matters; 2) achieving basic skills for evaluating air, water, and soil quality; 3) achieving strong skills in the use of digitally controlled sensors; and 4) achieving computing skills for processing and presenting data. The project seeks to develop a broad environmental awareness and an appreciation of the need for a 'good' environment to protect our health, and to reinforce the role of NT as an instrument of knowledge.

  11. Science Teacher Efficacy and Extrinsic Factors Toward Professional Development Using Video Games in a Design-Based Research Model: The Next Generation of STEM Learning

    NASA Astrophysics Data System (ADS)

    Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu

    2013-02-01

    Design-based research principles guided the study of 51 secondary science teachers in the second year of a three-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors related to technology and gaming (General Computer Use, Science Learning, Inquiry Teaching and Learning, Synchronous Chat/Text, and Playing Video Games), using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an analysis of variance and serial coding of teacher reflective responses. Results indicated that participants who used computers daily had higher efficacy when using inquiry-based teaching methods and in science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, in which overlapping circles of the three components indicate development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology into teaching practices.

  12. Ten quick tips for machine learning in computational biology.

    PubMed

    Chicco, Davide

    2017-01-01

    Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and can therefore follow incorrect practices that lead to common mistakes or over-optimistic results. With this review, we present ten quick tips for taking advantage of machine learning in any computational biology context by avoiding some common errors that we have observed hundreds of times across multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner carry out a successful project in computational biology and related sciences.
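
    One concrete instance of the kind of practice such tips recommend (hedged; the specific tips are in the paper): report held-out cross-validation performance rather than training-set accuracy, which is typically over-optimistic. The sketch below uses scikit-learn with synthetic data.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, n_features=40, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)

    train_acc = clf.fit(X, y).score(X, y)                 # optimistic estimate
    cv_acc = cross_val_score(clf, X, y, cv=5).mean()      # more honest estimate
    print(f"training accuracy: {train_acc:.3f}, 5-fold CV accuracy: {cv_acc:.3f}")
    ```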

  13. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold as the increased capacity of instruments and the increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves digital capture of data direct from instruments, processing it by computers, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is creating new opportunities for an increasingly large number of researchers to process high volumes of data. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These new technology developments require that scientists become more skilled in data management and/or have a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals as well as knowledge of ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and are not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains, they are likely to migrate to either a science role or an ICT role as their career advances. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  14. Programs for attracting under-represented minority students to graduate school and research careers in computational science. Final report for period October 1, 1995 - September 30, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, James C. Jr.; Mason, Thomas; Guerrieri, Bruno

    1997-10-01

    Programs have been established at Florida A & M University to attract minority students to research careers in mathematics and computational science. The primary goal of the program was to increase the number of such students studying computational science via an interactive multimedia learning environment. One mechanism used for meeting this goal was the development of educational modules. This academic-year program, established within the mathematics department at Florida A&M University, introduced students to computational science projects using high-performance computers. Additional activities were conducted during the summer; these included workshops, meetings, and lectures. Through the exposure this program provided to scientific ideas and research in computational science, it is expected that students will go on to apply tools from this interdisciplinary field successfully.

  15. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents biology, chemistry, physics, and health activities, experiments, demonstrations, and computer programs. Includes mechanism of stomatal opening, using aquatic plants to help demonstrate chemical buffering, microbial activity/contamination in milk samples, computer computation of fitness scores, reservoir project, complexes of transition…

  16. Jennifer Southerland | NREL

    Science.gov Websites

    Jennifer Southerland is a Professional II-Project Assistant with the Computational Science Center at NREL (Jennifer.Southerland@nrel.gov | 303-275-4065).

  17. The Human Genome Project: Biology, Computers, and Privacy.

    ERIC Educational Resources Information Center

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  18. Computing methods for icosahedral and symmetry-mismatch reconstruction of viruses by cryo-electron microscopy

    NASA Astrophysics Data System (ADS)

    Zhu, Bin; Cheng, Lingpeng; Liu, Hongrong

    2018-05-01

    Project supported by the National Key R&D Program of China (Grant No. 2016YFA0501100), the National Natural Science Foundation of China (Grant Nos. 91530321, 31570742, and 31570727), and the Science and Technology Planning Project of Hunan Province, China (Grant No. 2017RS3033).

  19. MLeXAI: A Project-Based Application-Oriented Model

    ERIC Educational Resources Information Center

    Russell, Ingrid; Markov, Zdravko; Neller, Todd; Coleman, Susan

    2010-01-01

    Our approach to teaching introductory artificial intelligence (AI) unifies its diverse core topics through a theme of machine learning, and emphasizes how AI relates more broadly with computer science. Our work, funded by a grant from the National Science Foundation, involves the development, implementation, and testing of a suite of projects that…

  20. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  1. Design Principles for "Thriving in Our Digital World": A High School Computer Science Course

    ERIC Educational Resources Information Center

    Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory

    2016-01-01

    "Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

  2. Finding the Hook: Computer Science Education in Elementary Contexts

    ERIC Educational Resources Information Center

    Ozturk, Zehra; Dooley, Caitlin McMunn; Welch, Meghan

    2018-01-01

    The purpose of this study was to investigate how elementary teachers with little knowledge of computer science (CS) and project-based learning (PBL) experienced integrating CS through PBL as a part of a standards-based elementary curriculum in Grades 3-5. The researchers used qualitative constant comparison methods on field notes and reflections…

  3. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    ERIC Educational Resources Information Center

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  4. NASA high performance computing and communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1993-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long-term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects as well as summaries of individual research and development programs within each project.

  5. Inquiring Minds

    Science.gov Websites

    Fermilab "Inquiring Minds" web page; its navigation covers proposed projects and experiments, the Tevatron, questions for the universe, theory, computing (high-performance computing, grid computing, networking, mass storage), plans for the future, the state of the laboratory, homeland security, industry, computing sciences workforce development, and historic results.

  6. Case Study on the Use of Microcomputers in Primary Schools in Bar-le-Duc (France).

    ERIC Educational Resources Information Center

    Dieschbourg, Robert

    1988-01-01

    Examines a project which involves the introduction of computer science into elementary schools to create an awareness of data processing as an intellectual, technological, and socio-cultural phenomenon. Concludes that the early computer experience and group work involved in the project enhances student social and psychological development. (GEA)

  7. Laboratory-directed research and development: FY 1996 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1997-05-01

    This report summarizes the FY 1996 goals and accomplishments of Laboratory-Directed Research and Development (LDRD) projects. It gives an overview of the LDRD program, summarizes work done on individual research projects, and provides an index to the projects' principal investigators. Projects are grouped by their LDRD component: Individual Projects, Competency Development, and Program Development. Within each component, they are further divided into nine technical disciplines: (1) materials science, (2) engineering and base technologies, (3) plasmas, fluids, and particle beams, (4) chemistry, (5) mathematics and computational sciences, (6) atomic and molecular physics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) biosciences.

  8. Enlist micros: Training science teachers to use microcomputers

    NASA Astrophysics Data System (ADS)

    Baird, William E.; Ellis, James D.; Kuerbis, Paul J.

    A National Science Foundation grant to the Biological Sciences Curriculum Study (BSCS) at The Colorado College supported the design and production of training materials to encourage literacy of science teachers in the use of microcomputers. ENLIST Micros is based on results of a national needs assessment that identified 22 competencies needed by K-12 science teachers to use microcomputers for instruction. A writing team developed the 16-hour training program in the summer of 1985, and field-test coordinators tested it with 18 preservice or in-service groups during the 1985-86 academic year at 15 sites within the United States. The training materials consist of video programs, interactive computer disks for the Apple II series microcomputer, a training manual for participants, and a guide for the group leader. The experimental materials address major areas of educational computing: awareness, applications, implementation, evaluation, and resources. Each chapter contains activities developed for this program, such as viewing video segments of science teachers who are using computers effectively and running commercial science and training courseware. Role playing and small-group interaction help the teachers overcome their reluctance to use computers and plan for effective implementation of microcomputers in the school. This study examines the implementation of educational computing among 47 science teachers who completed the ENLIST Micros training at a southern university. We present results of the formative evaluation for that site. Results indicate that both elementary and secondary teachers benefit from the training program and demonstrate gains in attitudes toward computer use. Participating teachers said that the program met its stated objectives and helped them obtain needed skills. Only 33 percent of these teachers, however, reported using computers one year after the training. In June 1986, the BSCS initiated a follow-up to the ENLIST Micros curriculum to develop, evaluate, and disseminate a complete model of teacher enhancement for educational computing in the sciences. In that project, we use the ENLIST Micros curriculum as the first step in a training process. The project includes seminars that introduce additional skills; it contains provisions for sharing among participants, monitors the use of computers in participants' classrooms, provides structured coaching of participants' use of computers in their classrooms, and offers planned observations of peers using computers in their science teaching.

  9. Development of an Integrated, Computer-Based, Bibliographical Data System for a Large University Library. Annual Report to the National Science Foundation from the University of Chicago Library, 1967-68.

    ERIC Educational Resources Information Center

    Fussler, Herman H.; Payne, Charles T.

    The project's second year (1967/68) was devoted to upgrading the computer operating software and programs to increase versatility and reliability. General conclusions about the program after 24 months of operation are that the project's objectives are sound and that effective utilization of computer-aided bibliographic data processing is essential…

  10. Teaching Science with Web-Based Inquiry Projects: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Webb, Aubree M.; Knight, Stephanie L.; Wu, X. Ben; Schielack, Jane F.

    2014-01-01

    The purpose of this research is to explore a new computer-based interactive learning approach to assess the impact on student learning and attitudes toward science in a large university ecology classroom. A comparison was done with an established program to measure the relative impact of the new approach. The first inquiry project, BearCam, gives…

  11. 1999 LDRD Laboratory Directed Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rita Spencer; Kyle Wheeler

    This is the FY 1999 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  12. Laboratory Directed Research and Development FY 1998 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Vigil; Kyle Wheeler

    This is the FY 1998 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  13. Laboratory directed research and development: FY 1997 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1998-05-01

    This is the FY 1997 Progress Report for the Laboratory Directed Research and Development (LDRD) program at Los Alamos National Laboratory. It gives an overview of the LDRD program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic and molecular physics and plasmas, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  14. A Structured Professional Development Approach to Unit Study: The Experiences of 200 Teachers in a National Teacher Development Project.

    ERIC Educational Resources Information Center

    McColskey, Wendy; Parke, Helen; Furtak, Erin; Butler, Susan

    This article addresses what was learned through the National Computational Science Leadership Program about involving teachers in planning high quality units of instruction around computational science investigations. Two cohorts of roughly 25 teacher teams nationwide were given opportunities to develop "replacement units." The goal was to support…

  15. A Report on the Design and Construction of the University of Massachusetts Computer Science Center.

    ERIC Educational Resources Information Center

    Massachusetts State Office of the Inspector General, Boston.

    This report describes a review conducted by the Massachusetts Office of the Inspector General on the construction of the Computer Science and Development Center at the University of Massachusetts, Amherst. The office initiated the review after hearing concerns about the management of the project, including its delayed completion and substantial…

  16. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  17. A School-College Consultation Model for Integration of Technology and Whole Language in Elementary Science Instruction. Field Study Report No. 1991.A.BAL, Christopher Columbus Consortium Project.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A study examined a new collaborative consultation process to enhance the classroom implementation of whole language science units that make use of computers and multimedia resources. The overall program was divided into three projects, two at the fifth-grade level and one at the third grade level. Each project was staffed by a team of one…

  18. Trinity to Trinity 1945-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moniz, Ernest; Carr, Alan; Bethe, Hans

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer, Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today's advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  19. Trinity to Trinity 1945-2015

    ScienceCinema

    Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John

    2018-01-16

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today’s advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  20. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy's Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy's supercomputing resources and the Internet. Our project's success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.

  1. Study of Local Radon Occurrence as an Interdisciplinary Undergraduate Research Project.

    ERIC Educational Resources Information Center

    Purdom, William Berlin; And Others

    1990-01-01

    Described is an undergraduate interdisciplinary project encompassing physics, computer science, and geology and involving a number of students from several academic departments. The project used the topic of the occurrence of in-home radon. Student projects, radon sampling, and results are discussed. (CW)

  2. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage and management requirements and, above all, power demands. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a Greener approach to the Information and Communications Technologies (ICT) adopted for the data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs, improving system power monitoring, and managing the generation and distribution of electricity at the system level are paramount to avoiding future inefficiencies and higher costs and to enabling fulfilment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on Green ICT for science and Smart power monitoring and control.

  3. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath

    PubMed Central

    Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.

    2009-01-01

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201

  4. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath.

    PubMed

    Ellisman, M; Hutton, T; Kirkland, A; Lin, A; Lin, C; Molina, T; Peltier, S; Singh, R; Tang, K; Trefethen, A E; Wallom, D C H; Xiong, X

    2009-07-13

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients.

  5. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    NASA Astrophysics Data System (ADS)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises project and research activities of students in the areas of computer science, physics, engineering and biology, drawing on acquired experience in these fields. Pupils regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013, A. Makarychev received the "Small Nobel Prize" in the Computer Science section and a special sponsors' award from the company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have led to invention patents and registration certificates with ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering and medicine.

  6. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating the logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
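    The overlap of computation and communication mentioned in the record above is a general scaling pattern rather than something specific to Fortran coarrays. The sketch below is a rough, hypothetical illustration only: it is not ECMWF's IFS code, and it substitutes mpi4py non-blocking messages for Fortran 2008 coarrays; the array size, neighbour layout and update rule are arbitrary assumptions made for the example.

```python
# Illustrative sketch (not ECMWF's code): overlap a 1-D halo exchange with
# interior computation using non-blocking MPI via mpi4py.
# Run with, e.g.: mpirun -n 4 python overlap_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
left = (rank - 1) % size            # periodic neighbour ranks (assumption)
right = (rank + 1) % size

n = 1024                            # local interior points (assumption)
field = np.full(n + 2, float(rank)) # one ghost cell on each side
send_left = np.array([field[1]])
send_right = np.array([field[n]])
recv_left = np.empty(1)
recv_right = np.empty(1)

# Start the non-blocking halo exchange.
reqs = [
    comm.Isend(send_left, dest=left, tag=0),
    comm.Isend(send_right, dest=right, tag=1),
    comm.Irecv(recv_left, source=left, tag=1),
    comm.Irecv(recv_right, source=right, tag=0),
]

# Interior smoothing proceeds while the messages are in flight (the overlap).
field[2:n] = 0.5 * (field[1:n - 1] + field[3:n + 1])

# Wait for the exchange, then update the points that depend on the halo.
MPI.Request.Waitall(reqs)
field[0], field[n + 1] = recv_left[0], recv_right[0]
field[1] = 0.5 * (field[0] + field[2])
field[n] = 0.5 * (field[n - 1] + field[n + 1])

if rank == 0:
    print("halo exchange and interior update completed on", size, "ranks")
```

    The design point the record alludes to is the same regardless of language: by posting the communication first and doing independent interior work before waiting, the cost of the exchange is hidden behind computation instead of adding to it.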

  7. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    NASA Astrophysics Data System (ADS)

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-06-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning, spiral learning and peer assessment. Namely, the course is articulated during a semester through the structured (progressive and incremental) development of a sequence of four projects, whose duration, scope and difficulty of management increase as the student gains theoretical and instrumental knowledge related to planning, monitoring and controlling projects. Moreover, the proposal is complemented using peer assessment. The proposal has already been implemented and validated for the last 3 years in two different universities. In the first year, project-based learning and spiral learning methods were combined. Such a combination was also employed in the other 2 years, but additionally students had the opportunity to assess projects developed by university partners and by students of the other university. A total of 154 students have participated in the study. We observe a gain in the quality of successive projects derived from the spiral project-based learning. Moreover, this gain is significantly larger when peer assessment is introduced. In addition, high-performing students take advantage of peer assessment from the first moment, whereas the improvement in lower-performing students is delayed.

  8. Developing Teachers' Computational Thinking Beliefs and Engineering Practices through Game Design and Robotics

    ERIC Educational Resources Information Center

    Leonard, Jacqueline; Barnes-Johnson, Joy; Mitchell, Monica; Unertl, Adrienne; Stubbe, Christopher R.; Ingraham, Latanya

    2017-01-01

    This research report presents the final year results of a three-year research project on computational thinking (CT). The project, funded by the National Science Foundation, involved training teachers in grades four through six to implement Scalable Game Design and LEGO® EV3 robotics during afterschool clubs. Thirty teachers and 531 students took…

  9. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    ERIC Educational Resources Information Center

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…

  10. Lecture and Tutorial via the Internet - Experiences from a Pilot Project Connecting Five Universities.

    ERIC Educational Resources Information Center

    Wulf, Volker; Schinzel, Britta

    This paper reports on a pilot project in which German universities in Freiburg, Constance, Mannheim, Stuttgart, and Ulm connected computer science departments via the Internet for a summer 1997 telelecture and teletutorial on computers and society. The first section provides background on telelearning and introduces the case study. The second…

  11. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  12. Computational Science at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  13. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, activities, field projects and computer programs in the biological and physical sciences. Instructional procedures, experimental designs, materials, and background information are suggested. Topics include fluid mechanics, electricity, crystals, arthropods, limpets, acid neutralization, and software evaluation. (ML)

  14. GES DISC Data Recipes in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs), which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals: one, transition data recipes created by the Goddard Earth Science Data and Information Service Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks; two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprising Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy-to-deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
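    The record above describes compartmentalizing data-recipe steps into notebook cells and packaging the dependencies in a container. As a minimal, hypothetical sketch of what one such compartmentalized recipe step might look like (the URL, file name, variable name, and lat/lon box below are placeholders, not actual GES DISC endpoints or products), each numbered step corresponds to what would be a separate notebook cell:

```python
# Hypothetical sketch of a "data recipe" step, not an actual GES DISC recipe.
# The URL, variable name, and region below are placeholders.
# Requires: requests, xarray, and a NetCDF backend such as netCDF4.
import requests
import xarray as xr

GRANULE_URL = "https://example.invalid/path/to/granule.nc4"  # placeholder
LOCAL_FILE = "granule.nc4"

# Step 1: obtain the data (one notebook cell; could equally be wget or curl).
response = requests.get(GRANULE_URL, timeout=60)
response.raise_for_status()
with open(LOCAL_FILE, "wb") as f:
    f.write(response.content)

# Step 2: parse and subset the data (a separate cell).
ds = xr.open_dataset(LOCAL_FILE)
var = ds["precipitation"]                                 # placeholder variable
region = var.sel(lat=slice(-10, 10), lon=slice(30, 60))   # placeholder box

# Step 3: reduce and report (a third cell).
print("Regional mean:", float(region.mean()))
```

    In the containerized workflow the record describes, libraries such as requests and xarray (plus a NetCDF backend) would presumably be baked into the Docker image so that the same notebook runs identically on a laptop or in the cloud.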

  15. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  16. Wide-angle display developments by computer graphics

    NASA Technical Reports Server (NTRS)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long been present, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions derive not from degrees in science, or only from a degree in graphic design, but from a history of computer graphics innovations, laying groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  17. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development. The 16 areas of research and development reported on are: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.

  19. Research 1970/1971: Annual Progress Report.

    ERIC Educational Resources Information Center

    Georgia Inst. of Tech., Atlanta. Science Information Research Center.

    The report presents a summary of science information research activities of the School of Information and Computer Science, Georgia Institute of Technology. Included are project reports on interrelated studies in science information, information processing and systems design, automata and systems theories, and semiotics and linguistics. Also…

  20. NASA High Performance Computing and Communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1994-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.

  1. Computer Technology-Integrated Projects Should not Supplant Craft Projects in Science Education

    NASA Astrophysics Data System (ADS)

    Klopp, Tabatha J.; Rule, Audrey C.; Suchsland Schneider, Jean; Boody, Robert M.

    2014-03-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy use is also helpful in understanding unfamiliar, complex science concepts. This study of 28 academically advanced elementary to middle-school students examined student work and perceptions during a science unit focused on four fossil organisms: crinoid, brachiopod, horn coral and trilobite. The study compared: (1) analogy-focused instruction to independent Internet research and (2) computer technology-rich products to crafts-based products. Findings indicate student products were more creative after analogy-based instruction and when made using technology. However, students expressed a strong desire to engage in additional craft work after making craft products and enjoyed making crafts more after analogy-focused instruction. Additionally, more science content was found in the craft products than the technology-rich products. Students expressed a particular liking for two of the fossil organisms because they had been modeled with crafts. The authors recommend that room should be retained for crafts in the science curriculum to model science concepts.

  2. Diminishing the Gap between University and High School Research Programs: Computational Physics

    ERIC Educational Resources Information Center

    Vondracek, Mark

    2007-01-01

    There are many schools (grades K-12) around the country that offer some sort of science research option for students to pursue. Often this option is a local science fair, where students do smaller projects that are then presented at poster sessions. Many times the top local projects can advance to some type of regional and, possibly, state science…

  3. The Use of Mobile Technologies in Project-Based Science: A Case Study

    ERIC Educational Resources Information Center

    Avraamidou, Lucy

    2013-01-01

    The main aim of this study was to examine how a group of elementary students perceived their engagement in a project-based science intervention investigating the water quality of a local lake. The students collaborated with a scientist to conduct various experiments and used handheld computers to collect and analyze data in order to examine the…

  4. A Meta-Analysis of National Research: Effects of Teaching Strategies on Student Achievement in Science in the United States

    ERIC Educational Resources Information Center

    Schroeder, Carolyn M.; Scott, Timothy P.; Tolson, Homer; Huang, Tse-Yang; Lee, Yi-Hsuan

    2007-01-01

    This project consisted of a meta-analysis of U.S. research published from 1980 to 2004 on the effect of specific science teaching strategies on student achievement. The six phases of the project included study acquisition, study coding, determination of intercoder objectivity, establishing criteria for inclusion of studies, computation of effect…

  5. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koziol, Quincey

    The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.

  6. Software Assurance Curriculum Project Volume 2: Undergraduate Course Outlines

    DTIC Science & Technology

    2010-08-01

    Contents include Acknowledgments, Abstract, "An Undergraduate Curriculum Focus on Software Assurance," "Computer Science I," and "Computer Science II." The course outlines address assurance approaches for building confidence that can be integrated into traditional software development and acquisition process models, security testing throughout the software development life cycle (SDLC), and the system development challenges posed by security and complexity, including security failures.

  7. Exploratory Research and Development Fund, FY 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-05-01

    The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.

  8. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  9. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having access to a large cyberinfrastructure in order to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: usually, uncertainty can be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and of research.

  10. Influence of Computer-Aided Assessment on Ways of Working with Mathematics

    ERIC Educational Resources Information Center

    Rønning, Frode

    2017-01-01

    This paper is based on an on-going project for modernizing the basic education in mathematics for engineers at the Norwegian University of Science and Technology. One of the components in the project is using a computer-aided assessment system (Maple T.A.) for handling students' weekly hand-ins. Successful completion of a certain number of problem…

  11. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    ERIC Educational Resources Information Center

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  12. Laboratory Directed Research and Development Annual Report for 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Pamela J.

    2012-04-09

    This report documents progress made on all LDRD-funded projects during fiscal year 2011. The following topics are discussed: (1) Advanced sensors and instrumentation; (2) Biological Sciences; (3) Chemistry; (4) Earth and space sciences; (5) Energy supply and use; (6) Engineering and manufacturing processes; (7) Materials science and technology; (8) Mathematics and computing sciences; (9) Nuclear science and engineering; and (10) Physics.

  13. SpaceScience@Home: Authentic Research Projects that Use Citizen Scientists

    NASA Astrophysics Data System (ADS)

    Méndez, B. J. H.

    2008-06-01

    In recent years, several space science research projects have enlisted the help of large numbers of non-professional volunteers, "citizen scientists", to aid in performing tasks that are critical to a project but require more person-time (or computing time) than a small professional research team can practically perform themselves. Examples of such projects include SETI@home, which uses time on volunteers' computers to process radio-telescope observations, looking for signals originating from extraterrestrial intelligences; Clickworkers, which asks volunteers to review images of the surface of Mars to identify craters; Spacewatch, which used volunteers to review astronomical telescopic images of the sky to identify streaks made by possible Near Earth Asteroids; and Stardust@home, which asks volunteers to review "focus movies" taken of the Stardust interstellar dust aerogel collector to search for possible impacts from interstellar dust particles. We shall describe these and other similar projects and discuss lessons learned from carrying out such projects, including the educational opportunities they create.

  14. Johnson Space Center Research and Technology 1997 Annual Report

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report highlights key projects and technologies at Johnson Space Center for 1997. The report focuses on the commercial potential of the projects and technologies and is arranged by CorpTech Major Products Groups. Emerging technologies in these major disciplines were summarized: solar system sciences, life sciences, technology transfer, computer sciences, space technology, and human support technology. These NASA advances have a range of potential commercial applications, from a school internet manager for networks to a liquid metal mirror for optical measurements.

  15. Topics in computational physics

    NASA Astrophysics Data System (ADS)

    Monville, Maura Edelweiss

    Computational Physics spans a broad range of applied fields extending beyond the border of traditional physics tracks. Demonstrated flexibility and capability to switch to a new project, and pick up the basics of the new field quickly, are among the essential requirements for a computational physicist. In line with the above mentioned prerequisites, my thesis described the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement, roughly equal to 18%, with respect to the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with a simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed in Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real world scenarios, for the purpose of determining the system's reliability, material discrimination power, and limitations.

  16. NASA/DoD Aerospace Knowledge Diffusion Research Project. Report Number 20. The Use of Selected Information Products and Services by U.S. Aerospace Engineers and Scientists: Results of Two Surveys.

    DTIC Science & Technology

    1994-02-01

    within and between organizations. The technical report has been defined etymologically, according to report content and method (U.S. Department of...). [The remainder of this record is survey-questionnaire residue listing the response categories: aeronautics; astronautics; engineering; geosciences; space sciences; mathematical and computer sciences; materials and chemistry; physics.]

  17. Georgia Computes! An Intervention in a US State, with Formal and Informal Education in a Policy Context

    ERIC Educational Resources Information Center

    Guzdial, Mark; Ericson, Barbara; Mcklin, Tom; Engelman, Shelly

    2014-01-01

    Georgia Computes! ("GaComputes") was a six-year (2006-2012) project to improve computing education across the state of Georgia in the United States, funded by the National Science Foundation. The goal of GaComputes was to broaden participation in computing and especially to engage more members of underrepresented groups which includes…

  18. S'COOL Provides Research Opportunities and Current Data for Today's Technological Classroom

    NASA Technical Reports Server (NTRS)

    Green, Carolyn J.; Chambers, Lin H.; Racel, Anne M.

    1999-01-01

    NASA's Students' Cloud Observations On-Line (S'COOL) project, a hands-on educational project, was an innovative idea conceived by the scientists in the Radiation Sciences Branch at NASA Langley Research Center, Hampton, Virginia, in 1996. It came about after a local teacher expressed the idea that she wanted her students to be involved in real-life science. S'COOL supports NASA's Clouds and the Earth's Radiant Energy System (CERES) instrument, which was launched on the Tropical Rainfall Measuring Mission (TRMM) in November 1997 as part of NASA's Earth Science Enterprise. With the S'COOL project, students observe clouds and related weather conditions, compute data, and note vital information while obtaining ground truth observations for the CERES instrument. The observations can then be used to help validate the CERES measurements, particularly detection of clear sky from space. In addition to meeting math, science and geography standards, students are engaged in using the computer to obtain, report and analyze current data, thus bringing modern technology into the realm of the classroom, a paradigm that demands our attention.

  19. Establishing the Basis for a CIS (Computer Information Systems) Undergraduate Program: On Seeking the Body of Knowledge

    ERIC Educational Resources Information Center

    Longenecker, Herbert E., Jr.; Babb, Jeffry; Waguespack, Leslie J.; Janicki, Thomas N.; Feinstein, David

    2015-01-01

    The evolution of computing education spans a spectrum from "computer science" ("CS") grounded in the theory of computing, to "information systems" ("IS"), grounded in the organizational application of data processing. This paper reports on a project focusing on a particular slice of that spectrum commonly…

  20. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned the reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science and technology and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.

  1. FY 1999 Laboratory Directed Research and Development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PJ Hughes

    2000-06-13

    A short synopsis of each project is given covering the following main areas of research and development: Atmospheric sciences; Biotechnology; Chemical and instrumentation analysis; Computer and information science; Design and manufacture engineering; Ecological science; Electronics and sensors; Experimental technology; Health protection and dosimetry; Hydrologic and geologic science; Marine sciences; Materials science; Nuclear science and engineering; Process science and engineering; Sociotechnical systems analysis; Statistics and applied mathematics; and Thermal and energy systems.

  2. Integrating an Intelligent Tutoring System for TAOs with Second Life

    DTIC Science & Technology

    2010-12-01

    SL) and interacts with a number of computer-controlled objects that take on the roles of the TAO's teammates. TAOs rely on the same mechanism to...projects that utilize both game and simulation technology for training. He joined Stottler Henke in the fall of 2000 and holds a Ph.D. in computer science...including implementing tutors in multiuser worlds. He has been at Stottler Henke since 2005 and has an MS in computer science from Stanford University

  3. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer

  4. A review of small canned computer programs for survey research and demographic analysis.

    PubMed

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, CENTS-AID II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manuals, computer requirements, and procedures for obtaining manuals and programs are provided. This information is intended to facilitate and encourage the use of the computer by data processors in developing countries.
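
    As a rough illustration of the kind of calculation such population projection packages perform (a deliberately simplified geometric projection in Python, not the FIVFIV/SINSIN or MINI-TAB methodology), a baseline projection can be written in a few lines:

        def project_population(base_population, annual_growth_rate, years):
            """Simple geometric projection: P(t) = P0 * (1 + r)**t."""
            return [round(base_population * (1 + annual_growth_rate) ** t)
                    for t in range(years + 1)]

        # Example: a population of 1,000,000 growing at 2.5% per year for 10 years
        print(project_population(1_000_000, 0.025, 10))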

  5. Dragonfly: strengthening programming skills by building a game engine from scratch

    NASA Astrophysics Data System (ADS)

    Claypool, Mark

    2013-06-01

    Computer game development has been shown to be an effective hook for motivating students to learn both introductory and advanced computer science topics. While games can be made from scratch, to simplify the programming required game development often uses game engines that handle complicated or frequently used components of the game. These game engines present the opportunity to strengthen programming skills and expose students to a range of fundamental computer science topics. While educational efforts have been effective in using game engines to improve computer science education, there have been no published papers describing and evaluating students building a game engine from scratch as part of their course work. This paper presents the Dragonfly-approach in which students build a fully functional game engine from scratch and make a game using their engine as part of a junior-level course. Details on the programming projects are presented, as well as an evaluation of the results from two offerings that used Dragonfly. Student performance on the projects as well as student assessments demonstrates the efficacy of having students build a game engine from scratch in strengthening their programming skills.
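
    A from-scratch engine of the kind described centers on a manager that runs a fixed-rate loop, updating and then drawing the objects it owns each frame. The Python sketch below is a generic illustration of that structure only; it is not the Dragonfly engine or its API.

        import time

        class GameObject:
            def update(self, dt):   # advance object state by dt seconds
                pass
            def draw(self):         # render the object (placeholder)
                pass

        class GameManager:
            """Minimal fixed-timestep game loop of the kind a from-scratch engine provides."""
            def __init__(self, frame_rate=30):
                self.objects = []
                self.frame_time = 1.0 / frame_rate

            def run(self, n_frames=90):
                for _ in range(n_frames):
                    start = time.perf_counter()
                    for obj in self.objects:
                        obj.update(self.frame_time)
                    for obj in self.objects:
                        obj.draw()
                    # Sleep off the remainder of the frame to hold a steady rate
                    elapsed = time.perf_counter() - start
                    time.sleep(max(0.0, self.frame_time - elapsed))

        GameManager().run(n_frames=3)   # runs an empty world for three frames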

  6. Fermilab | Science at Fermilab | Experiments & Projects | Intensity

    Science.gov Websites


  7. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.

  8. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  9. Approaches for Measuring the Management Effectiveness of Software Projects

    DTIC Science & Technology

    2008-04-01

    John S. Osmundson Research Assoc. Professor of...and Department of Computer Science Dean of Research ...caused otherwise good projects to grind to a halt." [RO]. Various other studies, researchers and practitioners report similar issues regarding the

  10. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  11. An exploration of gender participation patterns in science competitions

    NASA Astrophysics Data System (ADS)

    Arámbula Greenfield, Teresa

    This study investigated participation in a state-level science competition over most of its 35-year history. Issues examined included whether different gender patterns occurred with respect to entry rate, project topic (life science, physical science, earth science, and math), and project type (research or display). The study also examined to what extent the identified patterns reflected or contradicted nationwide patterns of girls' academic performance in science over roughly the same time period. It was found that although girls initially participated in the fair less frequently than boys, for the past 20 years their participation rate has been greater than that of boys. Examination of topic preferences over the years indicates that both girls and boys have traditionally favored life science; however, boys have been and continue to be more likely to prepare physical, earth, and math/computer science projects than girls. Another gender difference is that girls are generally less likely than boys to prepare projects based on experimental research as opposed to library research. The study provides some suggestions for teachers and teacher educators for addressing these disparities.

  12. Exploratory Research and Development Fund, FY 1990. Report on Lawrence Berkeley Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-05-01

    The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: Accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics and research medicine and radiation biophysics.

  13. CILogon-HA. Higher Assurance Federated Identities for DOE Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basney, James

    The CILogon-HA project extended the existing open source CILogon service (initially developed with funding from the National Science Foundation) to provide credentials at multiple levels of assurance to users of DOE facilities for collaborative science. CILogon translates mechanism and policy across higher education and grid trust federations, bridging from the InCommon identity federation (which federates university and DOE lab identities) to the Interoperable Global Trust Federation (which defines standards across the Worldwide LHC Computing Grid, the Open Science Grid, and other cyberinfrastructure). The CILogon-HA project expanded the CILogon service to support over 160 identity providers (including 6 DOE facilities) and 3 internationally accredited certification authorities. To provide continuity of operations upon the end of the CILogon-HA project period, project staff transitioned the CILogon service to operation by XSEDE.

  14. Optimization of knowledge-based systems and expert system building tools

    NASA Technical Reports Server (NTRS)

    Yasuda, Phyllis; Mckellar, Donald

    1993-01-01

    The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.

  15. Bayesian Research at the NASA Ames Research Center,Computational Sciences Division

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.

    2003-01-01

    NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology. The Computational Sciences Division has expanded rapidly as a result. In this article, I will give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these, and the other Division projects.

  16. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general form recursive equations, the methods of coding theory, pattern recognition and image processing), constitute the theoretical basis for developing problem-solving-oriented environments. As examples can be mentioned: a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, examples of completed projects include the field of physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of human red blood cell membrane), meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure, uniting computing clusters of scientific and educational institutions of the country and provides the scientific community with access to local and international computational resources, that is a strong support for computational science in Armenia.

  17. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
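
    As a sketch of the task-parallel style such clusters support (assuming mpi4py is installed; the image data and the per-tile "processing" step here are placeholders), a root process can scatter tiles of a dataset to the other nodes and gather the results:

        # Run with, e.g.:  mpiexec -n 4 python tile_mean.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        if rank == 0:
            image = np.random.rand(size * 256, 256)       # stand-in for a satellite image
            tiles = np.array_split(image, size, axis=0)   # one tile per process
        else:
            tiles = None

        tile = comm.scatter(tiles, root=0)                # distribute tiles
        local_result = float(tile.mean())                 # placeholder "processing" step
        results = comm.gather(local_result, root=0)       # collect per-tile results

        if rank == 0:
            print("per-tile means:", results)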

  18. Team Projects and Peer Evaluations

    ERIC Educational Resources Information Center

    Doyle, John Kevin; Meeker, Ralph D.

    2008-01-01

    The authors assign semester- or quarter-long team-based projects in several Computer Science and Finance courses. This paper reports on our experience in designing, managing, and evaluating such projects. In particular, we discuss the effects of team size and of various peer evaluation schemes on team performance and student learning. We report…

  19. Project CHAMP, 1985-1986. OEA Evaluation Report.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn. Office of Educational Assessment.

    The Chinese Achievement and Mastery program, Project CHAMP, was a bilingual (Chinese/English) project offered at three high schools in Manhattan. The major goals were to enable Chinese students of limited English proficiency (LEP) to learn English and to master content in mathematics, science, global history, computer mathematics, and native…

  20. Hands-On Astrophysics: Variable Stars in Math, Science, and Computer Education

    NASA Astrophysics Data System (ADS)

    Mattei, J. A.; Percy, J. R.

    1999-12-01

    Hands-On Astrophysics (HOA): Variable Stars in Math, Science, and Computer Education, is a project recently developed by the American Association of Variable Star Observers (AAVSO) with funds from the National Science Foundation. HOA uses the unique methods and the international database of the AAVSO to develop and integrate students' math and science skills through variable star observation and analysis. It can provide an understanding of basic astronomy concepts, as well as interdisciplinary connections. Most of all, it motivates the user by exposing them to the excitement of doing real science with real data. Project materials include: a database of 600,000 variable star observations; VSTAR (a data plotting and analysis program), and other user friendly software; 31 slides and 14 prints of five constellations; 45 variable star finder charts; an instructional videotape in three 15-minute segments; and a 560-page student's and teacher's manual. These materials support the National Standards for Science and Math education by directly involving the students in the scientific process. Hands-On Astrophysics is designed to be flexible. It is organized so that it can be used at many levels, in many contexts: for classroom use from high school to college level, or for individual projects. In addition, communication and support can be found through the AAVSO home page on the World Wide Web: http://www.aavso.org. The HOA materials can be ordered through this web site or from the AAVSO, 25 Birch Street Cambridge, MA 02138, USA. We gratefully acknowledge the education grant ESI-9154091 from the National Science Foundation which funded the development of this project.
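
    For illustration only, a period search of the kind taught with the HOA materials can be sketched with present-day Python tools; this uses astropy's Lomb-Scargle periodogram on synthetic data and is not part of the HOA or VSTAR software:

        import numpy as np
        from astropy.timeseries import LombScargle

        # Synthetic, unevenly sampled "observations" of a variable star with a 5.4-day period
        rng = np.random.default_rng(42)
        t = np.sort(rng.uniform(0, 200, 300))                  # observation times (days)
        mag = 9.0 + 0.3 * np.sin(2 * np.pi * t / 5.4) + rng.normal(0, 0.05, t.size)

        frequency, power = LombScargle(t, mag).autopower()
        best_period = 1.0 / frequency[np.argmax(power)]
        print(f"estimated period: {best_period:.2f} days")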

  1. Designing Citizen Science Projects in the Era of Mega-Information and Connected Activism

    NASA Astrophysics Data System (ADS)

    Pompea, S. M.

    2010-12-01

    The design of citizen science projects must take many factors into account in order to be successful. Currently, there are a wide variety of citizen science projects with different aims, audiences, reporting methods, and degrees of scientific rigor and usefulness. Projects function on local, national, and worldwide scales and range in time from limited campaigns to around the clock projects. For current and future projects, advanced cell phones and mobile computing allow an unprecedented degree of connectivity and data transfer. These advances will greatly influence the design of citizen science projects. An unprecedented amount of data is available for data mining by interested citizen scientists; how can projects take advantage of this? Finally, a variety of citizen scientist projects have social activism and change as part of their mission and goals. How can this be harnessed in a constructive and efficient way? The design of projects must also select the proper role for experts and novices, provide quality control, and must motivate users to encourage long-term involvement. Effective educational and instructional materials design can be used to design responsive and effective projects in a more highly connected age with access to very large amounts of information.

  2. Cloudbursting - Solving the 3-body problem

    NASA Astrophysics Data System (ADS)

    Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.

    2014-12-01

    Many science projects in the future will be accomplished through collaboration among 2 or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By being able to share computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be more efficiently applied and technical work could be conducted more effectively with less time spent moving data or waiting for computing resources to free up. Based on the work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in identifying the feasibility of, and the obstacles, both technical and managerial, to performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth Science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center and demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.
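
    In greatly simplified form, the scale-up/scale-down behavior described can be thought of as a policy that compares queued work to worker capacity; the thresholds and limits in this Python sketch are hypothetical and do not represent the actual JPL/LaRC/ARC implementation:

        def autoscale_decision(pending_jobs, jobs_per_worker=4, min_workers=1, max_workers=32):
            """Return how many workers the local cloud should run for the current queue depth."""
            desired = -(-pending_jobs // jobs_per_worker)     # ceiling division
            return max(min_workers, min(desired, max_workers))

        # Example: 37 queued imaging jobs arriving from a partner center
        print("scale to", autoscale_decision(pending_jobs=37), "workers")
        # A real deployment would follow this decision with calls to its cloud provisioning API.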

  3. QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations

    NASA Astrophysics Data System (ADS)

    Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas

    2008-10-01

    Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt, et al., Angew. Chem. Int. Ed. 47, 924 (2008). C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities. (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
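
    As a rough illustration of the Monte Carlo idea (a classical Metropolis sweep over a small Ising ring, not the quantum Monte Carlo method implemented in Spinhenge@home; the ring size, temperature, and step count are arbitrary choices):

        import math
        import random

        def metropolis_ising_ring(n_spins=20, temperature=1.5, n_steps=20_000, seed=0):
            """Classical Metropolis Monte Carlo for a 1-D Ising ring (J = 1)."""
            rng = random.Random(seed)
            spins = [rng.choice((-1, 1)) for _ in range(n_spins)]
            energy_sum = 0.0
            for _ in range(n_steps):
                i = rng.randrange(n_spins)
                left, right = spins[(i - 1) % n_spins], spins[(i + 1) % n_spins]
                # Energy change if spin i is flipped: dE = 2 * s_i * (s_left + s_right)
                d_energy = 2.0 * spins[i] * (left + right)
                if d_energy <= 0 or rng.random() < math.exp(-d_energy / temperature):
                    spins[i] = -spins[i]
                energy_sum += -sum(spins[j] * spins[(j + 1) % n_spins] for j in range(n_spins))
            return energy_sum / n_steps   # average energy over the sampled configurations

        print("mean energy:", metropolis_ising_ring())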

  4. Introducing Computational Thinking through Hands-on Projects Using R with Applications to Calculus, Probability and Data Analysis

    ERIC Educational Resources Information Center

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-01-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students, through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data…

  5. Association of Small Computer Users in Education (ASCUE) Summer Conference. Proceedings (25th, North Myrtle Beach, South Carolina, June 21-25, 1992).

    ERIC Educational Resources Information Center

    Association of Small Computer Users in Education, Greencastle, IN.

    Forty-three papers from a conference on microcomputers are presented under the following headings: Computing in the Curriculum; Information and Computer Science; Institutional and Administrative Computing; and Management, Services, and Training. Topics of the papers include the following: telecommunications projects that work in…

  6. The DEVELOP Program as a Unique Applied Science Internship

    NASA Astrophysics Data System (ADS)

    Skiles, J. W.; Schmidt, C. L.; Ruiz, M. L.; Cawthorn, J.

    2004-12-01

    The NASA mission includes "Inspiring the next generation of explorers" and "Understanding and protecting our home planet". DEVELOP students conduct research projects in Earth Systems Science, gaining valuable training and work experience, which support accomplishing this mission. This presentation will describe the DEVELOP Program, a NASA human capital development initiative, which is student run and student led with NASA scientists serving as mentors. DEVELOP began in 1998 at NASA's Langley Research Center in Virginia and expanded to NASA's Stennis Space Center in Mississippi and Marshall Space Flight Center in Alabama in 2002. NASA's Ames Research Center in California began DEVELOP activity in 2003. DEVELOP is a year round activity. High school through graduate school students participate in DEVELOP with students' backgrounds encompassing a wide variety of academic majors such as engineering, biology, physics, mathematics, computer science, remote sensing, geographic information systems, business, and geography. DEVELOP projects are initiated when county, state, or tribal governments submit a proposal requesting students work on local projects. When a project is selected, science mentors guide students in the application of NASA applied science and technology to enhance decision support tools for customers. Partnerships are established with customers, professional organizations and state and federal agencies in order to leverage resources needed to complete research projects. Student teams are assigned a project and are responsible for creating an inclusive project plan beginning with the design and approach of the study, the timeline, and the deliverables for the customer. Project results can consist of student papers, both team and individually written, face-to-face meetings and seminars with customers, presentations at national meetings in the form of posters and oral papers, displays at the Western and Southern Governors' Associations, and visualizations produced by the students. Projects have included Homeland Security in Virginia, Energy Management in New Mexico, Water Management in Mississippi, Air Quality Management in Alabama, Invasive Species mapping in Nevada, Public Health risk assessment in California, Disaster Management in Oklahoma, Agricultural Efficiency in South Dakota, Coastal Management in Louisiana and Carbon Management in Oregon. DEVELOP students gain experience in applied science, computer technology, and project management. Several DEVELOP projects will be demonstrated and discussed during this presentation. DEVELOP is sponsored by the Applications Division of NASA's Science Mission Directorate.

  7. Future Challenges in Library Science.

    ERIC Educational Resources Information Center

    Murgai, Sarla R.

    This paper considers a number of potential developments for the future of library science and the roles of information professionals. Among the projections are: (1) the use of computers and management science operations research methodologies will form the basis of decision making in libraries in the future; (2) a concerted effort will be made to…

  8. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    ERIC Educational Resources Information Center

    Scogin, Stephen C.

    2016-01-01

    "PlantingScience" is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific…

  9. Combining Art and Science in "Arts and Sciences" Education

    ERIC Educational Resources Information Center

    Needle, Andrew; Corbo, Christopher; Wong, Denise; Greenfeder, Gary; Raths, Linda; Fulop, Zoltan

    2007-01-01

    Two of this article's authors--an art professor and a biology professor--shared a project for advanced biology, art, nursing, and computer science majors involving scientific research that used digital imaging of the brain of the zebrafish, a newly favored laboratory animal. These contemporary and innovative teaching and learning practices were a…

  10. Progress Monitoring in Grade 5 Science for Low Achievers

    ERIC Educational Resources Information Center

    Vannest, Kimberly J.; Parker, Richard; Dyer, Nicole

    2011-01-01

    This article presents procedures and results from a 2-year project developing science key vocabulary (KV) short tests suitable for progress monitoring Grade 5 science in Texas public schools using computer-generated, -administered, and -scored assessments. KV items included KV definitions and important usages in a multiple-choice cloze format. A…

  11. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  12. Merging the Machines of Modern Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Laura; Collins, Jim

    Two recent projects have harnessed supercomputing resources at the US Department of Energy’s Argonne National Laboratory in a novel way to support major fusion science and particle collider experiments. Using leadership computing resources, one team ran fine-grid analysis of real-time data to make near-real-time adjustments to an ongoing experiment, while a second team is working to integrate Argonne’s supercomputers into the Large Hadron Collider/ATLAS workflow. Together these efforts represent a new paradigm of the high-performance computing center as a partner in experimental science.

  13. The diversity and evolution of ecological and environmental citizen science.

    PubMed

    Pocock, Michael J O; Tweddle, John C; Savage, Joanna; Robinson, Lucy D; Roy, Helen E

    2017-01-01

    Citizen science-the involvement of volunteers in data collection, analysis and interpretation-simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation to assess citizen science approaches. We found that projects varied according to their methodological approach from 'mass participation' (e.g. easy participation by anyone anywhere) to 'systematic monitoring' (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are 'simple' to those that are 'elaborate' (e.g. provide lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990-99, 2000-09 and 2010-13) has not increased, we found that projects tended to have become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active so consequently we found that the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the 'success' of different citizen science approaches. Comparative evaluation provides an evidence-base to inform the future development of citizen science activities.
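
    As a simplified stand-in for the multiple factor analysis described (the study's real attribute scores are not reproduced here, and ordinary principal component analysis is used in place of MFA), a projects-by-attributes matrix can be summarised as follows in Python:

        import numpy as np
        from sklearn.decomposition import PCA

        # Toy matrix: rows are projects, columns are scored attributes (0/1 here for simplicity)
        rng = np.random.default_rng(0)
        attributes = rng.integers(0, 2, size=(509, 32)).astype(float)

        pca = PCA(n_components=2)
        coords = pca.fit_transform(attributes)            # each project as a point in 2-D
        print("variance explained:", pca.explained_variance_ratio_)
        print("first project's coordinates:", coords[0])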

  14. Good enough practices in scientific computing.

    PubMed

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.
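
    In the spirit of those practices (this layout and script are a generic sketch, not the paper's exact recommendations), a small analysis might keep raw data read-only, script every processing step, and write regenerable outputs to a separate directory:

        # Illustrative project layout:
        #   project/
        #     data/raw/measurements.csv   <- original data, never edited by hand
        #     src/summarize.py            <- every step scripted, re-runnable
        #     results/summary.csv         <- generated outputs, safe to delete and rebuild
        import csv
        import pathlib

        RAW = pathlib.Path("data/raw/measurements.csv")
        OUT = pathlib.Path("results/summary.csv")

        def main():
            if not RAW.exists():                      # create a tiny example input so the sketch runs anywhere
                RAW.parent.mkdir(parents=True, exist_ok=True)
                RAW.write_text("sample,value\na,1.0\nb,2.0\nc,3.0\n")
            with RAW.open() as f:
                rows = [row for row in csv.DictReader(f) if row.get("value")]
            mean = sum(float(r["value"]) for r in rows) / len(rows) if rows else 0.0
            OUT.parent.mkdir(parents=True, exist_ok=True)
            with OUT.open("w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["n_records", "mean_value"])
                writer.writerow([len(rows), mean])

        if __name__ == "__main__":
            main()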

  15. GRAPE project

    NASA Astrophysics Data System (ADS)

    Makino, Junichiro

    2002-12-01

    We overview our GRAvity PipE (GRAPE) project to develop special-purpose computers for astrophysical N-body simulations. The basic idea of GRAPE is to attach a custom-built computer, dedicated to the calculation of gravitational interactions between particles, to a general-purpose programmable computer. With this hybrid architecture, we can achieve both a wide range of applications and very high peak performance. Our newest machine, GRAPE-6, achieved a peak speed of 32 Tflops and a sustained performance of 11.55 Tflops, for a total budget of about 4 million USD. We also discuss the relative advantages of special-purpose and general-purpose computers and the future of high-performance computing for science and technology.
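
    The operation a GRAPE board pipelines is, in software, a direct O(N^2) summation of pairwise gravitational accelerations; the NumPy sketch below is an illustration of that kernel only (softening and units are arbitrary), not the GRAPE hardware interface:

        import numpy as np

        def gravitational_accelerations(positions, masses, softening=1e-3, G=1.0):
            """Direct-summation accelerations for N bodies (the O(N^2) kernel GRAPE accelerates)."""
            n = len(masses)
            acc = np.zeros((n, 3))
            for i in range(n):
                dr = positions - positions[i]                     # vectors from body i to all bodies
                dist2 = np.sum(dr * dr, axis=1) + softening ** 2
                inv_r3 = dist2 ** -1.5
                inv_r3[i] = 0.0                                   # no self-interaction
                acc[i] = G * np.sum((masses * inv_r3)[:, None] * dr, axis=0)
            return acc

        # Example: 100 random bodies of unit mass
        rng = np.random.default_rng(1)
        pos = rng.normal(size=(100, 3))
        print(gravitational_accelerations(pos, np.ones(100))[0])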

  16. Applications of computer assisted surgery and medical robotics at the ISSSTE, México: preliminary results.

    PubMed

    Mosso, José Luis; Pohl, Mauricio; Jimenez, Juan Ramon; Valdes, Raquel; Yañez, Oscar; Medina, Veronica; Arambula, Fernando; Padilla, Miguel Angel; Marquez, Jorge; Gastelum, Alfonso; Mosso, Alejo; Frausto, Juan

    2007-01-01

    We present the first results of four projects in the second phase of a Mexican project on Computer Assisted Surgery and Medical Robotics, supported by the Mexican Science and Technology National Council (Consejo Nacional de Ciencia y Tecnología) under grant SALUD-2002-C01-8181. The projects are being developed by three universities (UNAM, UAM, ITESM), and the goal of this project is to integrate a laboratory into a Hospital of the ISSSTE to give service to endoscopic surgeons, urologists, gastrointestinal endoscopists, and neurosurgeons.

  17. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.

  18. Recovery Act: Web-based CO{sub 2} Subsurface Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paolini, Christopher; Castillo, Jose

    2012-11-30

    The Web-based CO{sub 2} Subsurface Modeling project focused primarily on extending an existing text-only, command-line driven, isothermal and isobaric, geochemical reaction-transport simulation code, developed and donated by Sienna Geodynamics, into an easier-to-use Web-based application for simulating long-term storage of CO{sub 2} in geologic reservoirs. The Web-based interface developed through this project, publicly accessible via URL http://symc.sdsu.edu/, enables rapid prototyping of CO{sub 2} injection scenarios and allows students without advanced knowledge of geochemistry to set up a typical sequestration scenario, invoke a simulation, analyze results, and then vary one or more problem parameters and quickly re-run a simulation to answer what-if questions. symc.sdsu.edu has 2x12 core AMD Opteron™ 6174 2.20GHz processors and 16GB RAM. The Web-based application was used to develop a new computational science course at San Diego State University, COMP 670: Numerical Simulation of CO{sub 2} Sequestration, which was taught during the fall semester of 2012. The purpose of the class was to introduce graduate students to Carbon Capture, Use and Storage (CCUS) through numerical modeling and simulation, and to teach students how to interpret simulation results to make predictions about long-term CO{sub 2} storage capacity in deep brine reservoirs. In addition to the training and education component of the project, significant software development efforts took place. Two computational science doctoral students and one geological science master's student, under the direction of the PIs, extended the original code developed by Sienna Geodynamics, named Sym.8. New capabilities were added to Sym.8 to simulate non-isothermal and non-isobaric flows of charged aqueous solutes in porous media, in addition to incorporating HPC support into the code for execution on many-core XSEDE clusters. A successful outcome of this project was the funding and training of three new computational science students and one geological science student in technologies relevant to carbon sequestration and problems involving flow in subsurface media. The three computational science students are currently finishing their doctoral studies on different aspects of modeling CO{sub 2} sequestration, while the geological science student completed his master's thesis in modeling the thermal response of CO{sub 2} injection in brine and, as a direct result of participation in this project, is now employed at ExxonMobil as a full-time staff geologist.
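
    A what-if sweep of the kind the interface enables might look like the following Python sketch; run_sequestration_case is a hypothetical stand-in for a call to the underlying simulator and is not part of Sym.8 or the symc.sdsu.edu service:

        from itertools import product

        def run_sequestration_case(injection_rate_kg_s, reservoir_depth_m):
            """Hypothetical stand-in for a reaction-transport simulation; returns a rough stored-mass figure."""
            retention = 0.85 if reservoir_depth_m >= 800 else 0.0   # CO2 stays supercritical only at depth (illustrative)
            seconds_per_year = 3.15e7
            return injection_rate_kg_s * seconds_per_year * 30 * retention   # 30-year injection period

        rates = [25.0, 50.0, 100.0]        # kg/s
        depths = [1200.0, 1800.0]          # m
        for rate, depth in product(rates, depths):
            stored = run_sequestration_case(rate, depth)
            print(f"rate={rate:6.1f} kg/s  depth={depth:6.0f} m  stored ~ {stored:.2e} kg over 30 years")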

  19. Argonne Leadership Computing Facility 2011 annual report: Shaping future supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papka, M.; Messina, P.; Coffey, R.

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.

  20. Life science research and drug discovery at the turn of the 21st century: the experience of SwissBioGrid.

    PubMed

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-04-22

    It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.

  1. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    ERIC Educational Resources Information Center

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  2. Science Support: The Building Blocks of Active Data Curation

    NASA Astrophysics Data System (ADS)

    Guillory, A.

    2013-12-01

    While the scientific method is built on reproducibility and transparency, and results are published in peer-reviewed literature, we have come to the digital age of very large datasets (now of the order of petabytes and soon exabytes) which cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. Another purpose of science support is to raise awareness in the science community of data and metadata standards and best practice, engendering a culture where data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data-generating project. It provides an agreed record of the data management needs and issues within the project. The DMP is agreed upon with project investigators to ensure that a high-quality documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities and rights associated with the data. Project-specific needs are also identified for data processing, visualization tools and data sharing services. As part of the National Centre for Atmospheric Science (NCAS) and National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfills this science support role of facilitating atmospheric and Earth observation data-generating projects to ensure successful management of the data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, dependent on CEDA to provide the high-performance storage, transfer capability and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better documented datasets with an increasing collection of metadata associated with the archived data, ease of data sharing through the use of standards in formats and metadata, and data citation. These establish a high quality of data management, ensuring long-term preservation and enabling re-use by peer scientists, which ultimately leads to faster-paced progress in science.

  3. Ethics across the computer science curriculum: privacy modules in an introductory database course.

    PubMed

    Appel, Florence

    2005-10-01

    This paper describes the author's experience of infusing an introductory database course with privacy content, and the on-going project entitled Integrating Ethics Into the Database Curriculum, that evolved from that experience. The project, which has received funding from the National Science Foundation, involves the creation of a set of privacy modules that can be implemented systematically by database educators throughout the database design thread of an undergraduate course.

  4. Comments from the Science Education Directorate, National Science Foundation: CAUSE, ISEP, and LOCI: Three-Program Approach to College-Level Science Improvement. II. Patterns and Problems.

    ERIC Educational Resources Information Center

    Erickson, Judith B.; And Others

    1980-01-01

    Discusses patterns resulting from the monitor of science education proposals which may reflect problems or differing perceptions of NSF. Discusses these areas: proposal submissions from two-year institutions and social and behavioral scientists, trends in project content at the academic-industrial interface and in computer technology, and…

  5. JPRS Report, Science & Technology, Japan. Goto Quantum Magneto-Flux Logic Project.

    DTIC Science & Technology

    1992-04-23

    [Record text garbled in extraction; recoverable fragments mention established infrastructure technology such as minimal-signal measurement (T. Kobayashi, Physics Department, Faculty of Science, Tokyo University), proceedings on Josephson electronics (1988, 1989) with contributors including M. Sato, N. Fukazawa, P. Spee, E. Goto, J. Yuyama, M. Kasuya, S. Kobayashi, and Nobuaki Yoshida, and a note that single-precision inner-product computation can be composed to run at a speed similar to real-number inner-product computation.]

  6. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  7. JPL basic research review. [research and advanced development

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Current status, projected goals, and results of 49 research and advanced development programs at the Jet Propulsion Laboratory are reported in abstract form. Areas of investigation include: aerodynamics and fluid mechanics, applied mathematics and computer sciences, environment protection, materials science, propulsion, electric and solar power, guidance and navigation, communication and information sciences, general physics, and chemistry.

  8. Projected Employment Scenarios Show Possible Shortages in Some Engineering and Computer Specialties. Science Resources Studies Highlights.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    This report presents findings of a National Science Foundation (NSF) assessment of the adequacy of the supply of science, engineering, and technician (SET) personnel for meeting defense and non-defense requirements between 1982 and 1987. Selected findings included in the report follow. Based on four scenarios representing combinations of low and…

  9. Undergraduate Research in Physics as a course for Engineering and Computer Science Majors

    NASA Astrophysics Data System (ADS)

    O'Brien, James; Rueckert, Franz; Sirokman, Greg

    2017-01-01

    Undergraduate research has become more and more integral to the functioning of higher educational institutions. At many institutions undergraduate research is conducted as capstone projects in the pure sciences; however, science faculty at some schools (including that of the authors) face the challenge of not having science majors. Even at these institutions, a select population of high-achieving engineering students will often express a keen interest in conducting pure science research. Since a foray into science research provides the student full exposure to the scientific method and scientific collaboration, the experience can be quite rewarding and beneficial to the development of the student as a professional. To this end, the authors have been working to find new contexts in which to offer research experiences to non-science majors, including a new undergraduate research class conducted by physics and chemistry faculty. An added benefit is that these courses are inherently interdisciplinary. Students in the engineering and computer science fields step into physics and chemistry labs to solve science problems, often invoking their own relevant expertise. In this paper we start by discussing the common themes and outcomes of the course. We then discuss three particular projects that were conducted with engineering students and focus on how the undergraduate research experience enhanced their already rigorous engineering curriculum.

  10. An Inquiry Safari

    ERIC Educational Resources Information Center

    Thomson, Norman; Chapman, Seri

    2004-01-01

    The Virtual Gorilla Modeling Project--a professional development project--is a collaboration of middle and high school inservice teachers, Zoo Atlanta primatologists, science and computer educators, and students. During a 10-day professional development summer workshop, middle and high school teachers explore the world of the gorilla through…

  11. MU-SPIN Project Update

    NASA Technical Reports Server (NTRS)

    Harrington, James L., Jr.

    2000-01-01

    The Minority University-Space Interdisciplinary Network (MU-SPIN) project is a comprehensive outreach and education initiative that focuses on the transfer of advanced computer networking technologies and relevant science to Historically Black Colleges and Universities (HBCUs) and Other Minority Universities (OMUs) for supporting multidisciplinary education research.

  12. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
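
    The report describes in situ processing only conceptually. The following minimal sketch, under stated assumptions, shows the basic pattern: analysis runs on each field while it is still resident in memory, and only compact summaries are written out. The simulation and analysis routines are hypothetical stand-ins, not the project's actual software.

```python
import numpy as np

def advance(field, t):
    """Hypothetical stand-in for one simulation time step."""
    return field + 0.01 * np.sin(t + field)

def in_situ_analysis(field, t, summaries):
    """Analyze the field while it is still in memory; retain only small summaries."""
    summaries.append({"t": t, "mean": float(field.mean()), "max": float(field.max())})

field = np.random.rand(512, 512)   # in-memory simulation state
summaries = []
for t in range(1000):
    field = advance(field, t)
    if t % 10 == 0:                # analysis cadence, instead of writing every step to disk
        in_situ_analysis(field, t, summaries)

# Only the compact summaries are persisted, not every full-resolution time step.
print(summaries[-1])
```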

  13. BioSIGHT: Interactive Visualization Modules for Science Education

    NASA Technical Reports Server (NTRS)

    Wong, Wee Ling

    1998-01-01

    Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches toward the goal of enabling human space exploration, which requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application for space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, an educational software designer, and computer programmers, as well as IMSC graduate and undergraduate students.

  14. Interdisciplinary multiinstitutional alliances in support of educational programs for health sciences librarians.

    PubMed Central

    Smith, L C

    1996-01-01

    This project responds to the need to identify the knowledge, skills, and expertise required by health sciences librarians in the future and to devise mechanisms for providing this requisite training. The approach involves interdisciplinary multiinstitutional alliances with collaborators drawn from two graduate schools of library and information science (University of Illinois at Urbana-Champaign and Indiana University) and two medical schools (University of Illinois at Chicago and Washington University). The project encompasses six specific aims: (1) investigate the evolving role of the health sciences librarian; (2) analyze existing programs of study in library and information science at all levels at Illinois and Indiana; (3) develop opportunities for practicums, internships, and residencies; (4) explore the possibilities of computing and communication technologies to enhance instruction; (5) identify mechanisms to encourage faculty and graduate students to participate in medical informatics research projects; and (6) create recruitment strategies to achieve better representation of currently underrepresented groups. The project can serve as a model for other institutions interested in regional collaboration to enhance graduate education for health sciences librarianship. PMID:8913560

  15. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the boom in internet and information technology services. The bioinformatics sector of India has been transforming rapidly by creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospects and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, and conferences and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  16. Project LASER

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands-on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), which is designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teacher's use. The MTRC is only one of the five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.

  17. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkin, Adam; Bader, David C.; Coffey, Richard

    Understanding the fundamentals of genomic systems and the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER's computing needs and their importance for solving some of the toughest problems in BER's portfolio. BER's impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  18. The Human Genome Project: big science transforms biology and medicine.

    PubMed

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  19. Neural network based visualization of collaborations in a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Santos, Rafael D. C.; Raddick, M. Jordan

    2014-05-01

    Citizen science projects are those in which volunteers are asked to collaborate in scientific projects, usually by volunteering idle computer time for distributed data processing efforts or by actively labeling or classifying information - shapes of galaxies, whale sounds, and historical records are all examples of citizen science projects in which users access a data collecting system to label or classify images and sounds. In order to be successful, a citizen science project must captivate users and keep them interested in the project and in the science behind it, thereby increasing the time the users spend collaborating with the project. Understanding the behavior of citizen scientists and their interaction with the data collection systems may help increase the involvement of the users, categorize them according to different parameters, facilitate their collaboration with the systems, design better user interfaces, and allow better planning and deployment of similar projects and systems. Users' behavior can be actively monitored or derived from their interaction with the data collection systems. Records of the interactions can be analyzed using visualization techniques to identify patterns and outliers. In this paper we present some results on the visualization of more than 80 million interactions of almost 150 thousand users with the Galaxy Zoo I citizen science project. Visualization of the attributes extracted from their behaviors was done with a clustering neural network (the Self-Organizing Map) and a selection of icon- and pixel-based techniques. These techniques allow the visual identification of groups of similar behavior in several different ways.
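
    The paper's analysis pipeline is not reproduced in this record. As a hedged sketch of the clustering step it names, the following trains a small Self-Organizing Map on made-up per-user behavioral attributes and maps each user to a grid node; the attribute set, map size, and training schedule are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-user attributes: sessions, classifications made, mean seconds per task
users = rng.random((1000, 3))

rows, cols = 8, 8
weights = rng.random((rows, cols, users.shape[1]))   # SOM codebook vectors
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

def best_matching_unit(weights, x):
    """Return the grid coordinates of the node whose codebook vector is closest to x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return tuple(int(i) for i in np.unravel_index(np.argmin(d), d.shape))

def train(weights, data, epochs=20, lr0=0.5, sigma0=3.0):
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            bmu = best_matching_unit(weights, x)
            frac = step / n_steps
            lr, sigma = lr0 * (1.0 - frac), sigma0 * (1.0 - frac) + 0.5
            # Gaussian neighborhood pulls nodes near the winner toward the sample
            dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-dist2 / (2.0 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
            step += 1
    return weights

weights = train(weights, users)
hits = [best_matching_unit(weights, u) for u in users]
print("users mapped to node (0, 0):", sum(1 for b in hits if b == (0, 0)))
```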

  20. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herner, K.; Alba Hernandex, A. F.; Bhat, S.

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.

  1. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.

  2. Student and Staff Perceptions of Key Aspects of Computer Science Engineering Capstone Projects

    ERIC Educational Resources Information Center

    Olarte, Juan José; Dominguez, César; Jaime, Arturo; Garcia-Izquierdo, Francisco José

    2016-01-01

    In carrying out their capstone projects, students use knowledge and skills acquired throughout their degree program to create a product or provide a technical service. An assigned advisor guides the students and supervises the work, and a committee assesses the projects. This study compares student and staff perceptions of key aspects of…

  3. Education through the prism of computation

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2014-03-01

    With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator we need to add computation to the foundations of scientific education. We will discuss what type of paradigm shifts it brings to these foundations using the example of the Wolfram Science Summer School. It is one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.

  4. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for 6 months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and the Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback was very positive on the project's toolkit and the value of such automatic workflow documentation to the scientific endeavor.
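
    The MPO toolkit itself is not shown in this summary. The sketch below illustrates, under assumptions, the general idea of automatic workflow documentation: a decorator records each step's name, inputs, result digest, and wall time into a provenance log as the workflow runs. The function and field names are hypothetical, not the MPO API.

```python
import functools, hashlib, json, time

provenance = []   # in a production tool this would go to a catalog or database

def documented(step):
    """Record the name, arguments, result digest, and wall time of each workflow step."""
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        t0 = time.time()
        result = step(*args, **kwargs)
        provenance.append({
            "step": step.__name__,
            "args": repr((args, kwargs)),
            "result_sha1": hashlib.sha1(repr(result).encode()).hexdigest(),
            "seconds": round(time.time() - t0, 4),
        })
        return result
    return wrapper

@documented
def load_shot(shot_id):           # hypothetical data-loading step
    return {"shot": shot_id, "signal": [1, 2, 3]}

@documented
def fit_profile(data):            # hypothetical analysis step
    return sum(data["signal"]) / len(data["signal"])

fit_profile(load_shot(12345))
print(json.dumps(provenance, indent=2))   # the automatically captured workflow record
```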

  5. Learning Science in Grades 3-8 Using Probeware and Computers: Findings from the TEEMSS II Project

    NASA Astrophysics Data System (ADS)

    Zucker, Andrew A.; Tinker, Robert; Staudt, Carolyn; Mansfield, Amie; Metcalf, Shari

    2008-02-01

    The Technology Enhanced Elementary and Middle School Science II project (TEEMSS), funded by the National Science Foundation, produced 15 inquiry-based instructional science units for teaching in grades 3-8. Each unit uses computers and probeware to support students' investigations of real-world phenomena using probes (e.g., for temperature or pressure) or, in one case, virtual environments based on mathematical models. TEEMSS units were used in more than 100 classrooms by over 60 teachers and thousands of students. This paper reports on cases in which groups of teachers taught science topics without TEEMSS materials in school year 2004-2005 and then the same teachers taught those topics using TEEMSS materials in 2005-2006. There are eight TEEMSS units for which such comparison data are available. Students showed significant learning gains for all eight. In four cases (sound and electricity, both for grades 3-4; temperature, grades 5-6; and motion, grades 7-8) there were significant differences in science learning favoring the students who used the TEEMSS materials. The effect sizes are 0.58, 0.94, 1.54, and 0.49, respectively. For the other four units there were no significant differences in science learning between TEEMSS and non-TEEMSS students. We discuss the implications of these results for science education.

  6. Pedagogical Approaches for Technology-Integrated Science Teaching

    ERIC Educational Resources Information Center

    Hennessy, Sara; Wishart, Jocelyn; Whitelock, Denise; Deaney, Rosemary; Brawn, Richard; la Velle, Linda; McFarlane, Angela; Ruthven, Kenneth; Winterbottom, Mark

    2007-01-01

    The two separate projects described have examined how teachers exploit computer-based technologies in supporting learning of science at secondary level. This paper examines how pedagogical approaches associated with these technological tools are adapted to both the cognitive and structuring resources available in the classroom setting. Four…

  7. Social Studies: Application Units. Course II, Teachers. Computer-Oriented Curriculum. REACT (Relevant Educational Applications of Computer Technology).

    ERIC Educational Resources Information Center

    Tecnica Education Corp., San Carlos, CA.

    This book is one of a series in Course II of the Relevant Educational Applications of Computer Technology (REACT) Project. It is designed to point out to teachers two of the major applications of computers in the social sciences: simulation and data analysis. The first section contains a variety of simulation units organized under the following…

  8. Reassessing the English Course Offered to Computer Engineering Students at the National School of Applied Sciences of Al-Hoceima in Morocco: An Action Research Project

    ERIC Educational Resources Information Center

    Dahbi, M.

    2015-01-01

    In computer engineering education, specific English language practices are needed to enable computer engineering students to succeed in professional settings. This study was conducted for two purposes. First, it aimed at investigating to what extent the English courses offered to computer engineering students at the National School of Applied…

  9. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  10. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  11. The WHATs and HOWs of maturing computational and software engineering skills in Russian higher education institutions

    NASA Astrophysics Data System (ADS)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-05-01

    The tradition in Russian higher education institutions of teaching large-enrollment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the chance of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.

  12. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1983

    1983-01-01

    Discusses the Rugby clock as a source of project material, use of ZX81 for experimental science, computer dice analog, oil recovery from reservoirs, and computer simulation of Thompson's experiment for determining e/m for an electron. Activities/procedures are provided when applicable. Also presents questions (and answers) related to time-coded…

  13. Connecting Biology and Organic Chemistry Introductory Laboratory Courses through a Collaborative Research Project

    ERIC Educational Resources Information Center

    Boltax, Ariana L.; Armanious, Stephanie; Kosinski-Collins, Melissa S.; Pontrello, Jason K.

    2015-01-01

    Modern research often requires collaboration of experts in fields, such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an…

  14. Richer Connections to Robotics through Project Personalization

    ERIC Educational Resources Information Center

    Veltman, Melanie; Davidson, Valerie; Deyell, Bethany

    2012-01-01

    In this work, we describe youth outreach activities carried out under the Chair for Women in Science and Engineering for Ontario (CWSE-ON) program. Specifically, we outline our design and implementation of robotics workshops to introduce and engage middle and secondary school students in engineering and computer science. Toward the goal of…

  15. How Reliable is the Temperature Forecast?

    ERIC Educational Resources Information Center

    Christmann, Edwin P.

    2005-01-01

    Project 2061 suggests "technology provides the eyes and ears of science--and some of the muscle too. The electronic computer, for example, has led to substantial progress in the study of weather systems...." Obviously, now that teachers have access to a kaleidoscope of technological advancements, middle school science teachers can engage students…

  16. Simulation and Collaborative Learning in Political Science and Sociology Classrooms.

    ERIC Educational Resources Information Center

    Peters, Sandra; Saxon, Deborah

    The program described here used cooperative, content-based computer writing projects to teach Japanese students at an intermediate level of English proficiency enrolled in first-year, English-language courses in political science/environmental issues and sociology/environmental issues in an international college program. The approach was taken to…

  17. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  18. Course Syllabus: Science, Technology and Society.

    ERIC Educational Resources Information Center

    Garner, Douglas

    1985-01-01

    Describes the aims, methods, project, and topics of a course designed so that students may explore the impact of science and technology on society. Units include: technology (pro and con); nuclear deterrence; politics and technical decisions; and computers. Includes a list of audiovisual resources (with title, source, and current cost). (DH)

  19. Spinning a Web Around Forensic Science and Senior Biology.

    ERIC Educational Resources Information Center

    Harrison, Colin R.

    1999-01-01

    Discusses a project that was established to integrate computer technology, especially the Internet, into the science classroom. Argues for the importance of providing students with a program of study that exposes them to the widest possible range of ways of gathering information for problem solving. (Author/WRM)

  20. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science forgone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.

  1. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  2. Oversight Hearing on the High Performance Computing and Communications Program and Uses of the Information Highway. Hearing before the Subcommittee on Science, Technology, and Space of the Committee on Commerce, Science, and Transportation. United States Senate, One Hundred Fourth Congress, First Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.

    This document presents witness testimony and supplemental materials from a Congressional hearing called to evaluate the progress of the High Performance Computing and Communications program in light of budget requests, to examine the appropriate role for the government in such a project, and to see demonstrations of the World Wide Web and related…

  3. Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.

    PubMed

    Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke

    2018-05-21

    Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad-extent, fine-grain approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and of the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high-quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
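
    As a minimal, hedged illustration of the 'algorithm voting' quality-control step mentioned above, the sketch below takes several volunteer classifications per subject and keeps the majority label, flagging subjects without clear consensus for expert review; the labels and agreement threshold are illustrative assumptions.

```python
from collections import Counter

# Hypothetical volunteer classifications: subject id -> labels from independent volunteers
classifications = {
    "subject_001": ["spiral", "spiral", "spiral", "elliptical"],
    "subject_002": ["elliptical", "spiral", "merger", "spiral"],
}

def consensus(labels, min_fraction=0.7):
    """Return the majority label if it reaches the agreement threshold, else None."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count / len(labels) >= min_fraction else None

for subject, labels in classifications.items():
    result = consensus(labels)
    print(subject, "->", result if result else "no consensus; route to expert review")
```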

  4. Computational Understanding: Analysis of Sentences and Context

    DTIC Science & Technology

    1974-05-01

    [Record text garbled in extraction; recoverable fragments are from the Stanford Computer Science Department and note the need for programs that can respond in useful ways to information expressed in a natural language, a computational understanding system that builds a buying structure because "Mary" appears where it does, and analysis times rarely over five seconds of computer time for the Lisp program.]

  5. Institutional Computing Executive Group Review of Multi-programmatic & Institutional Computing, Fiscal Year 2005 and 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, S; Rotman, D; Schwegler, E

    The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high-performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.

  6. Fermilab | Science at Fermilab | Experiments & Projects | Cosmic Frontier

    Science.gov Websites


  7. An Interdisciplinary Microprocessor Project.

    ERIC Educational Resources Information Center

    Wilcox, Alan D.; And Others

    1985-01-01

    Describes an unusual project in which third-year computer science students designed and built a four-bit multiplier circuit and then combined it with software to complete a full 16-bit multiplication. The multiplier was built using TTL components, interfaced with a Z-80 microprocessor system, and programmed in assembly language. (JN)

  8. Predicting Precipitation in Darwin: An Experiment with Markov Chains

    ERIC Educational Resources Information Center

    Boncek, John; Harden, Sig

    2009-01-01

    As teachers of first-year college mathematics and science students, the authors are constantly on the lookout for simple classroom exercises that improve their students' analytical and computational skills. In this article, the authors outline a project entitled "Predicting Precipitation in Darwin." In this project, students: (1) analyze…
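
    The classroom exercise described here lends itself to a short illustration: a two-state (dry/wet) Markov chain for daily rainfall. The transition probabilities below are invented placeholders rather than the Darwin values used in the article:

      import numpy as np

      # Two-state daily weather chain: state 0 = dry, state 1 = wet.
      # Hypothetical transition matrix P[i, j] = P(tomorrow = j | today = i).
      P = np.array([[0.7, 0.3],
                    [0.4, 0.6]])

      def n_step(P, n):
          # Distribution of the chain after n days, for each starting state.
          return np.linalg.matrix_power(P, n)

      def stationary(P):
          # Long-run fraction of dry/wet days: left eigenvector of P for eigenvalue 1.
          vals, vecs = np.linalg.eig(P.T)
          pi = np.real(vecs[:, np.argmax(np.real(vals))])
          return pi / pi.sum()

      print(n_step(P, 7))    # 7-day-ahead transition probabilities
      print(stationary(P))   # approx [0.571, 0.429]: ~57% dry, ~43% wet in the long run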

  9. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud Computing is becoming a norm for providing computing capabilities for advancing Earth sciences including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service was built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper will report on the service, covering several aspects: 1) the hardware includes 500 computing servers and close to 2PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for sharing resources; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities to conduct geoscience research. A brief introduction about how to use the cloud service will be included.

  10. Investigation of noise properties in grating-based x-ray phase tomography with reverse projection method

    NASA Astrophysics Data System (ADS)

    Bao, Yuan; Wang, Yan; Gao, Kun; Wang, Zhi-Li; Zhu, Pei-Ping; Wu, Zi-Yu

    2015-10-01

    The relationship between noise variance and spatial resolution in grating-based x-ray phase computed tomography (PCT) imaging is investigated with the reverse projection extraction method, and the noise variances of the reconstructed absorption coefficient and refractive index decrement are compared. For the differential phase contrast method, the noise variance in the differential projection images follows the same inverse-square law with spatial resolution as in conventional absorption-based x-ray imaging projections. However, both theoretical analysis and simulations demonstrate that in PCT the noise variance of the reconstructed refractive index decrement follows an inverse linear relationship with spatial resolution at fixed slice thickness, while the noise variance of the reconstructed absorption coefficient conforms to the inverse cubic law. The results indicate that, for the same noise variance level, PCT imaging may enable higher spatial resolution than conventional absorption computed tomography (ACT), while ACT benefits more from degraded spatial resolution. This could provide useful guidance for imaging the inner structure of a sample at higher spatial resolution. Project supported by the National Basic Research Program of China (Grant No. 2012CB825800), the Science Fund for Creative Research Groups, the Knowledge Innovation Program of the Chinese Academy of Sciences (Grant Nos. KJCX2-YW-N42 and Y4545320Y2), and the National Natural Science Foundation of China (Grant Nos. 11475170, 11205157, 11305173, 11205189, 11375225, 11321503, 11179004, and U1332109).
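
    Restating the scaling relations claimed in this abstract compactly, with Delta the spatial resolution at fixed slice thickness, mu the absorption coefficient, and delta the refractive index decrement:

      % Noise-variance scaling with spatial resolution \Delta, as stated in the abstract above
      \sigma^2_{\mathrm{proj}} \propto \Delta^{-2}   % differential projections: same inverse-square law as absorption projections
      \sigma^2_{\delta}        \propto \Delta^{-1}   % reconstructed refractive index decrement (PCT)
      \sigma^2_{\mu}           \propto \Delta^{-3}   % reconstructed absorption coefficient (ACT)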

  11. Use of PL/1 in a Bibliographic Information Retrieval System.

    ERIC Educational Resources Information Center

    Schipma, Peter B.; And Others

    The Information Sciences section of IIT Research Institute (IITRI) has developed a Computer Search Center and is currently conducting a research project to explore computer searching of a variety of machine-readable data bases. The Center provides Selective Dissemination of Information services to academic, industrial and research organizations…

  12. Blending an Android Development Course with Software Engineering Concepts

    ERIC Educational Resources Information Center

    Chatzigeorgiou, Alexander; Theodorou, Tryfon L.; Violettas, George E.; Xinogalos, Stelios

    2016-01-01

    The tremendous popularity of mobile computing and Android in particular has attracted millions of developers who see opportunities for building their own start-ups. As a consequence, Computer Science students express an increasing interest in the related technology of Java development for Android applications. Android projects are complex by…

  13. Making Construals as a New Digital Skill for Learning

    ERIC Educational Resources Information Center

    Beynon, Meurig; Boyatt, Russell; Foss, Jonathan; Hall, Chris; Hudnott, Elizabeth; Russ, Steve; Sutinen, Erkki; Macleod, Hamish; Kommers, Piet

    2015-01-01

    Making construals is a practical approach to computing that was originally developed for and by computer science undergraduates. It is the central theme of an EU project aimed at disseminating the relevant principles to a broader audience. This involves bringing together technical experts in making construals and international experts in…

  14. Comprehensive report of aeropropulsion, space propulsion, space power, and space science applications of the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The research activities of the Lewis Research Center for 1988 are summarized. The projects included are within basic and applied technical disciplines essential to aeropropulsion, space propulsion, space power, and space science/applications. These disciplines are materials science and technology, structural mechanics, life prediction, internal computational fluid mechanics, heat transfer, instruments and controls, and space electronics.

  15. Carbon dioxide in the atmosphere. [and other research projects

    NASA Technical Reports Server (NTRS)

    Johnson, F. S.

    1974-01-01

    Research projects for the period ending September 15, 1973 are reported as follows: (1) the abundances of carbon dioxide in the atmosphere, and the processes by which it is released from carbonate deposits in the earth and then transferred to organic material by photosynthesis; the pathways for movement of carbon and oxygen through the atmosphere; (2) space science computation assistance by PDP computer; the performance characteristics and user instances; (3) OGO-6 data analysis studies of the variations of nighttime ion temperature in the upper atmosphere.

  16. Issues in undergraduate education in computational science and high performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchioro, T.L. II; Martin, D.

    1994-12-31

    The ever increasing need for mathematical and computational literacy within society and among members of the work force has generated enormous pressure to revise and improve the teaching of related subjects throughout the curriculum, particularly at the undergraduate level. The Calculus Reform movement is perhaps the best known example of an organized initiative in this regard. The UCES (Undergraduate Computational Engineering and Science) project, an effort funded by the Department of Energy and administered through the Ames Laboratory, is sponsoring an informal and open discussion of the salient issues confronting efforts to improve and expand the teaching of computational science as a problem-oriented, interdisciplinary approach to scientific investigation. Although the format is open, the authors hope to consider pertinent questions such as: (1) How can faculty and research scientists obtain the recognition necessary to further excellence in teaching the mathematical and computational sciences? (2) What sort of educational resources--both hardware and software--are needed to teach computational science at the undergraduate level? Are traditional procedural languages sufficient? Are PCs enough? Are massively parallel platforms needed? (3) How can electronic educational materials be distributed in an efficient way? Can they be made interactive in nature? How should such materials be tied to the World Wide Web and the growing "Information Superhighway"?

  17. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  18. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  19. Realistic Covariance Prediction for the Earth Science Constellation

    NASA Technical Reports Server (NTRS)

    Duncan, Matthew; Long, Anne

    2006-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
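
    As a rough illustration of the Monte Carlo component of such a calculation, the sketch below samples the relative position of two objects from their combined covariance and counts how often it falls inside a combined hard-body radius. The state vectors, covariances, and radius are invented placeholders; an operational conjunction assessment works in the encounter plane and handles far more detail:

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical position means (km) and covariances (km^2) for two objects near closest approach.
      r1, r2 = np.array([7000.0, 0.0, 0.0]), np.array([7000.2, 0.1, 0.0])
      P1 = np.diag([0.04, 0.09, 0.01])
      P2 = np.diag([0.05, 0.08, 0.02])
      hard_body_radius_km = 0.05   # combined object radius

      # Relative position is Gaussian with mean (r2 - r1) and covariance P1 + P2
      # (assuming uncorrelated state errors for the two objects).
      samples = rng.multivariate_normal(r2 - r1, P1 + P2, size=1_000_000)
      miss_distance = np.linalg.norm(samples, axis=1)
      p_collision = np.mean(miss_distance < hard_body_radius_km)
      print(f"Estimated collision probability: {p_collision:.2e}")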

  20. An evaluation of the role of email in promoting science investigative skills in primary rural schools in England

    NASA Astrophysics Data System (ADS)

    Jarvis, Tina; Hargreaves, Linda; Comber, Chris

    1997-06-01

    This project evaluated the effect of collaboration via email links on the quality of 10-11 year old students' science investigative skills in six primary rural schools. After a joint planning meeting, sixty children collected, identified and shared information via email about moths in their area, in order to produce a joint booklet. All email traffic was monitored throughout the project. In-depth structured observations and interviews were carried out at the schools. Children completed daily diaries. The children demonstrated a variety of science skills, particularly observation and recording. Their competence and confidence in using computers, handling email and in manipulating a database developed during the project. The project identified a number of important issues relating to teacher in-service training requirements, the importance of a suitable progression of IT experiences throughout the school, the development of cooperative groupwork for children, and software design.

  1. Mentoring the Next Generation of Science Gateway Developers and Users

    NASA Astrophysics Data System (ADS)

    Hayden, L. B.; Jackson-Ward, F.

    2016-12-01

    The Science Gateway Institute (SGW-I) for the Democratization and Acceleration of Science was a SI2-SSE Collaborative Research conceptualization award funded by NSF in 2012. From 2012 through 2015, we engaged interested members of the science and engineering community in a planning process for a Science Gateway Community Institute (SGCI). Science Gateways provide Web interfaces to some of the most sophisticated cyberinfrastructure resources. They interact with remotely executing science applications on supercomputers, they connect to remote scientific data collections, instruments and sensor streams, and they support large collaborations. Gateways allow scientists to concentrate on the most challenging science problems while underlying components such as computing architectures and interfaces to data collections change. The goal of our institute was to provide coordinating activities across the National Science Foundation, eventually providing services more broadly to projects funded by other agencies. SGW-I has succeeded in identifying two underrepresented communities of future gateway designers and users. The Association of Computer and Information Science/Engineering Departments at Minority Institutions (ADMI) was identified as a source of future gateway designers. The National Organization for the Professional Advancement of Black Chemists and Chemical Engineers (NOBCChE) was identified as a community of future science gateway users. SGW-I efforts to engage NOBCChE and ADMI faculty and students are now woven into the workforce development component of SGCI. SGCI (ScienceGateways.org) is a collaboration of six universities, led by the San Diego Supercomputer Center. The workforce development component is led by Elizabeth City State University (ECSU). ECSU's efforts focus on producing a model of engagement, integrating research into education, and mentoring students while aggressively addressing diversity. This paper documents the outcome of the SGW-I conceptualization project and describes the extensive workforce development effort going forward into the 5-year SGCI project recently funded by NSF.

  2. The diversity and evolution of ecological and environmental citizen science

    PubMed Central

    Tweddle, John C.; Savage, Joanna; Robinson, Lucy D.; Roy, Helen E.

    2017-01-01

    Citizen science—the involvement of volunteers in data collection, analysis and interpretation—simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation to assess citizen science approaches. We found that projects varied according to their methodological approach from ‘mass participation’ (e.g. easy participation by anyone anywhere) to ‘systematic monitoring’ (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are ‘simple’ to those that are ‘elaborate’ (e.g. provide lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990–99, 2000–09 and 2010–13) has not increased, we found that projects tended to have become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active so consequently we found that the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the ‘success’ of different citizen science approaches. Comparative evaluation provides an evidence-base to inform the future development of citizen science activities. PMID:28369087
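
    A minimal sketch of the kind of analysis described, scoring projects on attributes and summarising the variation with an ordination: the attribute names and scores below are hypothetical, and principal component analysis is used as a simplified stand-in for the multiple factor analysis the authors actually applied:

      import pandas as pd
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Hypothetical scores for a handful of projects on a few of the 32 attributes.
      projects = pd.DataFrame(
          {"training_required": [0, 2, 1, 0, 2],
           "repeat_sampling":   [0, 2, 1, 0, 2],
           "entirely_online":   [1, 0, 0, 1, 0],
           "equipment_needed":  [0, 2, 1, 0, 1]},
          index=["MassApp1", "SystMon1", "Mixed1", "MassApp2", "SystMon2"])

      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(projects))
      ordination = pd.DataFrame(scores, columns=["axis1", "axis2"], index=projects.index)
      print(ordination)   # projects with similar approaches should fall close together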

  3. The Navajo Learning Network and the NASA Life Sciences/AFOSR Infrastructure Development Project

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The NSF-funded Navajo Learning Network project, with help from NASA Life Sciences and AFOSR, enabled Dine College to take a giant leap forward technologically - in a way that could never have been possible had these projects been managed separately. The combination of these and other efforts created a network of over 500 computers located at ten sites across the Navajo reservation. Additionally, the college was able to install a modern telephone system which shares network data, and purchase a new higher education management system. The NASA Life Sciences funds further allowed the college library system to go online and become available to the entire campus community. NSF, NASA and AFOSR are committed to improving minority access to higher education opportunities and promoting faculty development and undergraduate research through infrastructure support and development. This project has begun to address critical inequalities in access to science, mathematics, engineering and technology for Navajo students and educators. As a result, Navajo K-12 education has been bolstered and Dine College will therefore better prepare students to transfer successfully to four-year institutions. Due to the integration of the NSF and NASA/AFOSR components of the project, a unified project report is appropriate.

  4. Computational mechanistic investigation of radiation damage of adenine induced by hydroxyl radicals

    NASA Astrophysics Data System (ADS)

    Tan, Rongri; Liu, Huixuan; Xun, Damao; Zong, Wenjun

    2018-02-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11564015 and 61404062), the Research Fund for the Doctoral Program of China (Grant No. 3000990110), and the Fund for Distinguished Young Scholars of Jiangxi Science & Technology Normal University (Grant Nos. 2015QNBJRC002 and 2016QNBJRC006).

  5. The Force of Multimedia Slide Shows

    ERIC Educational Resources Information Center

    Santangelo, Darcy; Guy, Mark

    2004-01-01

    Many teachers look for a creative and engaging way to bring physical science topics of force and motion to life for their students. In this project, fourth-grade students weren't "forced" to investigate physical science topics--they were thrilled to! With the help of various technology tools--digital cameras, the Internet, computers, and…

  6. Soil, Weeds, and Computers

    ERIC Educational Resources Information Center

    McClennen, Nate

    2004-01-01

    Events in a community can lead to valuable learning experiences in science. By the end of the summer of 2001, the Green Knoll Fire had burned almost 4000 acres of forest south of Wilson, Wyoming. This article describes how students at the Journeys School of Teton Science Schools participated in a collaborative project with the United States Forest…

  7. University participation via UNIDATA, part 1

    NASA Technical Reports Server (NTRS)

    Dutton, J.

    1986-01-01

    The UNIDATA Project is a cooperative university project, operated by the University Corporation for Atmospheric Research (UCAR) with National Science Foundation (NSF) funding, aimed at providing interactive communication and computations to the university community in the atmospheric and oceanic sciences. The initial focus has been on providing access to data for weather analysis and prediction. However, UNIDATA is in the process of expanding and possibly providing access to the Pilot Climate Data System (PCDS) through the UNIDATA system in an effort to develop prototypes for an Earth science information system. The notion of an Earth science information system evolved from discussions within NASA and several advisory committees in anticipation of receiving data from the many Earth observing instruments on the space station complex (Earth Observing System).

  8. Bioinformatics for Exploration

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy A.

    2006-01-01

    For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.

  9. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
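
    For readers unfamiliar with XNAT, a minimal sketch of querying such an archive through its REST interface follows. The server URL, project label, and credentials are placeholders, and the endpoint path and response fields reflect common XNAT conventions that should be checked against the deployment in question:

      import requests

      XNAT_URL = "https://xnat.example.org"   # placeholder server
      AUTH = ("username", "password")         # placeholder credentials

      # List imaging sessions for one project; XNAT typically returns a JSON ResultSet.
      resp = requests.get(
          f"{XNAT_URL}/data/projects/EXAMPLE_PROJ/experiments",
          params={"format": "json"},
          auth=AUTH,
          timeout=30,
      )
      resp.raise_for_status()
      for row in resp.json()["ResultSet"]["Result"]:
          print(row.get("label"), row.get("xsiType"), row.get("date"))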

  10. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  11. Building Real World Domain-Specific Social Network Websites as a Capstone Project

    ERIC Educational Resources Information Center

    Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny

    2009-01-01

    This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…

  12. Small business innovation research. Abstracts of 1988 phase 1 awards

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Non-proprietary proposal abstracts of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA are presented. Projects in the fields of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robots, computer sciences, information systems, data processing, spacecraft propulsion, bioastronautics, satellite communication, and space processing are covered.

  13. Interdisciplinary Project Experiences: Collaboration between Majors and Non-Majors

    ERIC Educational Resources Information Center

    Smarkusky, Debra L.; Toman, Sharon A.

    2014-01-01

    Students in computer science and information technology should be engaged in solving real-world problems received from government and industry as well as those that expose them to various areas of application. In this paper, we discuss interdisciplinary project experiences between majors and non-majors that offered a creative and innovative…

  14. Malaysian Education Index (MEI): An Online Indexing and Repository System

    ERIC Educational Resources Information Center

    Kabilan, Muhammad Kamarul; Ismail, Hairul Nizam; Yaakub, Rohizani; Yusof, Najeemah Mohd; Idros, Sharifah Noraidah Syed; Umar, Irfan Naufal; Arshad, Muhammad Rafie Mohd.; Idrus, Rosnah; Rahman, Habsah Abdul

    2010-01-01

    This "Project Sheet" describes an on-going project that is being carried out by a group of educational researchers, computer science researchers and librarians from Universiti Sains Malaysia, Penang. The Malaysian Education Index (MEI) has two main functions--(1) Online Indexing System, and (2) Online Repository System. In this brief…

  15. 76 FR 43347 - Notice Pursuant to the National Cooperative Research and Production Act of 1993; Network Centric...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... circumstances. Specifically, Wakelight Technologies, Inc., Honolulu, HI; LinQuest Corporation, Los Angeles, CA; and Computer Sciences Corporation, Rockville, MD, have withdrawn as parties to this venture. In... activity of the group research project. Membership in this group research project remains open, and NCOIC...

  16. A Project-Based Biologically-Inspired Robotics Module

    ERIC Educational Resources Information Center

    Crowder, R. M.; Zauner, K.-P.

    2013-01-01

    The design of any robotic system requires input from engineers from a variety of technical fields. This paper describes a project-based module, "Biologically-Inspired Robotics," that is offered to Electronics and Computer Science students at the University of Southampton, U.K. The overall objective of the module is for student groups to…

  17. Reducing Nutrients and Nutrient Impacts Priority Issue Team - St. Louis Bay Project: Implementing Nutrients PIT Action Step 1.1

    NASA Technical Reports Server (NTRS)

    Mason, Ted

    2011-01-01

    The NASA Applied Science & Technology Project Office at Stennis Space Center (SSC) used satellites, in-situ measurements and computational modeling to study relationships between water quality in St. Louis Bay, Mississippi and the watershed characteristics of the Jourdan and Wolf rivers from 2000-2010.

  18. Integrating Bar-Code Medication Administration Competencies in the Curriculum: Implications for Nursing Education and Interprofessional Collaboration.

    PubMed

    Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L

    This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.

  19. Implementation of Project Based Learning in Mechatronic Lab Course at Bandung State Polytechnic

    ERIC Educational Resources Information Center

    Basjaruddin, Noor Cholis; Rakhman, Edi

    2016-01-01

    Mechatronics is a multidisciplinary field that includes a combination of mechanics, electronics, control systems, and computer science. The main objective of mechatronics learning is to establish a comprehensive mindset in the development of mechatronic systems. Project Based Learning (PBL) is an appropriate method for use in the learning process of…

  20. A Seminar in Mathematical Model-Building.

    ERIC Educational Resources Information Center

    Smith, David A.

    1979-01-01

    A course in mathematical model-building is described. Suggested modeling projects include: urban problems, biology and ecology, economics, psychology, games and gaming, cosmology, medicine, history, computer science, energy, and music. (MK)

  1. Utilization of computer technology by science teachers in public high schools and the impact of standardized testing

    NASA Astrophysics Data System (ADS)

    Priest, Richard Harding

    A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what effects standardized testing has on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted in order to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. Contributing factors to why science teachers do not use computers included not having enough up-to-date computers in their classrooms and other educational commitments and duties that do not leave them enough time to prepare lessons that include technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed more in-service technology training and follow-up activities to support that training are needed and more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.
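
    As an illustration of the two analysis methods named (frequency distributions and multiple regression), a minimal sketch follows; the survey items and data are simulated placeholders and the model is far simpler than the dissertation's:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)

      # Hypothetical survey responses: Likert-scale barrier/facilitator items.
      survey = pd.DataFrame({
          "uptodate_computers": rng.integers(1, 6, 200),
          "prep_time":          rng.integers(1, 6, 200),
          "admin_support":      rng.integers(1, 6, 200),
      })
      # Simulated outcome: weekly hours of classroom computer use.
      survey["hours_computer_use"] = (
          0.8 * survey["uptodate_computers"] + 0.5 * survey["prep_time"] + rng.normal(0, 1, 200)
      )

      # Frequency distribution for one item
      print(survey["uptodate_computers"].value_counts().sort_index())

      # Multiple regression: which items predict frequency of classroom computer use?
      X = sm.add_constant(survey[["uptodate_computers", "prep_time", "admin_support"]])
      print(sm.OLS(survey["hours_computer_use"], X).fit().summary())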

  2. Using "Facebook" to Improve Communication in Undergraduate Software Development Teams

    ERIC Educational Resources Information Center

    Charlton, Terence; Devlin, Marie; Drummond, Sarah

    2009-01-01

    As part of the CETL ALiC initiative (Centre of Excellence in Teaching and Learning: Active Learning in Computing), undergraduate computing science students at Newcastle and Durham universities participated in a cross-site team software development project. To ensure we offer adequate resources to support this collaboration, we conducted an…

  3. Student Response to Hypermedia in the Lecture Theatre: A Case Study.

    ERIC Educational Resources Information Center

    Conway, Damian

    The Computer Science Department at Monash University (Victoria, Australia) recently began presenting lectures using projection of a hypertext system, HyperLecture, running on a notebook computer as the primary medium. This paper presents a statistical analysis of student reactions to this approach, focusing on the effects, as perceived by the…

  4. ODU-CAUSE: Computer Based Learning Lab.

    ERIC Educational Resources Information Center

    Sachon, Michael W.; Copeland, Gary E.

    This paper describes the Computer Based Learning Lab (CBLL) at Old Dominion University (ODU) as a component of the ODU-Comprehensive Assistance to Undergraduate Science Education (CAUSE) Project. Emphasis is directed to the structure and management of the facility and to the software under development by the staff. Serving the ODU-CAUSE User Group…

  5. Creation and Development of an Integrated Model of New Technologies and ESP

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus

    2004-01-01

    It seems irrefutable that the world is progressing in concert with computer science. Educational applications and projects for first and second language acquisition have not been left behind. However, currently it seems that the reputation of completely computer-based language learning courses has taken a nosedive, and, consequently there has been…

  6. A Project-Based Learning Setting to Human-Computer Interaction for Teenagers

    ERIC Educational Resources Information Center

    Geyer, Cornelia; Geisler, Stefan

    2012-01-01

    Knowledge of the fundamentals of human-computer interaction and usability engineering is becoming more and more important in technical domains. However, this interdisciplinary field of work and the corresponding degree programs are not broadly known. Therefore, at the Hochschule Ruhr West, University of Applied Sciences, a program was developed to give…

  7. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  8. Earth Science Informatics - Overview

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2015-01-01

    Over the last 10-15 years, significant advances have been made in information management, there are an increasing number of individuals entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes nearly 150 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. Keywords: Remote Sensing; Earth Science Informatics; Data Systems; Data Services; Metadata

  9. Uses of the Drupal CMS Collaborative Framework in the Woods Hole Scientific Community (Invited)

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T. T.; Shorthouse, D.; Furfey, J.; Miller, H.

    2010-12-01

    Organizations that comprise the Woods Hole scientific community (Woods Hole Oceanographic Institution, Marine Biological Laboratory, USGS Woods Hole Coastal and Marine Science Center, Woods Hole Research Center, NOAA NMFS Northeast Fisheries Science Center, SEA Education Association) have a long history of collaborative activity regarding computing, computer network and information technologies that support common, inter-disciplinary science needs. Over the past several years there has been growing interest in the use of the Drupal Content Management System (CMS) playing a variety of roles in support of research projects resident at several of these organizations. Many of these projects are part of science programs that are national and international in scope. Here we survey the current uses of Drupal within the Woods Hole scientific community and examine reasons it has been adopted. The promise of emerging semantic features in the Drupal framework is examined and projections of how pre-existing Drupal-based websites might benefit are made. Closer examination of Drupal software design exposes it as more than simply a content management system. The flexibility of its architecture; the power of its taxonomy module; the care taken in nurturing the open-source developer community that surrounds it (including organized and often well-attended code sprints); the ability to bind emerging software technologies as Drupal modules; the careful selection process used in adopting core functionality; multi-site hosting and cross-site deployment of updates and a recent trend towards development of use-case inspired Drupal distributions casts Drupal as a general-purpose application deployment framework. Recent work in the semantic arena casts Drupal as an emerging RDF framework as well. Examples of roles played by Drupal-based websites within the Woods Hole scientific community that will be discussed include: science data metadata database, organization main website, biological taxonomy development, bibliographic database, physical media data archive inventory manager, disaster-response website development framework, science project task management, science conference planning, and spreadsheet-to-database converter.

  10. Practical Steps toward Computational Unification: Helpful Perspectives for New Systems, Adding Functionality to Existing Ones

    NASA Astrophysics Data System (ADS)

    Troy, R. M.

    2005-12-01

    With ever increasing amounts of Earth-Science funding being diverted to the war in Iraq, the Earth-Science community must now more than ever wring every bit of utility out of every dollar. We're not likely to get funded any projects perceived by others as "pie in the sky", so we have to look at already funded programs within our community and directing new programs in a unifying direction. We have not yet begun the transition to a computationally unifying, general-purpose Earth Science computing paradigm, though it was proposed at the Fall 2002 AGU meeting in San Francisco, and perhaps earlier. Encouragingly, we do see a recognition that more commonality is needed as various projects have as funded goals the addition of the processing and dissemination of new datatypes, or data-sets, if you prefer, to their existing repertoires. Unfortunately, the timelines projected for adding a datatype to an existing system are typically estimated at around two years each. Further, many organizations have the perception that they can only use their dollars to support exclusively their own needs as they don't have the money to support the goals of others, thus overlooking opportunities to satisfy their own needs while at the same time aiding the creation of a global GeoScience cyber-infrastructure. While Computational Unification appears to be an unfunded, impossible dream, at least for now, individual projects can take steps that are compatible with a unified community and can help build one over time. This session explores these opportunities. The author will discuss the issues surrounding this topic, outlining alternative perspectives on the points of difficulty, and proposing straight-forward solutions which every Earth Science data processing system should consider. Sub-topics include distributed meta-data, distributed processing, distributed data objects, interdisciplinary concerns, and scientific defensibility with an overall emphasis on how previously written processes and functions may be integrated into a system efficiently, with minimal effort, and with an eye toward an eventual Computational Unification of the Earth Sciences. A fundamental to such systems is meta-data which describe not only the content of data but also how intricate relationships are represented and used to good advantage. Retrieval techniques will be discussed including trade-offs in using externally managed meta-data versus embedded meta-data, how the two may be integrated, and how "simplifying assumptions" may or may not actually be helpful. The perspectives presented in this talk or poster session are based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, which sought to unify NASA's Mission To Planet Earth's EOS-DIS, and on-going experience developed by Science Tools corporation, of which the author is a principal. NOTE: These ideas are most easily shared in the form of a talk, and we suspect that this session will generate a lot of interest. We would therefore prefer to have this session accepted as a talk as opposed to a poster session.

  11. XPRESS: eXascale PRogramming Environment and System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brightwell, Ron; Sterling, Thomas; Koniges, Alice

    The XPRESS Project is one of four major projects of the DOE Office of Science Advanced Scientific Computing Research X-stack Program initiated in September, 2012. The purpose of XPRESS is to devise an innovative system software stack to enable practical and useful exascale computing around the end of the decade with near-term contributions to efficient and scalable operation of trans-Petaflops performance systems in the next two to three years; both for DOE mission-critical applications. To this end, XPRESS directly addresses critical challenges in computing of efficiency, scalability, and programmability through introspective methods of dynamic adaptive resource management and task scheduling.

  12. A Science Information Infrastructure for Access to Earth and Space Science Data through the Nation's Science Museums

    NASA Technical Reports Server (NTRS)

    Murray, S.

    1999-01-01

    In this project, we worked with the University of California at Berkeley/Center for Extreme Ultraviolet Astrophysics and five science museums (the National Air and Space Museum, the Science Museum of Virginia, the Lawrence Hall of Science, the Exploratorium, and the New York Hall of Science) to formulate plans for computer-based laboratories located at these museums. These Science Learning Laboratories would be networked and provided with real Earth and space science observations, as well as appropriate lesson plans, that would allow the general public to directly access and manipulate the actual remote sensing data, much as a scientist would.

  13. New project to support scientific collaboration electronically

    NASA Astrophysics Data System (ADS)

    Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.

    A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.

  14. Advanced Hard Real-Time Operating System, The Maruti Project. Part 1.

    DTIC Science & Technology

    1997-01-01

    ADVANCED HARD REAL-TIME OPERATING SYSTEM, THE MARUTI PROJECT, Part 1 of 2. Ashok K. Agrawala and Satish K. Tripathi, Department of Computer Science, University of Maryland. Contract DASG-60-92-C-0055; Program Element 62301E. ...a real-time operating system developed at the University of Maryland, and conducted extensive experiments under various task

  15. Embracing Diversity: The Exploration of User Motivations in Citizen Science Astronomy Projects

    NASA Astrophysics Data System (ADS)

    Lee, Lo

    2018-06-01

    Online citizen science projects ask members of the public to donate spare time on their personal computers to process large datasets. A critical challenge for these projects is volunteer recruitment and retention. Many of these projects use Berkeley Open Infrastructure for Network Computing (BOINC), a piece of middleware, to support their operations. This poster analyzes volunteer motivations in two large, BOINC-based astronomy projects, Einstein@Home and Milkyway@Home. Volunteer opinions are addressed to assess whether and how competitive elements, such as credit and ranking systems, motivate volunteers. Findings from a study of project volunteers, comprising surveys (n=2,031) and follow-up interviews (n=21), show that altruism is the main incentive for participation because volunteers consider scientific research to be critical for humans. Multiple interviewees also revealed a passion for extrinsic motivations, i.e. those that involve recognition from other people, such as opportunities to become co-authors of publications or to earn financial benefits. Credit and ranking systems motivate nearly half of interviewees. By analyzing user motivations in astronomical BOINC projects, this research provides scientists with deeper understandings about volunteer communities and various types of volunteers. Building on these findings, scientists can develop different strategies, for example, awarding volunteers badges, to recruit and retain diverse volunteers, and thus enhance long-term user participation in astronomical BOINC projects.

  16. Continued multidisciplinary project-based learning - implementation in health informatics.

    PubMed

    Wessel, C; Spreckelsen, C

    2009-01-01

    Problem- and project-based learning are approved methods to train students, graduates and post-graduates in scientific and other professional skills. The students are trained on realistic scenarios in a broader context. For students specializing in health informatics we introduced continued multidisciplinary project-based learning (CM-PBL) at a department of medical informatics. The training approach addresses both students of medicine and students of computer science. The students are full members of an ongoing research project and develop a project-related application or module, or explore or evaluate a sub-project. Two teachers guide and review the students' work. The training on scientific work follows a workflow with defined milestones. The team acts as peer group. By participating in the research team's work the students are trained on professional skills. A research project on a web-based information system on hospitals built the scenario for the realistic context. The research team consisted of up to 14 active members at a time, who were scientists and students of computer science and medicine. The well communicated educational approach and team policy fostered the participation of the students. Formative assessment and evaluation showed a considerable improvement of the students' skills and a high participant satisfaction. Alternative education approaches such as project-based learning empower students to acquire scientific knowledge and professional skills, especially the ability of life-long learning, multidisciplinary team work and social responsibility.

  17. Final Report. Institute for Ultralscale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  18. Riding the Hype Wave: Evaluating new AI Techniques for their Applicability in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Zhang, J.; Maskey, M.; Lee, T. J.

    2016-12-01

    Every few years a new technology rides the hype wave generated by the computer science community. Converts to this new technology who surface from both the science community and the informatics community promulgate that it can radically improve or even change the existing scientific process. Recent examples of new technology following in the footsteps of "big data" now include deep learning algorithms and knowledge graphs. Deep learning algorithms mimic the human brain and process information through multiple stages of transformation and representation. These algorithms are able to learn complex functions that map pixels directly to outputs without relying on human-crafted features and solve some of the complex classification problems that exist in science. Similarly, knowledge graphs aggregate information around defined topics that enable users to resolve their query without having to navigate and assemble information manually. Knowledge graphs could potentially be used in scientific research to assist in hypothesis formulation, testing, and review. The challenge for the Earth science research community is to evaluate these new technologies by asking the right questions and considering what-if scenarios. What is this new technology enabling/providing that is innovative and different? Can one justify the adoption costs with respect to the research returns? Since nothing comes for free, utilizing a new technology entails adoption costs that may outweigh the benefits. Furthermore, these technologies may require significant computing infrastructure in order to be utilized effectively. Results from two different projects will be presented along with lessons learned from testing these technologies. The first project primarily evaluates deep learning techniques for different applications of image retrieval within Earth science while the second project builds a prototype knowledge graph constructed for Hurricane science.
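
    The pixel-to-label idea described above can be made concrete with a small example. The sketch below is illustrative only and is not either project's actual model: it defines a tiny convolutional network in PyTorch that maps raw image pixels directly to class scores, with the class count, image size, and random stand-in data all assumed for the demo.

        # Illustrative only: a tiny convolutional network that maps image pixels
        # directly to class labels, in the spirit of the deep-learning approach
        # described above. Class count and tensor shapes are made up for the demo.
        import torch
        import torch.nn as nn

        class TinyCNN(nn.Module):
            def __init__(self, n_classes: int = 4):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),                      # 64x64 -> 32x32
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),              # global average pooling
                )
                self.classifier = nn.Linear(32, n_classes)

            def forward(self, x):
                x = self.features(x)
                return self.classifier(x.flatten(1))

        # One training step on random stand-in data (real use would load labeled imagery).
        model = TinyCNN()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        images = torch.rand(8, 3, 64, 64)          # batch of 8 RGB 64x64 "scenes"
        labels = torch.randint(0, 4, (8,))         # hypothetical class indices
        loss = nn.functional.cross_entropy(model(images), labels)
        loss.backward()
        optimizer.step()
        print(f"training loss: {loss.item():.3f}")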

  19. 1976 annual summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-01

    Abstracts of papers published during the previous calendar year, arranged in accordance with the project titles used in the USDOE Schedule 189 Budget Proposals, are presented. The collection of abstracts supplements the listing of papers published in the Schedule 189. The following subject areas are represented: high-energy physics; nuclear physics; basic energy sciences (nuclear science, materials sciences, solid state physics, materials chemistry); molecular, mathematical, and earth sciences (fundamental interactions, processes and techniques, mathematical and computer sciences); environmental research and development; physical and technological studies (characterization, measurement and monitoring); and nuclear research and applications.

  20. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  1. Service engineering for grid services in medicine and life science.

    PubMed

    Weisbecker, Anette; Falkner, Jürgen

    2009-01-01

    Clearly defined services with appropriate business models are necessary in order to exploit the benefit of grid computing for industrial and academic users in medicine and life sciences. In the project Services@MediGRID the service engineering approach is used to develop those clearly defined grid services and to provide sustainable business models for their usage.

  2. Introducing Project-Based Instruction in the Saudi ESP Classroom: A Study in Qassim University

    ERIC Educational Resources Information Center

    Alsamani, Abdul-Aziz Saleh; Daif-Allah, Ayman Sabry

    2016-01-01

    The aim of this paper is to study the impact of introducing an integrative pedagogical approach in the ESP classes on developing the English language vocabulary of Computer Science and Information Technology students in the College of Science, Qassim University. The study suggests a framework for an ESP course-design employing students' project…

  3. The Human Genome Project: big science transforms biology and medicine

    PubMed Central

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project. PMID:24040834

  4. The impact of computer-based versus "traditional" textbook science instruction on selected student learning outcomes

    NASA Astrophysics Data System (ADS)

    Rothman, Alan H.

    This study reports the results of research designed to examine the impact of computer-based science instruction on elementary school level students' science content achievement, their attitude about science learning, their level of critical thinking-inquiry skills, and their level of cognitive and English language development. The study compared these learning outcomes resulting from a computer-based approach with the learning outcomes from a traditional, textbook-based approach to science instruction. The computer-based approach was inherent in a curriculum titled The Voyage of the Mimi, published by The Bank Street College Project in Science and Mathematics (1984). The study sample included 209 fifth-grade students enrolled in three schools in a suburban school district. This sample was divided into three groups, each receiving one of the following instructional treatments: (a) Mixed instruction, primarily based on the use of a hardcopy textbook in conjunction with computer-based instructional materials as one component of the science course; (b) Non-Traditional, Technology-Based instruction, fully utilizing computer-based material; and (c) Traditional, Textbook-Based instruction, utilizing only the textbook as the basis for instruction. Pre-test, or pre-treatment, data related to each of the student learning outcomes were collected at the beginning of the school year and post-test data were collected at the end of the school year. Pre-test data were used as a covariate in the statistical analyses to account for possible pre-existing differences among the three student groups on the variables examined. This study concluded that non-traditional, computer-based instruction in science significantly improved students' attitudes toward science learning and their level of English language development. Non-significant, positive trends were found for the following student learning outcomes: overall science achievement and development of critical thinking-inquiry skills. These conclusions support the value of a non-traditional, computer-based approach to instruction, as exemplified by The Voyage of the Mimi curriculum, and reinforce recommendations for reform in science teaching that call for the use of computer technology to enhance learning outcomes and to help reverse the relatively poor science performance of American students documented by the 1996 Third International Mathematics and Science Study (TIMSS).
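
    The covariate design mentioned above (pre-test scores used to adjust post-test comparisons across the treatment groups) can be sketched in a few lines. This is not the study's actual analysis; the synthetic data, group labels, and effect sizes below are invented purely to show the ANCOVA pattern with statsmodels.

        # Illustrative ANCOVA sketch: post-test scores modeled on treatment group
        # with the pre-test score as a covariate. All data here are synthetic.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 60
        group = rng.choice(["textbook", "mixed", "computer"], size=n)
        pre = rng.normal(50, 10, size=n)
        effect = {"textbook": 0.0, "mixed": 2.0, "computer": 4.0}   # invented effects
        post = pre + np.array([effect[g] for g in group]) + rng.normal(0, 5, size=n)

        df = pd.DataFrame({"group": group, "pre": pre, "post": post})
        model = smf.ols("post ~ pre + C(group)", data=df).fit()
        print(model.summary().tables[1])   # group coefficients adjusted for pre-test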

  5. Earth Science Informatics - Overview

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2017-01-01

    Over the last 10-15 years, significant advances have been made in information management, an increasing number of individuals have entered the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes over 180 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies.

  6. Earth Science Informatics - Overview

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2017-01-01

    Over the last 10-15 years, significant advances have been made in information management, an increasing number of individuals have entered the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes over 180 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. The talk will present an overview of current efforts in ESI, describe the role that members of IEEE GRSS play, and discuss recent developments in data preservation and provenance.

  7. Collaborations in art/science: Renaissance teams.

    PubMed

    Cox, D J

    1991-01-01

    A Renaissance Team is a group of specialists who collaborate and provide synergism in the quest for knowledge and information. Artists can participate in Renaissance Teams with scientists and computer specialists for scientific visualization projects. Some projects are described in which the author functioned as programmer and color expert, as interface designer, as visual paradigm maker, as animator, and as producer. Examples are provided for each of these five projects.

  8. The Grinnell Science Project: Results of Over Two Decades of Reform Aimed at Inclusion in Science and Mathematics

    NASA Astrophysics Data System (ADS)

    Mahlab, Minna; Grinnell Science Project Team--Grinnell College

    2015-01-01

    The Grinnell Science Project (GSP) is a program that was developed starting in the early 1990s at Grinnell College -- a selective liberal arts college in Grinnell, Iowa. The GSP program is committed to developing the talents of all students interested in science and mathematics, especially those from groups underrepresented in the sciences -- students of color, first-generation college students, and women in physics, mathematics and computer science. The program developed over several years, drawing on national studies and efforts, and aimed at addressing barriers to success in the sciences. It has involved curricular and mentoring changes, activities and structures that foster acclimation to college life and a community of scientists, and improvement of student achievement. Prior to the full implementation of the Grinnell Science Project, from 1992-1994, an average of 42 science majors graduated annually who were women and eight who were students of color. By 2008, those numbers had jumped to 90 women (a 114% increase) and 21 students of color (a 162.5% increase). In 2009, the GSP was honored with the Presidential Award for Excellence in Science, Mathematics, and Engineering Mentoring, administered by the National Science Foundation. Components of the GSP are now mainstream throughout the science curriculum at Grinnell, and almost all science and math faculty have played some role in the program.

  9. Computer-assisted instruction

    NASA Technical Reports Server (NTRS)

    Atkinson, R. C.

    1974-01-01

    The results are presented of a project of research and development on strategies for optimizing the instructional process, and dissemination of information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.

  10. The AIST Managed Cloud Environment

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2016-12-01

    ESTO is currently in the process of developing and implementing the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to ESTO-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs will allow them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE will facilitate infusion and end user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  11. The AMCE (AIST Managed Cloud Environment)

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2017-12-01

    ESTO has developed and implemented the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to SMD-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs allows them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE facilitates infusion and end user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  12. Web-Based Instruction in Physics Courses

    NASA Astrophysics Data System (ADS)

    Wijekumar, V.

    1998-05-01

    The World Wide Web will be utilized to deliver instructional materials in physics courses in two cases. In one case, a set of physics courses will be taught entirely using the WWW for high school science and mathematics teachers in the physics certification program. In the other case, the WWW will be used to enhance the linkage between the laboratory courses in medical physics, human physiology and clinical nursing courses for nursing students. This project links three departments in two colleges to enhance a project known as Integrated Computer System across the Health Science Curriculum. Partial support for this work was provided by the National Science Foundation's Division of Undergraduate Education through grant DUE # 9650793.

  13. Live theater on a virtual stage: incorporating soft skills and teamwork in computer graphics education.

    PubMed

    Schweppe, M; Geigel, J

    2011-01-01

    Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.

  14. The fusion of biology, computer science, and engineering: towards efficient and successful synthetic biology.

    PubMed

    Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J

    2012-01-01

    Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.

  15. Centre for Research Infrastructure of Polish GNSS Data - response and possible contribution to EPOS

    NASA Astrophysics Data System (ADS)

    Araszkiewicz, Andrzej; Rohm, Witold; Bosy, Jaroslaw; Szolucha, Marcin; Kaplon, Jan; Kroszczynski, Krzysztof

    2017-04-01

    In the frame of the first call under Action 4.2 (Development of Modern Research Infrastructure of the Science Sector) of the Smart Growth Operational Programme 2014-2020, the "EPOS-PL" project was launched in late 2016. The following institutes are responsible for the implementation of this project: Institute of Geophysics, Polish Academy of Sciences - Project Leader, Academic Computer Centre Cyfronet AGH University of Science and Technology, Central Mining Institute, the Institute of Geodesy and Cartography, Wrocław University of Environmental and Life Sciences, Military University of Technology. In addition, resources constituting the entrepreneur's own contribution will come from the Polish Mining Group. The Research Infrastructure EPOS-PL will integrate both existing and newly built National Research Infrastructures (Theme Centre for Research Infrastructures), which, under the premise of the EPOS program, are financed exclusively from national funds. In addition, an e-science platform will be developed. The Centre for Research Infrastructure of GNSS Data (CIBDG - Task 5) will be built based on the experience and facilities of two institutions: the Military University of Technology and the Wrocław University of Environmental and Life Sciences. The project includes the construction of the National GNSS Repository with data QC procedures and the adaptation of two Regional GNSS Analysis Centres for rapid and long-term geodynamical monitoring.

  16. The open science grid

    NASA Astrophysics Data System (ADS)

    Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob

    2007-07-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  17. The Impact of an Interdisciplinary Space Program on Computer Science Student Learning

    ERIC Educational Resources Information Center

    Straub, Jeremy; Marsh, Ronald; Whalen, David

    2015-01-01

    Project-based learning and interdisciplinary projects present an opportunity for students to learn both technical skills and other skills which are relevant to their workplace success. This paper presents an assessment of the educational impact of the OpenOrbiter program, a student-run, interdisciplinary CubeSat (a type of small satellite with…

  18. Developing and Integrating a Web-Based Quiz into the Curriculum.

    ERIC Educational Resources Information Center

    Carbone, Angela; Schendzielorz, Peter

    In 1996, the Department of Computer Science at Monash University (Australia) implemented a First Year Advanced Students' Project Scheme aimed at extending and stimulating its best first year students. The goal of the scheme was to give students the opportunity to work on a project that best suited their needs and captured their interests. One of…

  19. Higher Achievement and Improvement through Instruction with Computers and Scholarly Transition and Resource Systems Program. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Plotkin, Donna

    Project HAITI STARS served 360 students, native speakers of Haitian Creole, Spanish, and Chinese, in its first year through supplementary instruction in English as a Second Language (ESL), native language arts (NLA), and bilingual mathematics, science, and social studies. The project provided students with academic and personal counseling,…

  20. Enhancing Project-Based Learning in Software Engineering Lab Teaching through an E-Portfolio Approach

    ERIC Educational Resources Information Center

    Macias, J. A.

    2012-01-01

    Project-based learning is one of the main successful student-centered pedagogies broadly used in computing science courses. However, this approach can be insufficient when dealing with practical subjects that implicitly require many deliverables and a great deal of feedback and organizational resources. In this paper, a worked e-portfolio is…

  1. Challenges in Mentoring Software Development Projects in the High School: Analysis According to Shulman's Teacher Knowledge Base Model

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2009-01-01

    This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…

  2. Flexible 2D RF Nanoelectronics based on Layered Semiconductor Transistor (NBIT III)

    DTIC Science & Technology

    2016-11-11

    Experimental and computational studies in the multidisciplinary fields of electrical engineering, mechanical engineering, and materials science were conducted to achieve the plan for this project. Approaches included electrostatic or physisorption gating, defect engineering, and substitutional doping during growth. These methods result in uniform doping or composition

  3. What Is the Predict Level of Which Computer Using Skills Measured in PISA for Achievement in Mathematics

    ERIC Educational Resources Information Center

    Ziya, Engin; Dogan, Nuri; Kelecioglu, Hulya

    2010-01-01

    This study aims at determining the extent to which computer using skills specified in the Programme for International Student Assessment (PISA) 2006 predict Turkish students' achievement in mathematics. Apart from questions on mathematics, science and reading competencies, a student questionnaire, a school questionnaire and a parent questionnaire were…

  4. From cosmos to connectomes: the evolution of data-intensive science.

    PubMed

    Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S

    2014-09-17

    The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology.
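
    The "query subsets of the data efficiently" pattern described above can be illustrated with a minimal sketch. An in-memory SQLite table stands in for a survey catalog; the table and column names are invented, but the idea is the same one SDSS-style services expose: the selection runs next to the data and only the matching rows are returned.

        # Illustrative only: the "query a subset, not the whole archive" pattern.
        # A tiny in-memory table stands in for a sky-survey catalog; real SDSS or
        # Open Connectome services expose comparable server-side queries.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE photo_obj (obj_id INTEGER, ra REAL, dec REAL, r_mag REAL)")
        con.executemany(
            "INSERT INTO photo_obj VALUES (?, ?, ?, ?)",
            [(1, 150.1, 2.2, 16.4), (2, 150.3, 2.1, 18.9), (3, 151.0, 2.5, 16.8)],
        )

        # Only the rows of interest cross the wire; the bulk of the catalog stays server-side.
        subset = con.execute(
            "SELECT obj_id, ra, dec, r_mag FROM photo_obj "
            "WHERE r_mag BETWEEN 16 AND 17 AND ra BETWEEN 150 AND 151"
        ).fetchall()
        print(subset)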

  5. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  6. BioSIGHT: Interactive Visualization Modules for Science Education

    NASA Technical Reports Server (NTRS)

    Wong, Wee Ling

    1998-01-01

    Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment for the development of innovative educational resources that promotes science literacy for our students and the general population as well. These issues must be addressed as NASA marches towards the goal of enabling human space exploration that requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application for space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, educational software designer, computer programmers as well as IMSC graduate and undergraduate students. Our collaborators include TERC, a research and education organization with extensive K-12 math and science curricula development from Cambridge, MA; SRI International of Menlo Park, CA; teachers and students from local area high schools (Newbury Park High School, USC's Family of Five schools, Chadwick School, and Pasadena Polytechnic High School).

  7. Zooniverse - Real science online with more than a million people. (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lynn, S.; Lintott, C.; Whyte, L.; Borden, K. A.

    2013-12-01

    The Zooniverse (zooniverse.org) began in 2007 with the launch of Galaxy Zoo, a project in which more than 175,000 people provided shape analyses of more than 1 million galaxy images sourced from the Sloan Digital Sky Survey. These galaxy 'classifications', some 60 million in total, have since been used to produce more than 50 peer-reviewed publications based not only on the original research goals of the project but also on serendipitous discoveries made by the volunteer community. Based upon the success of Galaxy Zoo, the team have gone on to develop more than 25 web-based citizen science projects, all with a strong research focus in a range of subjects from astronomy to zoology where human-based analysis still exceeds that of machine intelligence. Over the past 6 years Zooniverse projects have collected more than 300 million data analyses from over 1 million volunteers, providing fantastically rich datasets not only for the individuals working to produce research from their project but also for the machine learning and computer vision research communities. This talk will focus on the core 'method' by which Zooniverse projects are developed and on lessons learned by the Zooniverse team in developing citizen science projects across a range of disciplines.
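
    A common way such volunteer classifications become research-ready labels is by aggregating many independent answers per subject. The sketch below is a generic majority-vote aggregation, not the Zooniverse pipeline; the subject identifiers and labels are invented.

        # Illustrative sketch: combine several volunteers' labels for one subject
        # by majority vote, keeping the vote fraction as a rough confidence score.
        from collections import Counter

        def aggregate(classifications):
            """classifications: list of labels for one subject, one per volunteer."""
            counts = Counter(classifications)
            label, votes = counts.most_common(1)[0]
            return label, votes / len(classifications)

        subjects = {
            "galaxy_0001": ["spiral", "spiral", "elliptical", "spiral"],
            "galaxy_0002": ["elliptical", "elliptical", "merger"],
        }
        for subject_id, labels in subjects.items():
            consensus, confidence = aggregate(labels)
            print(f"{subject_id}: {consensus} ({confidence:.0%} agreement)")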

  8. The HSP, the QCN, and the Dragon: Developing inquiry-based QCN instructional modules in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W.; Chang, C.; Yen, E.; Lin, C.; Lin, G.

    2012-12-01

    The High Scope Program (HSP) is a long-term project funded by the NSC in Taiwan since 2006. It is designed to elevate the quality of science education by incorporating emerging science and technology into the traditional curricula of senior high schools. The Quake-Catcher Network (QCN), a distributed computing project initiated by Stanford University and UC Riverside, encourages volunteers to install low-cost, novel sensors at home and at school to build a seismic network. To meet both needs, we have developed a model curriculum that introduces QCN, earthquake science, and cloud computing into high school classrooms. Through professional development workshops, a Taiwan cloud-based earthquake science learning platform, and a QCN club on Facebook, we have worked closely with the Lan-Yang Girls' Senior High School teachers' team to design workable teaching plans based on the practical operation of seismic monitoring at home or at school. However, some obstacles to learning have appeared, including QCN installation and maintenance problems, the high self-noise of the sensor, and the difficulty high school teachers face in introducing earthquake science. The challenges of QCN outreach in Taiwan inform our future plans: (1) the development of easy, frequently updated, physics-based QCN experiments for high school teachers, and (2) the design of an interactive learning platform with social networking functions for students.
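
    One standard way a network of noisy, low-cost sensors flags candidate shaking is a short-term/long-term average (STA/LTA) trigger. The sketch below illustrates that idea on synthetic data; it is not QCN's actual detection algorithm, and the window lengths and threshold are arbitrary.

        # Illustrative STA/LTA trigger on a synthetic 1-D acceleration trace.
        # Not QCN's detection algorithm; windows and threshold are arbitrary.
        import numpy as np

        def sta_lta(signal, sta_len=50, lta_len=500):
            """Return the STA/LTA ratio for each sample of a 1-D trace."""
            power = signal.astype(float) ** 2
            sta = np.convolve(power, np.ones(sta_len) / sta_len, mode="same")
            lta = np.convolve(power, np.ones(lta_len) / lta_len, mode="same")
            return sta / np.maximum(lta, 1e-12)

        rng = np.random.default_rng(1)
        trace = rng.normal(0, 0.01, 5000)             # sensor self-noise
        trace[3000:3200] += rng.normal(0, 0.2, 200)   # synthetic burst of shaking
        ratio = sta_lta(trace)
        print("first trigger samples:", np.flatnonzero(ratio > 5.0)[:5])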

  9. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    ERIC Educational Resources Information Center

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was…

  10. Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H. (Editor)

    2000-01-01

    The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.

  11. Gravitational Many-Body Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makino, J.

    2008-04-29

    In this paper, we briefly review some aspects of the gravitational many-body problem, which is one of the oldest problems in modern mathematical science. We then review our GRAPE project to design computers specialized for this problem.

  12. Project : semi-autonomous parking for enhanced safety and efficiency.

    DOT National Transportation Integrated Search

    2016-04-01

    Index coding, a coding formulation traditionally analyzed in the theoretical computer science and information theory communities, has received considerable attention in recent years due to its value in wireless communications and networking probl...
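
    The index-coding idea behind this record can be shown with the classic two-receiver toy case: when each receiver already holds the packet the other one wants, a single XOR-coded broadcast satisfies both. The packet values below are arbitrary.

        # Worked toy example of index coding: two receivers each want a packet the
        # other one already holds as side information, so one XOR-coded broadcast
        # replaces two separate transmissions.
        x1, x2 = 0b10110100, 0b01101001      # packets wanted by receiver 1 and 2
        coded = x1 ^ x2                       # one broadcast symbol

        # Receiver 1 holds x2 as side information and recovers x1 (and vice versa).
        recovered_x1 = coded ^ x2
        recovered_x2 = coded ^ x1
        assert recovered_x1 == x1 and recovered_x2 == x2
        print("both receivers decoded from a single coded transmission")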

  13. Dynamics of list-server discussion on genetically modified foods.

    PubMed

    Triunfol, Marcia L; Hines, Pamela J

    2004-04-01

    Computer-mediated discussion lists, or list-servers, are popular tools in settings ranging from professional to personal to educational. A discussion list on genetically modified food (GMF) was created in September 2000 as part of the Forum on Genetically Modified Food developed by Science Controversies: Online Partnerships in Education (SCOPE), an educational project that uses computer resources to aid research and learning around unresolved scientific questions. The discussion list "GMF-Science" was actively supported from January 2001 to May 2002. The GMF-Science list welcomed anyone interested in discussing the controversies surrounding GMF. Here, we analyze the dynamics of the discussions and how the GMF-Science list may contribute to learning. Activity on the GMF-Science discussion list reflected some but not all the controversies that were appearing in more traditional publication formats, broached other topics not well represented in the published literature, and tended to leave undiscussed the more technical research developments.

  14. Spacecraft computer resource margin management. [of Project Galileo Orbiter in-flight reprogramming task

    NASA Technical Reports Server (NTRS)

    Larman, B. T.

    1981-01-01

    The Project Galileo Orbiter, with 18 microcomputers and the equivalent of 360K 8-bit bytes of memory contained within two major engineering subsystems and eight science instruments, requires that the key onboard computer system resources be managed in a very rigorous manner. Attention is given to the rationale behind the project policy, the development stage, the preliminary design stage, the design/implementation stage, and the optimization or 'scrubbing' stage. The implementation of the policy is discussed, taking into account the development of the Attitude and Articulation Control Subsystem (AACS) and the Command and Data Subsystem (CDS), the reporting of margin status, and the response to allocation oversubscription.
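
    The margin-management idea can be sketched as simple bookkeeping: compare each subsystem's memory use against its allocation and flag oversubscription. The numbers below are invented for illustration and are not Galileo's actual allocations.

        # Illustrative margin bookkeeping; allocations and usage are hypothetical.
        allocations = {"AACS": 32_768, "CDS": 65_536}       # bytes allocated (invented)
        usage = {"AACS": 29_200, "CDS": 66_100}             # bytes currently used (invented)

        for subsystem, allocated in allocations.items():
            used = usage[subsystem]
            margin = allocated - used
            status = "OK" if margin >= 0 else "OVERSUBSCRIBED"
            print(f"{subsystem}: {used}/{allocated} bytes, margin {margin:+d} ({status})")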

  15. Discover the Cosmos - Bringing Cutting Edge Science to Schools across Europe

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    2015-03-01

    The fast-growing number of science data repositories is opening enormous possibilities to scientists all over the world. The emergence of citizen science projects is engaging a large number of citizens globally in scientific discovery. Astronomical research is now possible for anyone with a computer and some form of data access. This opens a very interesting and strategic possibility to engage large audiences in the making and understanding of science. From another perspective, it is only natural to imagine that data mining will soon be an active part of the academic path of university or even secondary school students. The possibility is very exciting, but the road is not very promising. Even in the most developed nations, where all schools are equipped with modern ICT facilities, the use of such possibilities is still a very rare episode. The Galileo Teacher Training Program (GTTP), a legacy of IYA2009, is participating in some of the most emblematic projects funded by the European Commission and targeting modern tools, resources and methodologies for science teaching. One of these projects is Discover the Cosmos, which aims to address this issue by empowering educators with the necessary skills to embark on this innovative path: teaching science while doing science.

  16. Climate Science's Globally Distributed Infrastructure

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2016-12-01

    The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.
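
    Federated access of the kind described above is typically exercised through a search API on any index node. The sketch below queries one such endpoint; the host name, path, and facet names are assumptions drawn from public ESGF documentation and may differ between nodes or software versions.

        # Hedged sketch of querying a federated ESGF index node over its RESTful
        # search API. Host, path, and facet names are assumptions and may differ.
        import requests

        params = {
            "project": "CMIP6",          # facet: which intercomparison project
            "variable_id": "tas",        # facet: near-surface air temperature
            "format": "application/solr+json",
            "limit": 3,
        }
        resp = requests.get("https://esgf-node.llnl.gov/esg-search/search",
                            params=params, timeout=30)
        resp.raise_for_status()
        for doc in resp.json()["response"]["docs"]:
            print(doc.get("id"))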

  17. Health sciences library building projects, 1996-1997 survey.

    PubMed Central

    Bowden, V M

    1998-01-01

    Nine building projects are briefly described, including four new libraries, two renovations, and three combined renovations and additions. The libraries range in size from 657 square feet to 136,832 square feet, with seating varying from 14 to 635. Three hospital libraries and four academic health sciences libraries are described in more detail. In each case an important consideration was the provision for computer access. Two of the libraries expanded their space for historical collections. Three of the libraries added mobile shelving as a way of storing print materials while providing space for other activities. PMID:9549012

  18. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  19. NASA/DoD Aerospace Knowledge Diffusion Research Project. Report Number 19. The U. S. Government Technical Report and the Transfer of Federally Funded Aerospace R&D: An Analysis of Five Studies

    DTIC Science & Technology

    1994-01-01

    Defined etymologically, according to report content and method (U.S. Department of Defense, 1964); behaviorally, according to the influence on the reader... The accompanying survey instrument asked respondents to indicate the field of their work from categories including aeronautics, astronautics, mathematical and computer sciences, materials and chemistry, engineering, physics, geosciences, space sciences, and life sciences.

  20. 1996 Laboratory directed research and development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, C.E.; Harvey, C.L.; Lopez-Andreas, L.M.

    This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 1996. In addition to a programmatic and financial overview, the report includes progress reports from 259 individual R&D projects in seventeen categories. The general areas of research include: engineered processes and materials; computational and information sciences; microelectronics and photonics; engineering sciences; pulsed power; advanced manufacturing technologies; biomedical engineering; energy and environmental science and technology; advanced information technologies; counterproliferation; advanced transportation; national security technology; electronics technologies; idea exploration and exploitation; production; and science at the interfaces - engineering with atoms.

  1. Entropy Masking

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Stone, Leland (Technical Monitor)

    1997-01-01

    This paper details two projects that use the World Wide Web (WWW) for dissemination of curricula that focus on remote sensing. 1) Presenting grade-school students with the concepts used in remote sensing involves educating the teacher and then providing the teacher with lesson plans. In a NASA-sponsored project designed to introduce students in grades 4 through 12 to some of the ideas and terminology used in remote sensing, teachers from local grade schools and middle schools were recruited to write lessons about remote sensing concepts they could use in their classrooms. Twenty-two lessons were produced and placed in seven modules that include: the electromagnetic spectrum, two- and three-dimensional perception, maps and topography, scale, remote sensing, biotic and abiotic concepts, and landscape change. Each lesson includes a section that evaluates what students have learned by doing the exercise. The lessons, instead of being published in a workbook and distributed to a limited number of teachers, have been placed on a WWW server, enabling much broader access to the package. This arrangement also allows for the lessons to be modified after feedback from teachers accessing the package. 2) Two-year colleges serve to teach trade skills, prepare students for enrollment in senior institutions of learning, and, more and more, retrain students who have college degrees in new technologies and skills. A NASA-sponsored curriculum development project is producing a curriculum using remote sensing analysis and Earth science applications. The project has three major goals. First, it will implement the use of remote sensing data in a broad range of community college courses. Second, it will create curriculum modules and classes that are transportable to other community colleges. Third, the project will be an ongoing source of data and curricular materials to other community colleges. The curriculum will have these course pathways to a certificate: a) a Science emphasis, b) an Arts and Letters emphasis, and c) a Computer Science emphasis. Each pathway includes course work in remote sensing, geographical information systems (GIS), computer science, Earth science, software and technology utilization, and communication. Distribution of products from this project to other two-year colleges will be accomplished using the WWW.

  2. Langley Aerospace Research Summer Scholars. Part 2

    NASA Technical Reports Server (NTRS)

    Schwan, Rafaela (Compiler)

    1995-01-01

    The Langley Aerospace Research Summer Scholars (LARSS) Program was established by Dr. Samuel E. Massenberg in 1986. The program has increased from 20 participants in 1986 to 114 participants in 1995. The program is LaRC-unique and is administered by Hampton University. The program was established for the benefit of undergraduate juniors and seniors and first-year graduate students who are pursuing degrees in aeronautical engineering, mechanical engineering, electrical engineering, material science, computer science, atmospheric science, astrophysics, physics, and chemistry. Two primary elements of the LARSS Program are: (1) a research project to be completed by each participant under the supervision of a researcher who will assume the role of a mentor for the summer, and (2) technical lectures by prominent engineers and scientists. Additional elements of this program include tours of LARC wind tunnels, computational facilities, and laboratories. Library and computer facilities will be available for use by the participants.

  3. Technical Reports: Langley Aerospace Research Summer Scholars. Part 1

    NASA Technical Reports Server (NTRS)

    Schwan, Rafaela (Compiler)

    1995-01-01

    The Langley Aerospace Research Summer Scholars (LARSS) Program was established by Dr. Samuel E. Massenberg in 1986. The program has increased from 20 participants in 1986 to 114 participants in 1995. The program is LaRC-unique and is administered by Hampton University. The program was established for the benefit of undergraduate juniors and seniors and first-year graduate students who are pursuing degrees in aeronautical engineering, mechanical engineering, electrical engineering, material science, computer science, atmospheric science, astrophysics, physics, and chemistry. Two primary elements of the LARSS Program are: (1) a research project to be completed by each participant under the supervision of a researcher who will assume the role of a mentor for the summer, and (2) technical lectures by prominent engineers and scientists. Additional elements of this program include tours of LARC wind tunnels, computational facilities, and laboratories. Library and computer facilities will be available for use by the participants.

  4. Extension of TVCAI Project to Include Demonstration of Intelligent Videodisc System. Hardware, Software, and Courseware Implementation Component. Final Report.

    ERIC Educational Resources Information Center

    Brandt, Richard C.; Knapp, Barbara H.

    This project, stemming from work started under the National Science Foundation grant "Development of a Television Computer Assisted Instruction (TVCAI) System" SER-7806412, called for the transfer to videodisc of some of the videotape materials developed under the grant. Three efforts were included in the proposal: design and development…

  5. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.
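
    The abstract names the particle-in-cell (PIC) method as the basis of the code library. As a rough illustration of what one PIC cycle involves (charge deposition, field solve, particle push), a generic one-dimensional electrostatic sketch is given below; the grid size, particle count, and units are made up, and this is not part of the ISTC project's library.

      # Minimal 1D electrostatic particle-in-cell (PIC) loop; illustrative only.
      import numpy as np

      np.random.seed(0)
      L, ng, npart, dt, qm = 1.0, 64, 10000, 0.05, -1.0   # domain length, grid cells, particles, time step, charge/mass
      dx = L / ng
      x = np.random.uniform(0, L, npart)                  # particle positions
      v = np.random.normal(0.0, 0.1, npart)               # particle velocities

      for step in range(100):
          # 1) deposit particle charge onto the grid (nearest-grid-point weighting)
          cells = (x / dx).astype(int) % ng
          rho = np.bincount(cells, minlength=ng) / (npart / ng) - 1.0   # density with neutralizing background
          # 2) solve Poisson's equation for the potential with an FFT (periodic domain)
          k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
          k[0] = 1.0                                       # avoid division by zero; mean field set to zero below
          phi_k = np.fft.fft(rho) / k**2
          phi_k[0] = 0.0
          E = -np.real(np.fft.ifft(1j * k * phi_k))        # E = -d(phi)/dx
          # 3) gather the field at particle locations and push particles
          v += qm * E[cells] * dt
          x = (x + v * dt) % L                             # periodic boundary conditions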

  6. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis) 2. Computer Simulation...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. 6-43. Souder, W.E. A scoring methodology for...per some interval PROFICIENCY test scores (written) RADIATION radiation effects aircrew performance on radiation environments REACTION TIME 1) (time

  7. Small business innovation research. Abstracts of completed 1987 phase 1 projects

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Non-proprietary summaries of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA in the 1987 program year are given. Work in the areas of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robotics, computer sciences, information systems, spacecraft systems, spacecraft power supplies, spacecraft propulsion, bioastronautics, satellite communication, and space processing are covered.

  8. Near-Space Science: A Ballooning Project to Engage Students with Space beyond the Big Screen

    ERIC Educational Resources Information Center

    Hike, Nina; Beck-Winchatz, Bernhard

    2015-01-01

    Many students probably know something about space from playing computer games or watching movies and TV shows. Teachers can expose them to the real thing by launching their experiments into near space on a weather balloon. This article describes how to use high-altitude ballooning (HAB) as a culminating project to a chemistry unit on experimental…

  9. Does Like Seek Like?: The Formation of Working Groups in a Programming Project

    ERIC Educational Resources Information Center

    Sanou Gozalo, Eduard; Hernández-Fernández, Antoni; Arias, Marta; Ferrer-i-Cancho, Ramon

    2017-01-01

    In a course of the degree of computer science, the programming project has changed from individual to teamed work, tentatively in couples (pair programming). Students have full freedom to team up with minimum intervention from teachers. The analysis of the working groups made indicates that students do not tend to associate with students with a…

  10. The Effect of Interactive, Three Dimensional, High Speed Simulations on High School Science Students' Conceptions of the Molecular Structure of Water.

    ERIC Educational Resources Information Center

    Hakerem, Gita; And Others

    The Water and Molecular Networks (WAMNet) Project uses graduate-student-written Reduced Instruction Set Computing (RISC) computer simulations of the molecular structure of water to help high school students learn about the nature of water. This study examined: (1) preconceptions concerning the molecular structure of water common among high…

  11. Balancing Expression and Structure in Game Design: Developing Computational Participation Using Studio-Based Design Pedagogy

    ERIC Educational Resources Information Center

    DeVane, Ben; Steward, Cody; Tran, Kelly M.

    2016-01-01

    This article reports on a project that used a game-creation tool to introduce middle-school students ages 10 to 13 to problem-solving strategies similar to those in computer science through the lens of studio-based design arts. Drawing on historic paradigms in design pedagogy and contemporary educational approaches in the digital arts to teach…

  12. Space Science

    NASA Image and Video Library

    1993-03-01

    Marshall's winner of a Research Technology Award worked with the Fourier telescope. This project has developed new technology with the aid of today's advanced computers: an object is x-rayed using an absorption pattern, and the resulting data are sent to a computer that converts them into pixels, which in turn build up an image. This new technology is being used in fields such as astronomy, astrophysics, and medicine.

  13. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  14. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program, one that emphasized biological concepts and considered biology an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternative strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have maintained multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as with individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied mathematics, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which played an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, a summer undergraduate internship program was established under CCB by summer 2006 to train biological science researchers with strong mathematical and computational skills. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  15. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  16. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  17. Barriers and facilitators for integrating digital narratives in secondary school science instruction: A media specialist's action research study

    NASA Astrophysics Data System (ADS)

    Midland, Susan

    Media specialists are increasingly assuming professional development roles as they collaborate with teachers to design instruction that combines content with technology. I am a media specialist in an independent school and collaborated with two science teachers over a three-year period to integrate technology with their instruction. This action study explored integration of a digital narrative project in three eighth-grade earth science units and one ninth-grade physics unit, with each unit serving as a cycle of research. Students produced short digital documentaries that combined still images with an accompanying narration. Students participating in the project wrote scripts based on selected science topics. The completed scripts served as the basis for the narratives. These projects were compared with a more traditional science writing project. Barriers and facilitators for implementation of this type of media project in a science classroom were identified. Lack of adequate access to computers proved to be a significant mechanical barrier. Acquisition of a laptop cart reduced but did not eliminate the technology access issues. The complexity of the project increased implementation time in comparison with traditional alternatives. Evaluation of the completed media projects presented problems. Scores by outside evaluators reflected evaluator unfamiliarity with assessing multimedia projects rather than student performance. Despite several revisions of the assessment rubric, low inter-rater reliability remained a concern even in the last cycle. This suggests that evaluation of media could present issues for teachers who attempt projects of this kind. A writing frame was developed to facilitate production of scripts. This reduced the time required to produce the scripts, but produced writing that was formulaic in the teacher's estimation. A graphic organizer was adopted in the final cycle to address this concern. New insights emerged as the study progressed through its four cycles. At the conclusion of the study, the two teachers and I had a better understanding of barriers that can prevent smooth integration of a technology-based project.

  18. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks and algorithms for dynamic bandwidth demands from large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We have disseminated our work through conference presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
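
    To make the scheduling setting concrete, the sketch below places advance bandwidth reservations on a single WDM link using a greedy, first-fit rule. The link size, horizon, and requests are invented, and this only illustrates the flavor of the problem; it is not one of the ILP, Tabu Search, or Genetic Algorithm approaches developed in the project.

      # Greedy first-fit scheduling of advance lightpath reservations on one WDM link (illustrative).
      W, T = 4, 24                            # wavelengths on the link, scheduling horizon in time slots
      free = [[True] * T for _ in range(W)]   # free[w][t]: wavelength w is unused in slot t

      def schedule(duration, earliest, deadline):
          """Return (wavelength, start) for the earliest feasible placement, else None."""
          for start in range(earliest, deadline - duration + 1):
              for w in range(W):
                  if all(free[w][t] for t in range(start, start + duration)):
                      for t in range(start, start + duration):
                          free[w][t] = False   # reserve the slots
                      return w, start
          return None                          # request blocked

      requests = [(3, 0, 10), (5, 2, 12), (4, 0, 8)]   # (duration, earliest start, deadline)
      for dur, lo, hi in requests:
          print(schedule(dur, lo, hi))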

  19. Japan signs Ocean Agreement

    NASA Astrophysics Data System (ADS)

    The Ocean Research Institute of the University of Tokyo and the National Science Foundation (NSF) have signed a Memorandum of Understanding for cooperation in the Ocean Drilling Program (ODP). The agreement calls for Japanese participation in ODP and an annual contribution of $2.5 million in U.S. currency for the project's 9 remaining years, according to NSF. ODP is an international project whose mission is to learn more about the formation and development of the earth through the collection and examination of core samples from beneath the ocean. The program uses the drillship JOIDES Resolution, which is equipped with laboratories and computer facilities. The Joint Oceanographic Institutions for Deep Earth Sampling (JOIDES), an international group of scientists, provides overall science planning and program advice regarding ODP's science goals and objectives.

  20. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous.
These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  1. Research and technology, 1984 report

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research and technology projects in the following areas are described: cryogenic engineering, hypergolic engineering, hazardous warning instrumentation, structures and mechanics, sensors and controls, computer sciences, communications, material analysis, biomedicine, meteorology, engineering management, logistics, training and maintenance aids, and technology applications.

  2. IAIMS development at Harvard Medical School.

    PubMed Central

    Barnett, G O; Greenes, R A; Zielstorff, R D

    1988-01-01

    The long-range goal of this IAIMS development project is to achieve an Integrated Academic Information Management System for the Harvard Medical School, the Francis A. Countway Library of Medicine, and Harvard's affiliated institutions and their respective libraries. An "opportunistic, incremental" approach to planning has been devised. The projects selected for the initial phase are to implement an increasingly powerful electronic communications network, to encourage the use of a variety of bibliographic and information access techniques, and to begin an ambitious program of faculty and student education in computer science and its applications to medical education, medical care, and research. In addition, we will explore means to promote better collaboration among the separate computer science units in the various schools and hospitals. We believe that our planning approach will have relevance to other educational institutions where lack of strong central organizational control prevents a "top-down" approach to planning. PMID:3416098

  3. Management and Analysis of Biological and Clinical Data: How Computer Science May Support Biomedical and Clinical Research

    NASA Astrophysics Data System (ADS)

    Veltri, Pierangelo

    The use of computer-based solutions for data management in biology and clinical science has contributed to improving quality of life and to obtaining research results in less time. New algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from patients). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications draw on experience with real cases at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.

  4. Supporting Weather Data

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.

  5. Mechanisms and Dynamics of Abiotic and Biotic Interactions at Environmental Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosso, Kevin M.

    The Stanford EMSI (SEMSI) was established in 2004 through joint funding by the National Science Foundation and the OBER-ERSD. It encompasses a number of universities and national laboratories. The PNNL component of the SEMSI is funded by ERSD and is the focus of this report. This component has the objective of providing theory support to the SEMSI by bringing computational capabilities and expertise to bear on important electron transfer problems at mineral/water and mineral/microbe interfaces. PNNL staff member Dr. Kevin Rosso, who is also "matrixed" into the Environmental Molecular Sciences Laboratory (EMSL) at PNNL, is a co-PI on the SEMSI project and the PNNL lead. The EMSL computational facilities being applied to the SEMSI project include the 11.8 teraflop massively-parallel supercomputer. Science goals of this EMSL/SEMSI partnership include advancing our understanding of: (1) The kinetics of U(VI) and Cr(VI) reduction by aqueous and solid-phase Fe(II), (2) The structure of mineral surfaces in equilibrium with solution, and (3) Mechanisms of bacterial electron transfer to iron oxide surfaces via outer-membrane cytochromes.

  6. Transportable, university-level educational programs in interactive information storage and retrieval systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.; Roquemore, Leroy

    1984-01-01

    Pursuant to the specifications of a research contract entered into in December 1983 with NASA, the Computer Science Departments of the University of Southwestern Louisiana and Southern University will be working jointly to address a variety of research and educational issues relating to the use, by non-computer professionals, of some of the largest and most sophisticated interactive information storage and retrieval systems available. Over the projected 6 to 8 year life of the project, in addition to NASA/RECON, the following systems will be examined: Lockheed DIALOG, DOE/RECON, DOD/DTIC, EPA/CSIN, and LLNL/TIS.

  7. Determination of the vapor-liquid transition of square-well particles using a novel generalized-canonical-ensemble-based method

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Xu, Shun; Tu, Yu-Song; Zhou, Xin

    2017-06-01

    Project supported by the National Natural Science Foundation for Outstanding Young Scholars, China (Grant No. 11422542), the National Natural Science Foundation of China (Grant Nos. 11605151 and 11675138), and the Shanghai Supercomputer Center of China and Special Program for Applied Research on Super Computation of the NSFC-Guangdong Joint Fund (the second phase).

  8. Global Warming, Africa and National Security

    DTIC Science & Technology

    2008-01-15

    African populations. This includes awareness from a global perspective in line with The Army Strategy for the Environment, the UN's Intergovernmental...attention. At the time, computer models did not indicate a significant issue with global warming suggesting only a modest increase of 2°C...projected climate changes. Current Science The science surrounding climate change and global warming was, until recently, a point of

  9. Challenges in Computational Social Modeling and Simulation for National Security Decision Making

    DTIC Science & Technology

    2011-06-01

    This study is grounded within a system-activity theory, a logico-philosophical model of interdisciplinary research [13, 14], the concepts of social...often a difficult challenge. Ironically, social science research methods, such as ethnography, may be tremendously helpful in designing these...social sciences. Moreover, CSS projects draw on knowledge and methods from other fields of study, including graph theory, information visualization

  10. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  11. The Third Annual NASA Science Internet User Working Group Conference

    NASA Technical Reports Server (NTRS)

    Lev, Brian S. (Editor); Gary, J. Patrick (Editor)

    1993-01-01

    The NASA Science Internet (NSI) User Support Office (USO) sponsored the Third Annual NSI User Working Group (NSIUWG) Conference March 30 through April 3, 1992, in Greenbelt, MD. Approximately 130 NSI users attended to learn more about the NSI, hear from projects which use NSI, and receive updates about new networking technologies and services. This report contains material relevant to the conference: copies of the agenda, meeting summaries, presentations, and descriptions of exhibitors. Plenary sessions featured a variety of speakers, including NSI project management, scientists, and NSI user project managers whose projects and applications effectively use NSI, and notable citizens of the larger Internet community. The conference also included exhibits of advanced networking applications; tutorials on internetworking, computer security, and networking technologies; and user subgroup meetings on the future direction of the conference, networking, and user services and applications.

  12. A social science data-fusion tool and the Data Management through e-Social Science (DAMES) infrastructure.

    PubMed

    Warner, Guy C; Blum, Jesse M; Jones, Simon B; Lambert, Paul S; Turner, Kenneth J; Tan, Larry; Dawson, Alison S F; Bell, David N F

    2010-08-28

    The last two decades have seen substantially increased potential for quantitative social science research. This has been made possible by the significant expansion of publicly available social science datasets, the development of new analytical methodologies, such as microsimulation, and increases in computing power. These rich resources do, however, bring with them substantial challenges associated with organizing and using data. These processes are often referred to as 'data management'. The Data Management through e-Social Science (DAMES) project is working to support activities of data management for social science research. This paper describes the DAMES infrastructure, focusing on the data-fusion process that is central to the project approach. It covers: the background and requirements for provision of resources by DAMES; the use of grid technologies to provide easy-to-use tools and user front-ends for several common social science data-management tasks such as data fusion; the approach taken to solve problems related to data resources and metadata relevant to social science applications; and the implementation of the architecture that has been designed to achieve this infrastructure.
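
    As an illustration of the kind of data-fusion task such an infrastructure supports, the sketch below merges a small survey extract with an occupational lookup table on a shared code. The column names and values are invented for illustration; this is not the DAMES toolset itself.

      # Toy data-fusion step: enrich survey cases with a derived occupational class (illustrative).
      import pandas as pd

      survey = pd.DataFrame({"case_id": [1, 2, 3],
                             "occ_code": ["2111", "3411", "9233"],
                             "income": [41000, 28000, 19500]})
      lookup = pd.DataFrame({"occ_code": ["2111", "3411", "9233"],
                             "occ_class": ["professional", "associate", "elementary"]})

      fused = survey.merge(lookup, on="occ_code", how="left")   # join on the shared occupation code
      print(fused)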

  13. Physicists Get INSPIREd: INSPIRE Project and Grid Applications

    NASA Astrophysics Data System (ADS)

    Klem, Jukka; Iwaszkiewicz, Jan

    2011-12-01

    INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.

  14. Gravity Spy - Integrating LIGO detector characterization, citizen science, and machine learning

    NASA Astrophysics Data System (ADS)

    Zevin, Michael; Gravity Spy

    2016-06-01

    On September 14th 2015, the Advanced Laser Interferometer Gravitational-wave Observatory (aLIGO) made the first direct observation of gravitational waves and opened a new field of observational astronomy. However, being the most complicated and sensitive experiment ever undertaken in gravitational physics, aLIGO is susceptible to various sources of environmental and instrumental noise that hinder the search for more gravitational waves. Of particular concern are transient, non-Gaussian noise features known as glitches. Glitches can mimic true astrophysical gravitational waves, occur at a high enough frequency to be coherent between the two detectors, and generally worsen aLIGO's detection capabilities. The proper classification and characterization of glitches is paramount in optimizing aLIGO's ability to detect gravitational waves. However, teaching computers to identify and morphologically classify these artifacts is exceedingly difficult. Human intuition has proven to be a useful tool in classification problems such as this. Gravity Spy is an innovative, interdisciplinary project hosted by Zooniverse that combines aLIGO detector characterization, citizen science, machine learning, and social science. In this project, citizen scientists and computers work together in a symbiotic relationship that leverages human pattern recognition and the ability of machine learning to process large amounts of data systematically: volunteers classify triggers from the aLIGO data stream that are constantly updated as aLIGO takes in new data, and these classifications are used to train machine learning algorithms, which proceed to classify the bulk of aLIGO data and feed questionable glitches back to the users. In this talk, I will discuss the workflow and initial results of the Gravity Spy project with regard to aLIGO's future observing runs and highlight the potential of such citizen science projects in promoting nascent fields such as gravitational wave astrophysics.
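
    The human-machine loop described above can be sketched as follows: volunteer labels train a classifier, the classifier handles the bulk of new triggers, and low-confidence cases are routed back to people. The features, labels, and the 0.9 confidence threshold below are placeholders, not the actual Gravity Spy pipeline or aLIGO data.

      # Sketch of a citizen-science/machine-learning loop for glitch classification (illustrative).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X_labelled = rng.normal(size=(500, 8))            # features of volunteer-classified triggers
      y_labelled = rng.integers(0, 3, size=500)         # volunteer-assigned glitch classes
      X_new = rng.normal(size=(10000, 8))               # unlabelled triggers from the data stream

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_labelled, y_labelled)
      proba = clf.predict_proba(X_new)
      confident = proba.max(axis=1) >= 0.9              # machine keeps high-confidence classifications
      needs_review = np.where(~confident)[0]            # questionable triggers go back to the volunteers
      print(len(needs_review), "triggers routed back for human classification")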

  15. [Evaluation of the lifestyle of students of physiotherapy and technical & computer science basing on their diet and physical activity].

    PubMed

    Medrela-Kuder, Ewa

    2011-01-01

    The aim of the study was to evaluate the dietary habits and physical activity of Physiotherapy and Technical & Computer Science students. The research involved a group of 174 non-full-time students of higher education institutions in Krakow aged between 22 and 27. Of those surveyed, 81 studied Physiotherapy at the University of Physical Education, whereas 93 followed a course in Technical & Computer Science at the Pedagogical University. In this project a diagnostic survey method was used. The study revealed that the lifestyle of university youth left much to be desired. Dietary errors included irregular meal intake, low consumption of fish, milk, and dairy, and snacking between meals on high-calorie products with poor nutrient content. With regard to physical activity, Physiotherapy students showed more positive attitudes than those from Technical & Computer Science. The forms of physical activity declared most frequently by those surveyed were swimming, team sports, cycling, and strolling. Health-oriented education should be introduced in such a way as to improve knowledge of a health-promoting lifestyle as a means of preventing numerous diseases.

  16. Sandia National Laboratories: Research: Materials Science

    Science.gov Websites

    Our research uses Sandia's experimental, theoretical, and computational capabilities to

  17. Strategies for Effective Implementation of Science Models into 6-9 Grade Classrooms on Climate, Weather, and Energy Topics

    NASA Astrophysics Data System (ADS)

    Yarker, M. B.; Stanier, C. O.; Forbes, C.; Park, S.

    2011-12-01

    As atmospheric scientists, we depend on Numerical Weather Prediction (NWP) models. We use them to predict weather patterns, to understand external forcing on the atmosphere, and as evidence to make claims about atmospheric phenomena. Therefore, it is important that we adequately prepare atmospheric science students to use computer models. However, the public should also be aware of what models are in order to understand scientific claims about atmospheric issues, such as climate change. Although familiar with weather forecasts on television and the Internet, the general public does not understand the process of using computer models to generate weather and climate forecasts. As a result, the public often misunderstands claims scientists make about their daily weather as well as the state of climate change. Since computer models are the best method we have to forecast the future of our climate, scientific models and modeling should be a topic covered in K-12 classrooms as part of a comprehensive science curriculum. According to the National Science Education Standards, teachers are encouraged to incorporate science models into the classroom as a way to aid in the understanding of the nature of science. However, there is very little description of what constitutes a science model, so the term is often associated with scale models. Therefore, teachers often use drawings or scale representations of physical entities, such as DNA, the solar system, or bacteria. In other words, models used in classrooms are often used as visual representations, but the purpose of science models is often overlooked. The implementation of a model-based curriculum in the science classroom can be an effective way to prepare students to think critically, problem solve, and make informed decisions as contributing members of society. However, there are few resources available to help teachers implement science models into the science curriculum effectively. Therefore, this research project looks at strategies middle school science teachers use to implement science models in their classrooms. The teachers in this study took part in a week-long professional development program designed to orient them towards appropriate use of science models for a unit on weather, climate, and energy concepts. The goal of this project is to describe the professional development and how teachers intend to incorporate science models into each of their individual classrooms.

  18. STS-30 onboard view of fluids experiment apparatus (FEA) equipment

    NASA Image and Video Library

    1989-05-08

    STS030-10-003 (4-8 May 1989) --- An overall scene of the onboard materials science project for STS-30. Seen is the fluids experiment apparatus, supported by an accompanying computer and an 8mm camcorder for its operation. Another major component of the project -- Astronaut Mary L. Cleave, who devoted a great deal of STS-30 to monitoring various experiments -- is out of frame.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center propose a joint project. The goals are to enable scientific workflows of stakeholders to run on multiple cloud resources by use of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) High-Throughput Fabric Virtualization. This is a matching-fund project in which Fermilab and KISTI will contribute equal resources.

  20. Using a Faculty-in-Residence Model to Enhance Curriculae in Computer Science and Social Work with Writing and Critical Thinking

    ERIC Educational Resources Information Center

    Sarnoff, Susan; Welch, Lonnie; Gradin, Sherrie; Sandell, Karin

    2004-01-01

    This paper will discuss the results of a project that enabled three faculty members from disparate disciplines: Social Work, Interpersonal Communication and Software Engineering, to enhance writing and critical thinking in their courses. The paper will address the Faculty-in-Residence project model, the activities taken on as a result of it, the…

  1. Activities at the Lunar and Planetary Institute

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The activities of the Lunar and Planetary Institute for the period July to December 1984 are discussed. Functions of its departments and projects are summarized. These include: planetary image center; library information center; computer center; production services; scientific staff; visitors program; scientific projects; conferences; workshops; seminars; publications and communications; panels, teams, committees and working groups; NASA-AMES vertical gun range (AVGR); and lunar and planetary science council.

  2. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions.

    NASA Astrophysics Data System (ADS)

    Coughlan, J. C.

    2005-12-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle. Human centered computing supported a collaborative, mission centric data and planning system for the Mars Exploration Rovers, and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics which quantify and summarize these new technologies which are now available for future NASA missions.

  3. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph C.

    2005-01-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle. Human centered computing supported a collaborative, mission centric data and planning system for the Mars Exploration Rovers, and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics which quantify and summarize these new technologies which are now available for future NASA missions.

  4. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities, yet how and why most of these abilities are produced is still not understood. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulation with a number of neurons comparable to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models, and hybrid models. This review includes the current applications of these works as well as future trends. It focuses on works seeking advanced progress in neuroscience and on others seeking new discoveries in computer science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized, and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing.
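
    As a minimal example of the "digital model" category surveyed here, the sketch below advances a vectorized population of leaky integrate-and-fire neurons; the array operations stand in for the parallelism that the reviewed projects obtain from clusters or neuromorphic hardware. All parameters and the noisy input are illustrative only.

      # Vectorized leaky integrate-and-fire population, a toy "digital" brain model (illustrative).
      import numpy as np

      n, dt, steps = 1000, 0.1, 1000                   # neurons, time step (ms), simulation steps
      tau, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -70.0
      v = np.full(n, v_rest)                           # membrane potentials (mV)
      rng = np.random.default_rng(1)

      for _ in range(steps):
          i_ext = rng.normal(1.8, 0.5, n)              # noisy external input (arbitrary units)
          v += dt * (-(v - v_rest) + i_ext * tau) / tau   # leaky integration toward rest plus input
          spiking = v >= v_thresh                      # neurons that crossed threshold this step
          v[spiking] = v_reset                         # reset after a spike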

  5. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
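
    At the heart of AMR is a refinement criterion that flags only the cells needing higher resolution. The toy sketch below flags cells where the local gradient of a field exceeds a threshold; the field and the 50% threshold are invented, and this illustrates the general idea only, not the Chombo library's actual interfaces.

      # Toy gradient-based refinement flagging, the core idea behind AMR (illustrative).
      import numpy as np

      field = np.sin(np.linspace(0, 4 * np.pi, 200)) ** 4     # stand-in for, e.g., vorticity near a cyclone
      grad = np.abs(np.gradient(field))                       # local gradient magnitude
      flagged = grad > 0.5 * grad.max()                       # cells selected for the next refinement level
      print(f"{flagged.sum()} of {field.size} cells flagged for refinement")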

  6. Projective simulation for artificial intelligence

    NASA Astrophysics Data System (ADS)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
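
    The random walk over a clip network described above can be sketched in a few lines: a percept excites a clip, an outgoing edge is chosen with probability proportional to its weight (h-value), rewarded transitions are strengthened, and all weights are slowly damped back toward their initial value. The two-percept, two-action environment and the parameter values below are invented for illustration and are not the paper's own implementation.

      # Minimal projective-simulation-style agent with a two-layer clip network (illustrative).
      import random

      percepts, actions = ["left-light", "right-light"], ["go-left", "go-right"]
      h = {(p, a): 1.0 for p in percepts for a in actions}      # h-values on percept->action edges
      gamma = 0.01                                              # damping toward the initial h-values

      def step(percept, rewarded_action):
          weights = [h[(percept, a)] for a in actions]
          action = random.choices(actions, weights=weights)[0]  # one hop of the random walk
          reward = 1.0 if action == rewarded_action else 0.0
          for key in h:                                         # damping: slow forgetting of all edges
              h[key] -= gamma * (h[key] - 1.0)
          h[(percept, action)] += reward                        # strengthen the rewarded transition
          return action

      for _ in range(2000):
          p = random.choice(percepts)
          step(p, "go-left" if p == "left-light" else "go-right")
      print({k: round(v, 2) for k, v in h.items()})             # correct percept-action edges dominate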

  7. Projective simulation for artificial intelligence

    PubMed Central

    Briegel, Hans J.; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690

  8. The Naval Postgraduate School SECURE ARCHIVAL STORAGE SYSTEM. Part II. Segment and Process Management Implementation.

    DTIC Science & Technology

    1981-03-01

    [Report documentation page, largely illegible in the scanned source; recoverable information: authors Lyle A. Cox, Roger R. Schell, and Sonja L. Perdue; reviewed and released by William M. Tolles, Dean of Research; keywords: Computer Networks, Operating Systems, Computer Security.]

  9. Applications of hybrid and digital computation methods in aerospace-related sciences and engineering. [problem solving methods at the University of Houston

    NASA Technical Reports Server (NTRS)

    Huang, C. J.; Motard, R. L.

    1978-01-01

    The computing equipment in the engineering systems simulation laboratory of the University of Houston Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques to aerospace-related research in psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. The research supports programs in acoustics, energy technology, systems engineering, and environmental management as well as aerospace engineering.

  10. Laboratory directed research and development annual report 2004.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 2004. In addition to a programmatic and financial overview, the report includes progress reports from 352 individual R and D projects in 15 categories. The 15 categories are: (1) Advanced Concepts; (2) Advanced Manufacturing; (3) Biotechnology; (4) Chemical and Earth Sciences; (5) Computational and Information Sciences; (6) Differentiating Technologies; (7) Electronics and Photonics; (8) Emerging Threats; (9) Energy and Critical Infrastructures; (10) Engineering Sciences; (11) Grand Challenges; (12) Materials Science and Technology; (13) Nonproliferation and Materials Control; (14) Pulsed Power and High Energy Density Sciences; and (15) Corporate Objectives.

  11. Insights on WWW-based geoscience teaching: Climbing the first year learning cliff

    NASA Astrophysics Data System (ADS)

    Lamberson, Michelle N.; Johnson, Mark; Bevier, Mary Lou; Russell, J. Kelly

    1997-06-01

    In early 1995, The University of British Columbia Department of Geological Sciences (now Earth and Ocean Sciences) initiated a project that explored the effectiveness of the World Wide Web as a teaching and learning medium. Four decisions made at the onset of the project have guided the department's educational technology plan: (1) over 90% of funding received from educational technology grants was committed to personnel; (2) materials developed are modular in design; (3) a database approach was taken to resource development; and (4) a strong commitment was made to student involvement in courseware development. The project comprised development of a web site for an existing core course: Geology 202, Introduction to Petrology. The web site is a gateway to course information, content, resources, exercises, and several searchable databases (images, petrologic definitions, and minerals in thin section). Material was developed on either an IBM or UNIX machine, ported to a UNIX platform, and is accessed using the Netscape browser. The resources consist primarily of HTML files or CGI scripts with associated text, images, sound, digital movies, and animations. Students access the web site from the departmental student computer facility, from home, or from a computer station in the petrology laboratory. Results of a survey of the Geol 202 students indicate that they found the majority of the resources useful, and the site is being expanded. The Geology 202 project had a "trickle-up" effect throughout the department: prior to this project, there was minimal use of Internet resources in lower-level geology courses. By the end of the 1996-1997 academic year, we anticipate that at least 17 Earth and Ocean Science courses will have a WWW site for one or all of the following uses: (1) presenting basic information; (2) accessing lecture images; (3) providing a jumping-off point for exploring related WWW sites; (4) conducting on-line exercises; and/or (5) providing a communications forum for students and faculty via a Hypernews group. URL: http://www.science.ubc.ca/

  12. Soil moisture needs in earth sciences

    NASA Technical Reports Server (NTRS)

    Engman, Edwin T.

    1992-01-01

    The author reviews the development of passive and active microwave techniques for measuring soil moisture with respect to how the data may be used. New science programs such as the EOS, the GEWEX Continental-Scale International Project (GCIP) and STORM, a mesoscale meteorology and hydrology project, will have to account for soil moisture either as a storage in water balance computations or as a state variable in-process modeling. The author discusses future soil moisture needs such as frequency of measurement, accuracy, depth, and spatial resolution, as well as the concomitant model development that must proceed concurrently if the development in microwave technology is to have a major impact in these areas.

  13. Education and Outreach Programs Offered by the Center for High Pressure Research and the Consortium for Materials Properties Research in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Richard, G. A.

    2003-12-01

    Major research facilities and organizations provide an effective venue for developing partnerships with educational organizations in order to offer a wide variety of educational programs, because they constitute a base where the culture of scientific investigation can flourish. The Consortium for Materials Properties Research in Earth Sciences (COMPRES) conducts education and outreach programs through the Earth Science Educational Resource Center (ESERC), in partnership with other groups that offer research and education programs. ESERC initiated its development of education programs in 1994 under the administration of the Center for High Pressure Research (CHiPR), which was funded as a National Science Foundation Science and Technology Center from 1991 to 2002. Programs developed during ESERC's association with CHiPR and COMPRES have targeted a wide range of audiences, including pre-K, K-12 students and teachers, undergraduates, and graduate students. Since 1995, ESERC has offered inquiry-based programs to Project WISE (Women in Science and Engineering) students at a high school and undergraduate level. Activities have included projects that investigated earthquakes, high pressure mineral physics, and local geology. Through a practicum known as Project Java, undergraduate computer science students have developed interactive instructional tools for several of these activities. For K-12 teachers, a course on Long Island geology is offered each fall, which includes an examination of the role that processes in the Earth's interior have played in the geologic history of the region. ESERC has worked with Stony Brook's Department of Geosciences faculty to offer courses on natural hazards, computer modeling, and field geology to undergraduate students, and on computer programming for graduate students. Each summer, a four-week residential college-level environmental geology course is offered to rising tenth graders from the Brentwood, New York schools in partnership with Stony Brook's Department of Technology and Society. During the academic year, a college-level Earth science course is offered to tenth graders from Sayville, New York. In both programs, students conduct research projects as one of their primary responsibilities. In collaboration with the Museum of Long Island Natural Sciences on the Stony Brook campus, two programs have been developed that enable visiting K-12 school classes to investigate earthquakes and phenomena that operate in the Earth's deep interior. From 1997 to 1999, the weekly activity-based Science Enrichment for the Early Years (SEEY) program, focusing on common Earth materials and fundamental Earth processes, was conducted at a local pre-K school. Since 2002, ESERC has worked with the Digital Library for Earth System Education (DLESE) to organize the Skills Workshops for their Annual Meeting and with EarthScope for the development of their Education and Outreach Program Plan. Future education programs and tools developed through COMPRES partnerships will place an increased emphasis on deep Earth materials and phenomena.

  14. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO2; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO2 storage; and investigation of innovative, bio-based mitigation strategies.

  15. Inquiring Minds

    Science.gov Websites


  16. Software Assurance Curriculum Project Volume 4: Community College Education

    DTIC Science & Technology

    2011-09-01

    no previous programming or computer science experience expected) • Precalculus-ready (that is, proficiency sufficient to enter a college-level... precalculus course) • English Composition I-ready (that is, proficiency sufficient to enter a college-level English I course) Co-Requisite Discrete

  17. Sandia National Laboratories: Careers: Materials Science

    Science.gov Websites


  18. e-Science and data management resources on the Web.

    PubMed

    Gore, Sally A

    2011-01-01

    The way research is conducted has changed over time, from simple experiments to computer modeling and simulation, from individuals working in isolated laboratories to global networks of researchers collaborating on a single topic. Often, this new paradigm results in the generation of staggering amounts of data. The intensive use of data and the existence of networks of researchers characterize e-Science. The role of libraries and librarians in e-Science has been a topic of interest for some time now. This column looks at tools, resources, and projects that demonstrate successful collaborations between libraries and researchers in e-Science.

  19. IN13B-1660: Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Technical Reports Server (NTRS)

    Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris

    2016-01-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25 PB system.

  20. Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.

    2016-12-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25 PB system.
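
    One way to picture the kind of QA pipeline mentioned for WELD is as a list of per-granule checks mapped over the data in parallel; the stage functions, granule format, and local process pool below are placeholder assumptions standing in for the actual NEX production pipeline and its HPC or cloud backends.

        from multiprocessing import Pool

        # Sketch of a per-granule QA pipeline (hypothetical stages and granule
        # format, not the actual NEX/WELD pipeline): each stage maps a granule to
        # part of a QA record, and the local process pool stands in for an HPC
        # batch allocation or a cloud worker fleet.
        def check_completeness(granule):
            return {"granule": granule["id"], "complete": granule.get("pixels", 0) > 0}

        def check_value_range(granule):
            ok = all(0.0 <= v <= 1.0 for v in granule.get("values", []))
            return {"granule": granule["id"], "in_range": ok}

        STAGES = [check_completeness, check_value_range]

        def run_qa(granule):
            report = {}
            for stage in STAGES:
                report.update(stage(granule))
            return report

        if __name__ == "__main__":
            granules = [{"id": "tile-001", "pixels": 4, "values": [0.1, 0.9]},
                        {"id": "tile-002", "pixels": 0, "values": []}]
            with Pool(2) as pool:  # stand-in for an HPC or cloud backend
                for report in pool.map(run_qa, granules):
                    print(report)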

  1. Computational Physics in a Nutshell

    NASA Astrophysics Data System (ADS)

    Schillaci, Michael

    2001-11-01

    Too often students of science are expected to ``pick up'' what they need to know about the Art of Science. A description of the two-semester Computational Physics course being taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: Problem Solving, Programming, and Presentation. Using FORTRAN, LaTeX2e, MAPLE V, HTML, and JAVA, students learn the fundamentals of algorithm development, how to implement classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.

  2. Early Career Summer Interdisciplinary Team Experiences and Student Persistence in STEM Fields

    NASA Astrophysics Data System (ADS)

    Cadavid, A. C.; Pedone, V. A.; Horn, W.; Rich, H.

    2015-12-01

    STEPS (Students Targeting Engineering and Physical Science) is an NSF-funded program designed to increase the number of California State University Northridge students earning bachelor's degrees in the natural sciences, mathematics, engineering and computer science. The greatest loss of STEM majors occurs between the sophomore and junior years, so we designed the Summer Interdisciplinary Team Experience (SITE) as an early career program for these students. Students work closely with a faculty mentor in teams of ten to investigate regionally relevant problems, many of which relate to sustainability efforts on campus or in the community. The projects emphasize hands-on activities and team-based learning and decision making. We report data for five years of projects, qualitative assessment through entrance and exit surveys and student interviews, and the initial impact on retention of the participants.

  3. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
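
    To make the idea of a superposition quantifier concrete, one illustrative and deliberately simple figure of merit for a pure state in a fixed reference basis is sketched below; it is not necessarily the measure defined in this work, only an example of the kind of quantity such a study considers.

        % Illustrative example only; the paper's own figure of merit may differ.
        % A pure state expanded in a fixed reference basis {|i>}:
        \[
          |\psi\rangle = \sum_{i} c_i \, |i\rangle, \qquad \sum_{i} |c_i|^2 = 1 .
        \]
        % One simple quantifier of how much the state is spread over the basis:
        \[
          S(\psi) \;=\; 1 - \max_{i} |c_i|^2 ,
        \]
        % which vanishes exactly for a single basis state and approaches 1 - 1/d
        % for the uniform superposition of d basis states, in the same spirit as
        % l1-norm coherence measures.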

  4. Guiding science expeditions: The design of a learning environment for project-based science

    NASA Astrophysics Data System (ADS)

    Polman, Joseph Louis

    Project-based pedagogy has been revived recently as a teaching strategy for promoting students' active engagement in learning science by doing science. Numerous reform efforts have encouraged project-based teaching in high schools, along with a range of supports for its implementation, often including computers and the Internet. History has shown, however, that academic research and new technologies are not enough to effect real change in classrooms. Ultimately, teachers accomplish activity with their students daily in classrooms. Putting the idea of project-based teaching into practice depends on many particulars of teachers' situated work with students. To better understand the complexity of project-based science teaching in schools, I conducted an interpretive case study of one exceptional teacher's work. The teacher devotes all class time after the beginning of the year to open-ended, student-designed Earth Science research projects. Over four years of involvement with the Learning through Collaborative Visualization (CoVis) reform effort, this teacher has developed, implemented, and refined strategies for supporting and guiding students in conducting open-ended inquiry. Through a close examination of the teacher's work supporting student projects, I explore the design issues involved in such an endeavor, including affordances, constraints, and tradeoffs. In particular, I show how time constrains both student and teacher action, how the traditional school culture and grading create stumbling blocks for change, and how conflicting beliefs about teaching and learning undermine the accomplishment of guided inquiry. I also show how Internet tools including Usenet news, email, and the World Wide Web afford students an opportunity to gather and make use of distributed expertise and scientific data resources; how an activity structure, combined with a corresponding structure to the artifact of the final written product, supports student accomplishment of unfamiliar practices; and how the teacher guides students in real time through mutually transformative communication. I synthesize the important design elements into a framework for conducting project-based science, especially in settings where such pedagogy is relatively new. This study will inform teachers and reformers of the practical and complex work of implementing project-based teaching in schools.

  5. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  6. The Sunrise project: An R&D project for a national information infrastructure prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Juhnyoung

    1995-02-01

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; and (3) To define a new way of collaboration between computer science and industrially relevant research.

  7. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
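
    A common starting point for the graph-based methods mentioned here is a correlation network over gridded climate time series; the toy data, correlation threshold, and degree statistic in the sketch below are assumptions for illustration rather than the project's specific construction.

        import numpy as np

        # Sketch of a climate correlation network (toy data and an assumed
        # threshold, not the project's specific method): nodes are grid cells and
        # an edge links two cells whose anomaly time series are strongly correlated.
        rng = np.random.default_rng(0)
        n_cells, n_months = 50, 240
        anomalies = rng.standard_normal((n_cells, n_months))  # stand-in for gridded data

        corr = np.corrcoef(anomalies)   # cell-by-cell correlation matrix
        threshold = 0.5                 # assumed cutoff for a "strong" link
        edges = [(i, j) for i in range(n_cells)
                 for j in range(i + 1, n_cells)
                 if abs(corr[i, j]) >= threshold]

        # Node degree is a simple statistic often inspected for teleconnection hubs.
        degree = np.zeros(n_cells, dtype=int)
        for i, j in edges:
            degree[i] += 1
            degree[j] += 1
        print(len(edges), degree.max())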

  8. The Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Kirby, Michael

    2014-06-01

    The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.

  9. Bringing education to your virtual doorstep

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2013-03-01

    We currently witness significant migration of academic resources towards online CMS, social networking, and high-end computerized education. This happens for traditional academic programs as well as for outreach initiatives. The talk will go over a set of innovative integrated technologies, many of which are free. These were developed by Wolfram Research in order to facilitate and enhance the learning process in mathematical and physical sciences. Topics include: cloud computing with Mathematica Online; natural language programming; interactive educational resources and web publishing at the Wolfram Demonstrations Project; the computational knowledge engine Wolfram Alpha; Computable Document Format (CDF) and self-publishing with interactive e-books; course assistant apps for mobile platforms. We will also discuss outreach programs where such technologies are extensively used, such as the Wolfram Science Summer School and the Mathematica Summer Camp.

  10. S'COOL Takes Students to New Heights

    NASA Technical Reports Server (NTRS)

    Green, Carolyn J.; Chambers, Lin H.

    1998-01-01

    Students' Cloud Observations On-Line (S'COOL) is a hands-on educational project which supports NASA's Clouds and the Earth's Radiant Energy System (CERES) satellite instrument, part of the Earth Science Enterprise studying our planet. S'COOL meets science, math, technology and geography Standards of Learning (SOLs) as students observe clouds and related weather conditions, compute data and locate vital information while obtaining ground truth observations for the CERES instrument. These observations can then be used to help validate the CERES measurements, particularly detection of clear sky from space. Participants to date have been in 20 states and 5 countries and have reported great interest and learning among their students. Many have used this project as a stepping stone to further learning in other areas of Earth Science and to do more with the Internet in the classroom. Satellite images and clues to their interpretation are used on the website (http://asd-www.larc.nasa.gov/SCOOL/). Background information is also given on Earth's Radiation Budget and its importance in understanding our climate. Students can retrieve both their observations and the corresponding satellite data and participate in the validation efforts. A number of suggestions for studies to be done with the data, and related lesson plans, are available. Teachers can tailor this project to the appropriate level and subject matter needed for their students. The recommended grade level is 4th through 12th grade. The project is now open to new participants. We particularly seek schools in more remote areas to obtain wider geographic coverage for ground truth data, so the project has been designed to use, but not require, computer technology. AGU participants attending the S'COOL presentation will be given a handout describing the project. Material for introducing the project in the classroom will be demonstrated in a participatory style.

  11. The GI Project: a prototype electronic textbook for high school biology.

    PubMed

    Calhoun, P S; Fishman, E K

    1997-01-01

    A prototype electronic science textbook for secondary education was developed to help bridge the gap between state-of-the-art medical technology and the basic science classroom. The prototype combines the latest in radiologic imaging techniques with a user-friendly multimedia computer program to teach the anatomy, physiology, and diseases of the gastrointestinal (GI) tract. The program includes original text, illustrations, photographs, animations, images from upper GI studies, plain radiographs, computed tomographic images, and three-dimensional reconstructions. These features are intended to create a stimulus-rich environment in which the high school science student can enjoy a variety of interactive experiences that will facilitate the learning process. The computer-based book is a new educational tool that promises to play a prominent role in the coming years. Although it is not yet clear what form textbooks will take in the future, current research suggests that computer-based books are already proving valuable as an alternative educational medium. For beginning students, they reinforce the material found in traditional textbooks and class presentations; for advanced students, they provide motivation to learn outside the traditional classroom.

  12. Socorro Students Translate NRAO Web Pages Into Spanish

    NASA Astrophysics Data System (ADS)

    2002-07-01

    Six Socorro High School students are spending their summer working at the National Radio Astronomy Observatory (NRAO) on a unique project that gives them experience in language translation, World Wide Web design, and technical communication. Under the project, called "Un puente a los cielos," the students are translating many of NRAO's Web pages on astronomy into Spanish. "These students are using their bilingual skills to help us make basic information about astronomy and radio telescopes available to the Spanish-speaking community," said Kristy Dyer, who works at NRAO as a National Science Foundation postdoctoral fellow and who developed the project and obtained funding for it from the National Aeronautics and Space Administration. The students are: Daniel Acosta, 16; Rossellys Amarante, 15; Sandra Cano, 16; Joel Gonzalez, 16; Angelica Hernandez, 16; and Cecilia Lopez, 16. The translation project, a joint effort of NRAO and the NM Tech physics department, also includes Zammaya Moreno, a teacher from Ecuador, Robyn Harrison, NRAO's education officer, and NRAO computer specialist Allan Poindexter. The students are translating NRAO Web pages aimed at the general public. These pages cover the basics of radio astronomy and frequently-asked questions about NRAO and the scientific research done with NRAO's telescopes. "Writing about science for non-technical audiences has to be done carefully. Scientific concepts must be presented in terms that are understandable to non-scientists but also that remain scientifically accurate," Dyer said. "When translating this type of writing from one language to another, we need to preserve both the understandability and the accuracy," she added. For that reason, Dyer recruited 14 Spanish-speaking astronomers from Argentina, Mexico and the U.S. to help verify the scientific accuracy of the Spanish translations. The astronomers will review the translations. The project is giving the students a broad range of experience. "They are getting hands-on experience in language translation, in Web design and computer science, and learning some astronomy as well," said Dyer. "This is a challenging project, but these students are meeting the challenge well," she added. The students are enthusiastic. "I've always been interested in stars and space, and I love working with computers," said Amarante. "We are pleased that these local students are using their skills to enhance our public-education efforts," said NRAO's director of New Mexico operations James Ulvestad. "Our Web site is one of our best tools for informing the public about astronomy and the work done at our observatory. This translation project now allows us to reach an important new audience," Ulvestad added. The students began the project in June and will complete the effort on July 26. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

  13. Project NANO (nanoscience and nanotechnology outreach): a STEM training program that brings SEM's and stereoscopes into high-school and middle-school classrooms

    NASA Astrophysics Data System (ADS)

    Cady, Sherry L.; Blok, Mikel; Grosse, Keith; Wells, Jennifer

    2014-09-01

    The program Project NANO (Nanoscience and Nanotechnology Outreach) enables middle and high school students to discover and research submicroscopic phenomena in a new and exciting way with the use of optical and scanning electron microscopes in the familiar surroundings of their middle or high school classrooms. Project NANO provides secondary-level professional development workshops, support for classroom instruction and teacher curriculum development, and the means to deliver Project NANO toolkits (SEM, stereoscope, computer, supplies) to classrooms with Project NANO trained teachers. Evaluation surveys document the impact of the program on students' attitudes toward science and technology and on the learning outcomes for secondary-level teachers. Project NANO workshops (offered for professional development credit) enable teachers to gain familiarity using and teaching with the SEM. Teachers also learn to integrate new content knowledge and skills into topic-driven, standards-based units of instruction specifically designed to support the development of students' higher order thinking skills that include problem solving and evidence-based thinking. The Project NANO management team includes a former university science faculty member, two high school science teachers, and an educational researcher. To date, over 7500 students have experienced the impact of the Project NANO program, which provides an exciting and effective model for engaging students in the discovery of nanoscale phenomena and concepts in a fun and engaging way.

  14. The SGI/CRAY T3E: Experiences and Insights

    NASA Technical Reports Server (NTRS)

    Bernard, Lisa Hamet

    1999-01-01

    The focus of the HPCC Earth and Space Sciences (ESS) Project is capability computing - pushing highly scalable computing testbeds to their performance limits. The drivers of this focus are the Grand Challenge problems in Earth and space science: those that could not be addressed in a capacity computing environment where large jobs must continually compete for resources. These Grand Challenge codes require a high degree of communication, large memory, and very large I/O (throughout the duration of the processing, not just in loading initial conditions and saving final results). This set of parameters led to the selection of an SGI/Cray T3E as the current ESS Computing Testbed. The T3E at the Goddard Space Flight Center is a unique computational resource within NASA. As such, it must be managed to effectively support the diverse research efforts across the NASA research community yet still enable the ESS Grand Challenge Investigator teams to achieve their performance milestones, for which the system was intended. To date, all Grand Challenge Investigator teams have achieved the 10 GFLOPS milestone, eight of nine have achieved the 50 GFLOPS milestone, and three have achieved the 100 GFLOPS milestone. In addition, many technical papers have been published highlighting results achieved on the NASA T3E, including some at this Workshop. The successes enabled by the NASA T3E computing environment are best illustrated by the 512 PE upgrade funded by the NASA Earth Science Enterprise earlier this year. Never before has an HPCC computing testbed been so well received by the general NASA science community that it was deemed critical to the success of a core NASA science effort. NASA looks forward to many more success stories before the conclusion of the NASA-SGI/Cray cooperative agreement in June 1999.

  15. Executable research compendia in geoscience research infrastructures

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel

    2017-04-01

    From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [3]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format closing the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entities; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and for connecting key stakeholders (scientists, publishers, librarians). They are demonstrated using developments by the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf
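
    Use (i), coherence checking, can be pictured as a small completeness validator run against a packaged compendium; the erc.json manifest layout and field names in this sketch are invented for illustration and are not the o2r project's actual ERC specification.

        import json
        import pathlib

        # Sketch of an ERC-style completeness check. The "erc.json" manifest and
        # its fields are invented for illustration; they are not the o2r ERC spec.
        REQUIRED_FIELDS = ("id", "workflow", "data", "runtime_image")

        def validate_compendium(path):
            root = pathlib.Path(path)
            manifest = json.loads((root / "erc.json").read_text())
            problems = [f"missing manifest field: {f}"
                        for f in REQUIRED_FIELDS if f not in manifest]
            # Every file the manifest points at must actually be packaged.
            for rel in [manifest.get("workflow", "")] + list(manifest.get("data", [])):
                if rel and not (root / rel).exists():
                    problems.append(f"missing packaged file: {rel}")
            return problems

        # problems = validate_compendium("my-compendium/")  # [] means structurally complete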

  16. The Zooniverse

    NASA Astrophysics Data System (ADS)

    Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.

    2009-12-01

    The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
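
    The second use of citizen-science input described above, feeding volunteer class labels back into the project's classifiers, can be sketched as a standard supervised-learning step; the toy features, vote-derived labels, and choice of scikit-learn logistic regression below are illustrative assumptions rather than the Zooniverse or LSST pipelines.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Sketch of training a classifier on citizen-science labels (toy features
        # and vote-derived labels; real Zooniverse/LSST pipelines are far richer).
        rng = np.random.default_rng(1)
        features = rng.standard_normal((200, 3))          # e.g. colour, concentration, asymmetry
        votes = features[:, 0] + 0.1 * rng.standard_normal(200) > 0
        labels = votes.astype(int)                        # majority vote of volunteer classifications

        clf = LogisticRegression().fit(features, labels)  # learn from crowd-sourced labels
        new_objects = rng.standard_normal((5, 3))
        print(clf.predict(new_objects))                   # prioritise new detections for review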

  17. Journal of Undergraduate Research, Volume IX, 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stiner, K. S.; Graham, S.; Khan, M.

    Each year more than 600 undergraduate students are awarded paid internships at the Department of Energy's (DOE) National Laboratories. These interns are paired with research scientists who serve as mentors in authentic research projects. All participants write a research abstract and present at a poster session and/or complete a full-length research paper. Abstracts and selected papers from our 2007–2008 interns that represent the breadth and depth of undergraduate research performed each year at our National Laboratories are published here in the Journal of Undergraduate Research. The fields in which these students worked included: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Science; Materials Science; Medical and Health Sciences; Nuclear Science; Physics; Science Policy; and Waste Management.

  18. Capabilities: Science Pillars

    Science.gov Websites


  19. Science Briefs

    Science.gov Websites


  20. Office of Science

    Science.gov Websites


  1. Bradbury Science Museum

    Science.gov Websites


  2. Sandia National Laboratories: National Security Missions: Nuclear Weapons

    Science.gov Websites


  3. 77 FR 59938 - Center for Scientific Review Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ... Panel; Program Project: Drug Addiction. Date: October 30-31, 2012. Time: 8:00 a.m. to 8:30 p.m. Agenda... Biomedical Computational Science and Technology Initiative. Date: October 30, 2012. Time: 3:00 p.m. to 4:00 p...

  4. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

    The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.

  5. Carbon-doping-induced negative differential resistance in armchair phosphorene nanoribbons

    NASA Astrophysics Data System (ADS)

    Guo, Caixia; Xia, Congxin; Wang, Tianxing; Liu, Yufang

    2017-03-01

    By using a combined method of density functional theory and non-equilibrium Green's function formalism, we investigate the electronic transport properties of carbon-doped armchair phosphorene nanoribbons (APNRs). The results show that C atom doping can strongly affect the electronic transport properties of the APNR and change it from a semiconductor to a metal. Meanwhile, obvious negative differential resistance (NDR) behaviors are obtained by tuning the doping position and concentration. In particular, as the doping concentration is reduced, the NDR peak position can shift into the mV bias range. These results provide theoretical support for designing related nanodevices by tuning the doping position and concentration in the APNRs. Project supported by the National Natural Science Foundation of China (No. 11274096), the University Science and Technology Innovation Team Support Project of Henan Province (No. 13IRTSTHN016), and the University Key Science Research Project of Henan Province (No. 16A140043). The calculations in this work were supported by the High Performance Computing Center of Henan Normal University.

  6. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Astrophysics Data System (ADS)

    Cezzar, Ruknet

    1993-08-01

    The project's main contributions have been in the area of student support. Throughout the project, at least one, and in some cases two, undergraduate students have been supported. By working with the project, these students gained valuable knowledge involving the scientific research project, including the not-so-pleasant reporting requirements to the funding agencies. The other important contribution was towards the establishment of a graduate program in computer science at Hampton University. Primarily, the PAPER project has served as the main research basis in seeking funds from other agencies, such as the National Science Foundation, for establishing a research infrastructure in the department. In technical areas, especially in the first phase, we believe the trip to the Jet Propulsion Laboratory, and gathering together all the pertinent information on experimental computer architectures aimed at planetary exploration, was very helpful. Indeed, if this effort is to be revived in the future due to congressional funding for planetary explorations, say an unmanned mission to Mars, our interim report will be an important starting point. In other technical areas, our simulator has pinpointed and highlighted several important performance issues related to the design of operating system kernels for MIMD machines. In particular, the critical issue of how the kernel itself will run in parallel on a multiple-processor system has been addressed through the various ready-list organization and access policies. In the area of neural computing, our main contribution was an introductory tutorial package to familiarize the researchers at NASA with this new and promising field. Finally, we have introduced the notion of reversibility in programming systems, which may find applications in various areas of space research.

  7. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1993-01-01

    The project's main contributions have been in the area of student support. Throughout the project, at least one, and in some cases two, undergraduate students have been supported. By working with the project, these students gained valuable knowledge involving the scientific research project, including the not-so-pleasant reporting requirements to the funding agencies. The other important contribution was towards the establishment of a graduate program in computer science at Hampton University. Primarily, the PAPER project has served as the main research basis in seeking funds from other agencies, such as the National Science Foundation, for establishing a research infrastructure in the department. In technical areas, especially in the first phase, we believe the trip to the Jet Propulsion Laboratory, and gathering together all the pertinent information on experimental computer architectures aimed at planetary exploration, was very helpful. Indeed, if this effort is to be revived in the future due to congressional funding for planetary explorations, say an unmanned mission to Mars, our interim report will be an important starting point. In other technical areas, our simulator has pinpointed and highlighted several important performance issues related to the design of operating system kernels for MIMD machines. In particular, the critical issue of how the kernel itself will run in parallel on a multiple-processor system has been addressed through the various ready-list organization and access policies. In the area of neural computing, our main contribution was an introductory tutorial package to familiarize the researchers at NASA with this new and promising field. Finally, we have introduced the notion of reversibility in programming systems, which may find applications in various areas of space research.

  8. The EarthKAM project: creating space imaging tools for teaching and learning

    NASA Astrophysics Data System (ADS)

    Dodson, Holly; Levin, Paula; Ride, Sally; Souviney, Randall

    2000-07-01

    The EarthKAM Project is a NASA-supported partnership of secondary and university students with Earth Science and educational researchers. This report describes an ongoing series of activities that more effectively integrate Earth images into classroom instruction. In this project, students select and analyze images of the Earth taken during Shuttle flights and use the tools of modern science (computers, data analysis tools and the Internet) to disseminate the images and results of their research. A related study, the Visualizing Earth Project, explores in greater detail the cognitive aspects of image processing and the educational potential of visualizations in science teaching and learning. The content and organization of the EarthKAM datasystem of images and metadata are also described. An associated project is linking this datasystem of images with the Getty Thesaurus of Geographic Names, which will allow users to access a wide range of geographic and political information for the regions shown in EarthKAM images. Another project will provide tools for automated feature extraction from EarthKAM images. In order to make EarthKAM resources available to a larger number of schools, the next important goal is to create an integrated datasystem that combines iterative resource validation and publication, with multimedia management of instructional materials.

  9. A Computing Infrastructure for Supporting Climate Studies

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team

    2011-12-01

    Climate change is one of the major challenges facing us on planet Earth in the 21st century. Scientists build many models to simulate the past and predict climate change over the next decades or century. Most of the models run at low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA effort on the Climate@Home project to build a supercomputer based on advanced computing technologies, such as cloud computing, grid computing, and others. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching and collecting model run results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations, and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can access Twitter and Facebook to get the latest news about the project. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU Fall Meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences. It will share how the challenges in computation and software integration were solved.
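
    The MapReduce-style engine described in point 2, dispatching model configurations and collecting run statistics, can be pictured with a minimal map/reduce sketch; the parameter grid and scoring function below are placeholders, not the Climate@Home implementation.

        from functools import reduce
        from itertools import product

        # Minimal map/reduce sketch of dispatching model configurations and
        # collecting run statistics (placeholder model and score, not the actual
        # Climate@Home engine).
        def run_model(config):
            # "Map" step: one simulated run per configuration; the score stands in
            # for a skill metric computed against observations.
            sensitivity, resolution = config
            return {"config": config,
                    "score": resolution / (1.0 + abs(sensitivity - 3.0))}

        def keep_best(best, result):
            # "Reduce" step: retain the best-scoring configuration seen so far.
            return result if best is None or result["score"] > best["score"] else best

        configs = list(product([2.0, 3.0, 4.5], [0.5, 1.0]))  # (sensitivity, resolution)
        results = map(run_model, configs)                     # dispatched runs
        print(reduce(keep_best, results, None))               # collected best configuration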

  10. Bay in Flux: Marine Climate Impacts, Art and Tablet App Design

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2012-12-01

    Bay in Flux is a year-long experimental effort to design and develop interactive tablet computer apps exploring the marine impacts of climate change. The goal is to convey, visualize and enliven scientific ideas around this topic, while engaging a broad audience through the design of interactive content. Pioneering new models of scientist-artist collaboration is a central part of the effort as well. The project begins with an innovative studio class at the Rhode Island School of Design (RISD) called Bay in Flux, taught in the Fall 2012 semester. Its three-instructor team includes two artist-designers and one science reporter, with active collaboration from affiliated marine scientists. The subject matter focus is the Narragansett Bay, which has shown physical, chemical and ecological impacts of climate change, along with the ongoing efforts of researchers to explain and characterize it. In exploring this rich story, we intend to pioneer means of handling narrative material in interactive e-books, enable data collection by citizen scientists, or devise game-like simulations to enable audiences to explore and understand complex natural systems. The lessons we seek to learn in this project include: how to effectively encourage collaborations between scientists and designers around digital design; how to pioneer new and compelling ways to tell science-based nonfiction stories on tablets; and how art and design students with no scientific training can engage with complex scientific content effectively. The project will also challenge us to think about the tablet computer not only as a data output device -- in which the user reads, watches, or interacts with provided content -- but also as a dynamic and ideal tool for mobile data input, enabling citizen science projects and novel connections between working researchers and the public. The intended audience could include high school students or older audiences who currently eschew science journalism. HTML5 is the likely language of choice, with the iPad being the initial intended platform. Following the fall class, a spring 2013 effort will involve developing a prototype app. Partners in the Bay in Flux project are the Knight Science Journalism program at MIT, RISD and the National Science Foundation's Rhode Island Experimental Program to Stimulate Competitive Research. Ultimately, the goal is to foster new ways for artists and designers to collaborate with scientists in the environmental field while reaching broad audiences.

  11. e-Infrastructures for e-Sciences 2013 A CHAIN-REDS Workshop organised under the aegis of the European Commission

    NASA Astrophysics Data System (ADS)

    The CHAIN-REDS Project is organising a workshop on "e-Infrastructures for e-Sciences" focusing on Cloud Computing and Data Repositories under the aegis of the European Commission and in co-location with the International Conference on e-Science 2013 (IEEE2013) that will be held in Beijing, P.R. of China on October 17-22, 2013. The core objective of the CHAIN-REDS project is to promote, coordinate and support the effort of a critical mass of non-European e-Infrastructures for Research and Education to collaborate with Europe in addressing interoperability and interoperation of Grids and other Distributed Computing Infrastructures (DCI). From this perspective, CHAIN-REDS will optimise the interoperation of European infrastructures with those present in 6 other regions of the world, both from a development and a use point of view, catering to different communities. Overall, CHAIN-REDS will provide input for future strategies and decision-making regarding collaboration with other regions on e-Infrastructure deployment and availability of related data; it will raise the visibility of e-Infrastructures towards intercontinental audiences, covering most of the world, and will provide support to establish globally connected and interoperable infrastructures, in particular between the EU and the developing regions. Organised by IHEP, INFN and Sigma Orionis with the support of all project partners, this workshop will aim at: - Presenting the state of the art of Cloud computing in Europe and in China and discussing the opportunities offered by having interoperable and federated e-Infrastructures; - Exploring the existing initiatives of Data Infrastructures in Europe and China, and highlighting the Data Repositories of interest for the Virtual Research Communities in several domains such as Health, Agriculture, Climate, etc.

  12. Alaska's Secondary Science Teachers and Students Receive Earth Systems Science Knowledge, GIS Know-How and University Technical Support for Pre-College Research Experiences: The EDGE Project

    NASA Astrophysics Data System (ADS)

    Connor, C. L.; Prakash, A.

    2007-12-01

    Alaska's secondary school teachers are increasingly required to provide Earth systems science (ESS) education that integrates student observations of local natural processes related to rapid climate change with geospatial datasets and satellite imagery using Geographic Information Systems (GIS) technology. Such skills are also valued in various employment sectors of the state, where job opportunities requiring Earth science and GIS training are increasing. The University of Alaska's EDGE (Experiential Discoveries in Geoscience Education) program has provided training and classroom resources for 3 cohorts of in-service Alaska science and math teachers in GIS and Earth Systems Science (2005-2007). Summer workshops include geologic field experiences, GIS instruction, computer equipment and technical support for groups of Alaska high school (HS) and middle school (MS) science teachers each June and their students in August. Since 2005, EDGE has increased Alaska science and math teachers' Earth science content knowledge and developed their GIS and computer skills. In addition, EDGE has guided teachers through a follow-up fall online course that provided more extensive ESS knowledge linked with classroom standards, with course content that was directly transferable into their MS and HS science classrooms. EDGE teachers were mentored by University faculty and technical staff as they guided their own students through semester-scale, science-fair-style projects using student-collected geospatial data. EDGE program assessment indicates that all teachers have improved their ESS knowledge, GIS knowledge, and the use of technology in their classrooms. More than 230 middle school students have learned GIS from EDGE teachers, and 50 EDGE secondary students have conducted original research related to landscape change and its impacts on their own communities. Longer-term EDGE goals include improving student performance on the newly implemented (spring 2008) 10th grade, standards-based High School Qualifying Exam, recruiting first-generation college students, and increasing the number of Earth science majors in the University of Alaska system.

  13. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design

    PubMed Central

    Alford, Rebecca F.; Dolan, Erin L.

    2017-01-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology. PMID:29216185

  14. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    PubMed

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  15. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  16. Improving Mobile Infrastructure for Pervasive Personal Computing

    DTIC Science & Technology

    2007-11-01

    ...fulfillment of the requirements for the degree of Master of Science. Copyright © 2007 Ajay Surie. This research was supported by the National Science Foundation (NSF) under grant number CNS-0509004 and by the Army Research Office (ARO) through grant number DAAD19-02-1-0389 ("Perpetually Available and Secure...").

  17. Closing the race and gender gaps in computer science education

    NASA Astrophysics Data System (ADS)

    Robinson, John Henry

    Life in a technological society brings new paradigms and pressures to bear on education. These pressures are magnified for underrepresented students and must be addressed if they are to play a vital part in society. Educational pipelines need to be established to provide at-risk students with the means and opportunity to succeed in science, technology, engineering, and mathematics (STEM) majors. STEM educational pipelines are programs consisting of components that seek to facilitate students' completion of a college degree by providing access to higher education, intervention, mentoring, support infrastructure, and programs that encourage academic success. Successes in the STEM professions mean that more educators, scientists, engineers, and researchers will be available to add diversity to the professions and to provide role models for future generations. The issues that the educational pipelines must address include improving at-risk groups' perceptions and awareness of the math, science, and engineering professions. Additionally, the educational pipelines must provide intervention in math preparation, overcome gender and race socialization, and provide mentors and counseling to help students achieve better self-perceptions and provide positive role models. This study was designed to explore the underrepresentation of minorities and women in the computer science major at Rowan University through a multilayered action research methodology. The purpose of this research study was to define and understand the needs of underrepresented students in computer science, to examine current policies and enrollment data for Rowan University, to develop a historical profile of the Computer Science program from the standpoint of ethnicity and gender enrollment in order to ascertain trends in students' choice of computer science as a major, and to determine whether raising awareness about computer science for incoming freshmen and providing an alternate route into the computer science major would entice more women and minorities to pursue a degree in computer science at Rowan University. Finally, this study examined my espoused leadership theories and my leadership theories in use through reflective practices as I progressed through the cycles of this project. The outcomes of this study indicated a large downward trend in the enrollment of women in computer science and a relatively flat trend in minority enrollment. The enrollment data at Rowan University were found to follow a nationwide trend for underrepresented students' enrollment in STEM majors. The study also indicated that students' mental models are based upon their race and gender socialization and their understanding of the world and society. The mental models were shown to play a large role in the students' choice of major. Finally, a computer science pipeline was designed and piloted as part of this study in an attempt to entice more students into the major and facilitate their success. Additionally, the mental models of the participants were challenged through interactions to make them aware of what possibilities are available with a degree in computer science. The entire study was wrapped in my leadership, which was practiced and studied over the course of this work.

  18. Meteor Observations as Big Data Citizen Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.

    2016-12-01

    Meteor science represents an excellent example of citizen science, where progress in the field has been largely determined by amateur observations. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth network (http://bigskyearth.eu/).

  19. Educational process in modern climatology within the web-GIS platform "Climate"

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    The problem of training scientists, common to all scientific fields, is exacerbated these days in the environmental sciences by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for information support of integrated research in the geosciences was created based on modern information and computational technologies, and the software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to the direct analysis of geophysical data archives, the platform is aimed at teaching the basics of the study of changes in regional climate. The educational component of the platform includes a series of lectures on climate, environmental and meteorological modeling, and laboratory work cycles on the basics of analysis of current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.

  20. Mobile game development: improving student engagement and motivation in introductory computing courses

    NASA Astrophysics Data System (ADS)

    Kurkovsky, Stan

    2013-06-01

    Computer games have been accepted as an engaging and motivating tool in the computer science (CS) curriculum. However, designing and implementing a playable game is challenging, and is best done in advanced courses. Games for mobile devices, on the other hand, offer the advantage of being simpler and, thus, easier for lower-level students to program. The learning context of mobile game development can be used to reinforce many core programming topics, such as loops, classes, and arrays. Furthermore, it can also be used to expose students in introductory computing courses to a wide range of advanced topics in order to illustrate that CS can be much more than coding. This paper describes the author's experience with using mobile game development projects in CS I and II, how these projects were integrated into existing courses at several universities, and the lessons learned from this experience.

  1. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers and, with the advent of ubiquitous multicore processors, on practically every system, has been accomplished with basic software tools: typically command-line compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP and MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to the scientific applications themselves and to understand shortcomings in Eclipse PTP from an application developer's perspective, which informs the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse community development. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, with computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately encouraging development and maintenance of testing suites -- things that have become commonplace in many software endeavors, but have lagged in the development of science applications. We view the increased complexity of both HPC systems and science applications as demanding the use of better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.

  2. An urban area minority outreach program for K-6 children in space science

    NASA Astrophysics Data System (ADS)

    Morris, P.; Garza, O.; Lindstrom, M.; Allen, J.; Wooten, J.; Sumners, C.; Obot, V.

    The Houston area has minority populations with significant school dropout rates. This is similar to other major cities in the United States and elsewhere in the world where there are significant minority populations from rural areas. The student dropout rates are associated in many instances with the absence of educational support opportunities either from the school and/or from the family. This is exacerbated if the student has poor English language skills. To address this issue, a NASA minority university initiative enabled us to develop a broad-based outreach program that includes younger children and their parents at a primarily Hispanic inner city charter school. The program at the charter school was initiated by teaching computer skills to the older children, who in turn taught parents. The older children were subsequently asked to help teach a computer literacy class for mothers with 4-5 year old children. The computers initially intimidated the mothers as most had limited educational backgrounds and English language skills. To practice their newly acquired computer skills and learn about space science, the mothers and their children were asked to pick a space project and investigate it using their computer skills. The mothers and their children decided to learn about black holes. The project included designing space suits for their children so that they could travel through space and observe black holes from a closer proximity. The children and their mothers learned about computers and how to use them for educational purposes. In addition, they learned about black holes and the importance of space suits in protecting astronauts as they investigated space. The parents are proud of their children and their achievements. By including the parents in the program, they have a greater understanding of the importance of their children staying in school and the opportunities for careers in space science and technology. For more information on our overall program, the charter school and their other space science related activities, visit their web site, http://www.tccc-ryss.org/solarsys/solarmingrant.htm

  3. Designing and Implementing a Computational Methods Course for Upper-level Undergraduates and Postgraduates in Atmospheric and Oceanic Sciences

    NASA Astrophysics Data System (ADS)

    Nelson, E.; L'Ecuyer, T. S.; Douglas, A.; Hansen, Z.

    2017-12-01

    In the modern computing age, scientists must utilize a wide variety of skills to carry out scientific research. Programming, including a focus on collaborative development, has become more prevalent in both academic and professional career paths. Faculty in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin—Madison recognized this need and recently approved a new course offering for undergraduates and postgraduates in computational methods that was first held in Spring 2017. Three programming languages were covered in the inaugural course semester and development themes such as modularization, data wrangling, and conceptual code models were woven into all of the sections. In this presentation, we will share successes and challenges in developing a research project-focused computational course that leverages hands-on computer laboratory learning and open-sourced course content. Improvements and changes in future iterations of the course based on the first offering will also be discussed.

  4. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative web-based computing infrastructure project that has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities needed to wrap, manage, and use specialized noncommercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of both human and material resources. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface for choosing applications, entering data, and selecting appropriate constraints in an intuitive and easy way. After that, the application is executed in real time whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis, or other manipulations), is shown via the same web interface or can be downloaded to the user's computer.

  5. Intelligent Monitoring of Rocket Test Systems

    NASA Technical Reports Server (NTRS)

    Duran, Esteban; Rocha, Stephanie; Figueroa, Fernando

    2016-01-01

    Stephanie Rocha is an undergraduate student pursuing a degree in Mechanical Engineering. Esteban Duran is pursuing a degree in Computer Science. Our mentor is Fernando Figueroa. Our project involved developing Intelligent Health Monitoring at the High Pressure Gas Facility (HPGF) utilizing the Gensym G2 software.

  6. Chaos: A Topic for Interdisciplinary Education in Physics

    ERIC Educational Resources Information Center

    Bae, Saebyok

    2009-01-01

    Since society and science need interdisciplinary works, the interesting topic of chaos is chosen for interdisciplinary education in physics. The educational programme contains various university-level activities such as computer simulations, chaos experiment and team projects besides ordinary teaching. According to the participants, the programme…

  7. A Multidimensional Software Engineering Course

    ERIC Educational Resources Information Center

    Barzilay, O.; Hazzan, O.; Yehudai, A.

    2009-01-01

    Software engineering (SE) is a multidimensional field that involves activities in various areas and disciplines, such as computer science, project management, and system engineering. Though modern SE curricula include designated courses that address these various subjects, an advanced summary course that synthesizes them is still missing. Such a…

  8. Los Alamos Science Facilities

    Science.gov Websites

  9. Frontiers in Science Lectures

    Science.gov Websites

  10. Dynamic Interactions for Network Visualization and Simulation

    DTIC Science & Technology

    2009-03-01

  11. Science Projects | Akron-Summit County Public Library

    Science.gov Websites

  12. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

    Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer an opportunity for broader end-user input, a crowd-sourced approach will engage stakeholders not otherwise involved. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental evaluators from the social sciences embedded in the project provide real-time review and adjustments. While a large number of agencies and organizations have agreed to participate, in order to ensure an open and inclusive process, community-selected leaders yet to be identified will play key roles through an Assembly Advisory Council. Once consensus is reached on a governing framework, a community-selected demonstration governance pilot will help facilitate community convergence on system design.

  13. Collaborative Project. Understanding the effects of tides and eddies on the ocean dynamics, sea ice cover and decadal/centennial climate prediction using the Regional Arctic Climate Model (RACM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchings, Jennifer; Joseph, Renu

    2013-09-14

    The goal of this project is to develop an eddy resolving ocean model (POP) with tides coupled to a sea ice model (CICE) within the Regional Arctic System Model (RASM) to investigate the importance of ocean tides and mesoscale eddies in arctic climate simulations and quantify biases associated with these processes and how their relative contribution may improve decadal to centennial arctic climate predictions. Ocean, sea ice and coupled arctic climate response to these small scale processes will be evaluated with regard to their influence on mass, momentum and property exchange between oceans, shelf-basin, ice-ocean, and ocean-atmosphere. The project will facilitate the future routine inclusion of polar tides and eddies in Earth System Models when computing power allows. As such, the proposed research addresses the science in support of the BER's Climate and Environmental Sciences Division Long Term Measure as it will improve the ocean and sea ice model components as well as the fully coupled RASM and Community Earth System Model (CESM) and it will make them more accurate and computationally efficient.

  14. Radio Synthesis Imaging - A High Performance Computing and Communications Project

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard M.

    The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.

  15. Development of an Integrated, Computer-Based Bibliographical Data System for a Large University Library. Annual Report to the National Science Foundation from the University of Chicago Library, 1966/67.

    ERIC Educational Resources Information Center

    Fussler, Herman; Payne, Charles T.

    Part I is a discussion of the following project tasks: A) development of an on-line, real-time bibliographic data processing system; B) implementation in library operations; C) character sets; D) Project MARC; E) circulation; and F) processing operation studies. Part II is a brief discussion of efforts to work out cooperative library systems…

  16. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observing System Data and Information System) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  17. Data science for mental health: a UK perspective on a global challenge.

    PubMed

    McIntosh, Andrew M; Stewart, Robert; John, Ann; Smith, Daniel J; Davis, Katrina; Sudlow, Cathie; Corvin, Aiden; Nicodemus, Kristin K; Kingdon, David; Hassan, Lamiece; Hotopf, Matthew; Lawrie, Stephen M; Russ, Tom C; Geddes, John R; Wolpert, Miranda; Wölbert, Eva; Porteous, David J

    2016-10-01

    Data science uses computer science and statistics to extract new knowledge from high-dimensional datasets (ie, those with many different variables and data types). Mental health research, diagnosis, and treatment could benefit from data science that uses cohort studies, genomics, and routine health-care and administrative data. The UK is well placed to trial these approaches through robust NHS-linked data science projects, such as the UK Biobank, Generation Scotland, and the Clinical Record Interactive Search (CRIS) programme. Data science has great potential as a low-cost, high-return catalyst for improved mental health recognition, understanding, support, and outcomes. Lessons learnt from such studies could have global implications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Laboratory for Computer Science Progress Report 21, July 1983-June 1984.

    DTIC Science & Technology

    1984-06-01

    Excerpt from the report's table of contents: Distributed Consensus; Election of a Leader in a Distributed Ring of Processors; Distributed Network Algorithms; Diagnosis... A facility related to multiprocessor systems, funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency, is also described. Academic staff listed include P. Szolovits (Group Leader) and R. Patil, with collaborating investigators including M. Criscitiello, M.D., of Tufts-New England Medical Center Hospital.

  19. miniTri Mantevo miniapp v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Johathan; Stark, Dylan; Wolf, Michael

    2016-02-02

    miniTri is a miniapplication developed as part of the Mantevo project. Given a graph, miniTri enumerates all triangles in this graph and computes a metric for each triangle based on the triangle edge and vertex degree. The output of miniTri is a summary of this metric. miniTri mimics the computational requirements of an important set of data science applications. Several approaches to this problem are included in the miniTri software.
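
    As a rough illustration of the kind of computation miniTri performs (this sketch is not the miniTri code, and the metric below is only a placeholder), the following Python fragment enumerates the triangles of a small undirected graph and computes a simple per-triangle value from vertex degrees.

      # Enumerate triangles and compute a degree-based per-triangle value.
      from collections import defaultdict
      from itertools import combinations

      def enumerate_triangles(edges):
          adj = defaultdict(set)
          for u, v in edges:
              if u != v:
                  adj[u].add(v)
                  adj[v].add(u)
          triangles = []
          for u in adj:
              # Consider only neighbors larger than u so each triangle is found once.
              for v, w in combinations(sorted(n for n in adj[u] if n > u), 2):
                  if w in adj[v]:
                      triangles.append((u, v, w))
          return adj, triangles

      def triangle_value(adj, tri):
          # Placeholder metric: sum of vertex degrees minus the triangle's own edges.
          return sum(len(adj[v]) for v in tri) - 3

      adj, tris = enumerate_triangles([(0, 1), (1, 2), (0, 2), (2, 3), (1, 3)])
      print({t: triangle_value(adj, t) for t in tris})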

  20. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    ...are SPF/PC, MS Word, n3, Symphony, Mathematics, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of the Monte Carlo and other numerical techniques in computer simulation... the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400...

  1. The National Cancer Institute's Physical Sciences - Oncology Network

    NASA Astrophysics Data System (ADS)

    Espey, Michael Graham

    In 2009, the NCI launched the Physical Sciences - Oncology Centers (PS-OC) initiative with 12 Centers (U54) funded through 2014. The current phase of the Program includes U54 funded Centers with the added feature of soliciting new Physical Science - Oncology Projects (PS-OP) U01 grant applications through 2017; see NCI PAR-15-021. The PS-OPs, individually and along with other PS-OPs and the Physical Sciences-Oncology Centers (PS-OCs), comprise the Physical Sciences-Oncology Network (PS-ON). The foundation of the Physical Sciences-Oncology initiative is a high-risk, high-reward program that promotes a `physical sciences perspective' of cancer and fosters the convergence of physical science and cancer research by forming transdisciplinary teams of physical scientists (e.g., physicists, mathematicians, chemists, engineers, computer scientists) and cancer researchers (e.g., cancer biologists, oncologists, pathologists) who work closely together to advance our understanding of cancer. The collaborative PS-ON structure catalyzes transformative science through increased exchange of people, ideas, and approaches. PS-ON resources are leveraged to fund Trans-Network pilot projects to enable synergy and cross-testing of experimental and/or theoretical concepts. This session will include a brief PS-ON overview followed by a strategic discussion with the APS community to exchange perspectives on the progression of trans-disciplinary physical sciences in cancer research.

  2. Application of SLURM, BOINC, and GlusterFS as Software System for Sustainable Modeling and Data Analytics

    NASA Astrophysics Data System (ADS)

    Kashansky, Vladislav V.; Kaftannikov, Igor L.

    2018-02-01

    Modern numerical modeling experiments and data analytics problems in various fields of science and technology reveal a wide variety of serious requirements for distributed computing systems. Many scientific computing projects sometimes exceed the available resource pool limits, requiring extra scalability and sustainability. In this paper we share our own experience and findings on combining the power of SLURM, BOINC and GlusterFS as a software system for scientific computing. In particular, we suggest a complete architecture and highlight important aspects of systems integration.
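
    To illustrate only the SLURM side of such an architecture, the sketch below submits one task of a parameter sweep as a batch job whose log lands on shared storage; the /mnt/gluster path, the model_run.py script, and the resource limits are assumptions made for illustration, and the only requirement is that the sbatch command is available.

      # Submit a single sweep task through SLURM; sbatch reads the script from stdin.
      import subprocess, textwrap

      def submit(task_id: int) -> str:
          script = textwrap.dedent(f"""\
              #!/bin/bash
              #SBATCH --job-name=sweep_{task_id}
              #SBATCH --ntasks=1
              #SBATCH --time=00:10:00
              #SBATCH --output=/mnt/gluster/results/sweep_{task_id}.log
              srun python model_run.py --task {task_id}
              """)
          out = subprocess.run(["sbatch"], input=script, text=True,
                               capture_output=True, check=True)
          return out.stdout.strip()  # e.g. "Submitted batch job 12345"

      print(submit(0))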

  3. Federated data storage system prototype for LHC experiments and data intensive science

    NASA Astrophysics Data System (ADS)

    Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.

    2017-10-01

    Rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work, at the same time giving an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and reformations of computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.

  4. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programming software, to create their own inventions. One…

  5. Science and Innovation at Los Alamos

    Science.gov Websites

  6. Growth Potential

    ERIC Educational Resources Information Center

    Barry, Dana M.

    2004-01-01

    Students enjoy carrying out an exciting and challenging research project that combines science with computers and mathematics to investigate how polyacrylate animals change in size over time when placed in water and aqueous salt solutions. The hands-on activity motivates students and provides them with the necessary skills and information to have…

  7. Fusion Sciences Education Outreach in the Middle Schools, an Unplanned Case Study

    NASA Astrophysics Data System (ADS)

    Danielson, C. A.

    1997-11-01

    Before bringing a class to General Atomics (GA) for the DIII--D educational tour, the teacher is provided with pre-tour materials which include a videotape, curriculum notebook and fusion poster. These materials are used in the classroom to familiarize students with fusion concepts before the tour. This presentation will focus on the results of projects by 7th grade students at Chula Vista Junior High School (a magnet school for performing arts with a majority of Hispanic students). The assignment given by physics teacher Caryn Hoffman to her students prior to the tour was to focus on one or two of the DIII--D tour guides, ask questions relating to their careers in science, and then prepare a presentation based on their interviews and their tour experience. The completed projects were very diverse -- calendars, comic strips, newspapers, plays, and board games were some of the media the students used. Tour guides selected by the students included physicists, designers and computer support personnel. Project results reflected a surprisingly good understanding of fusion science concepts. Subsequent classroom interviews with the students demonstrated an overall increase in science interest and a specific interest in plasma and fusion research.

  8. Montage Version 3.0

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

    The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation -- a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.
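
    Montage itself is not reproduced here, but the core idea of reprojecting FITS images onto a common WCS and co-adding them into a mosaic can be sketched with the astropy-ecosystem reproject package (assumed to be installed); the input file names are placeholders.

      # Reproject two overlapping FITS images onto an optimal common WCS and co-add.
      from astropy.io import fits
      from reproject import reproject_interp
      from reproject.mosaicking import find_optimal_celestial_wcs, reproject_and_coadd

      hdus = [fits.open(name)[0] for name in ("tile_a.fits", "tile_b.fits")]
      wcs_out, shape_out = find_optimal_celestial_wcs(hdus)
      mosaic, footprint = reproject_and_coadd(hdus, wcs_out, shape_out=shape_out,
                                              reproject_function=reproject_interp)
      fits.writeto("mosaic.fits", mosaic, wcs_out.to_header(), overwrite=True)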

  9. Development of the virtual research environment for analysis, evaluation and prediction of global climate change impacts on the regional environment

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander

    2017-04-01

    The description and first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at development of an Internet-accessible computation and information environment providing specialists unskilled in numerical modelling and software design, decision-makers and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and instruments for detailed analysis, assessment and prediction of impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the leading Russian institution involved in research of climate change and its consequences. The anticipated results will create a pathway for development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The Project includes several major directions of research listed below. 1. Preparation of geo-referenced data sets describing the dynamics of the current and possible future climate and environmental changes in detail. 2. Improvement of methods of analysis of climate change. 3. Enhancing the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of an applicable detailed description of climate change in Western Siberia, and dissemination of the Project results. Results of the first stage of the Project implementation are presented. This work is supported by the Russian Science Foundation grant No16-19-10257.

  10. Avenues for crowd science in Hydrology.

    NASA Astrophysics Data System (ADS)

    Koch, Julian; Stisen, Simon

    2016-04-01

    Crowd science describes research that is conducted with the participation of the general public (the crowd) and gives the opportunity to involve the crowd in research design, data collection and analysis. In various fields, scientists have already drawn on underused human resources to advance research at low cost, with high transparency and large acceptance by the public due to the bottom-up structure and the participatory process. Within the hydrological sciences, crowd research has quite recently become more established in the form of crowd observatories that generate hydrological data on water quality, precipitation or river flow. These innovative observatories complement more traditional ways of monitoring hydrological data and strengthen community-based environmental decision making. However, the full potential of crowd science lies in internet-based participation of the crowd, and it is not yet fully exploited in the field of Hydrology. New avenues that are not primarily based on the outsourcing of labor, but instead capitalize on the full potential of human capabilities, have to emerge. In many realms of complex problem solving, such as image detection, optimization tasks, and narrowing of possible solutions, humans still remain more effective than computer algorithms. The most successful online crowd science projects, Foldit and Galaxy Zoo, have proven that a collective of tens of thousands of users can clearly outperform traditional computer-based science approaches. Our study takes advantage of well-trained human perception to conduct a spatial sensitivity analysis of land-surface variables of a distributed hydrological model and to identify the most sensitive spatial inputs. True spatial performance metrics that quantitatively compare patterns are not trivial to choose, and their applicability is often not universal. Humans, on the other hand, can quickly integrate spatial information at various scales and are therefore well suited to this task. We selected Zooniverse, the most popular crowd science platform, where over a million registered users contribute to various research projects, to build a survey of human perception. The survey will be shown during the interactive discussion, but moreover, for building future avenues of crowd science in Hydrology, the following questions should be discussed: (1) What hydrological problems are suitable for an internet-based crowd science application? (2) How to abstract the complex problem to a medium that appeals to the crowd? (3) How to secure good science with reliable results? (4) Can the crowd replace existing and established computer-based applications like parameter optimization or forecasting at all?

  11. Research and Development Annual Report, 1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 42 additional JSC projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  12. The JSC Research and Development Annual Report 1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 47 additional projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  13. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences are entering an era of big data, driven by breakthroughs in science and technology, and a growing number of big data-related projects and activities are underway worldwide. Life sciences data generated by new technologies continue to grow rapidly, not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and its heterogeneous, complex variety are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web that aims to provide information not only for humans but also for computers to semantically process large-scale data. The paper presents a survey of big data in the life sciences, big data-related projects and Semantic Web technologies. It introduces the main Semantic Web technologies and their current state, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps readers understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.
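
    To make the abstract's point about cross-source integration concrete, the following is a minimal, hedged sketch (hypothetical data and namespace, not taken from the paper) of how RDF triples from two heterogeneous life-science sources can be merged into one graph and answered with a single SPARQL query, using the Python rdflib library.

      # Minimal sketch: merge triples from two hypothetical sources and query them.
      from rdflib import Graph, Literal, Namespace, RDF, URIRef

      EX = Namespace("http://example.org/lifesci/")   # hypothetical namespace

      g = Graph()
      gene = URIRef(EX["gene/BRCA1"])
      protein = URIRef(EX["protein/P38398"])

      # Triples that might come from a genomics source ...
      g.add((gene, RDF.type, EX.Gene))
      g.add((gene, EX.encodes, protein))
      # ... and from a separate proteomics source, merged into the same graph.
      g.add((protein, RDF.type, EX.Protein))
      g.add((protein, EX.associatedDisease, Literal("breast cancer")))

      # One SPARQL query spans both sources once they share URIs and vocabulary.
      query = """
      PREFIX ex: <http://example.org/lifesci/>
      SELECT ?gene ?disease WHERE {
          ?gene ex:encodes ?protein .
          ?protein ex:associatedDisease ?disease .
      }
      """
      for row in g.query(query):
          print(row.gene, "->", row.disease)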

  14. Point spread function modeling and image restoration for cone-beam CT

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Huang, Kui-Dong; Shi, Yi-Kai; Xu, Zhe

    2015-03-01

    X-ray cone-beam computed tomography (CT) offers notable advantages such as high efficiency and precision and is widely used in medical imaging and industrial non-destructive testing, but inherent imaging degradation reduces the quality of CT images. To address the problems of projection image degradation and restoration in cone-beam CT, a point spread function (PSF) modeling method is first proposed. A general PSF model of cone-beam CT is established; based on it, the PSF under arbitrary scanning conditions can be calculated directly for projection image restoration without additional measurements, which greatly improves the practical convenience of cone-beam CT. Secondly, a projection image restoration algorithm based on pre-filtering and pre-segmentation is proposed, which makes the edge contours in projection images and slice images clearer after restoration while keeping the noise at a level equivalent to that of the original images. Finally, experiments verified the feasibility and effectiveness of the proposed methods. Supported by the National Science and Technology Major Project of the Ministry of Industry and Information Technology of China (2012ZX04007021), the Young Scientists Fund of the National Natural Science Foundation of China (51105315), the Natural Science Basic Research Program of Shaanxi Province of China (2013JM7003) and the Northwestern Polytechnical University Foundation for Fundamental Research (JC20120226, 3102014KYJD022).
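
    The abstract does not give the restoration algorithm in detail; as a hedged illustration of the general idea of PSF-based projection image restoration, the sketch below performs a standard frequency-domain Wiener deconvolution with a synthetic Gaussian PSF (the PSF model, noise-to-signal ratio and image are assumptions, not the paper's method).

      # Minimal sketch: restore a blurred projection image given a known PSF.
      import numpy as np
      from scipy.signal import fftconvolve

      def gaussian_psf(size, sigma):
          """Hypothetical stand-in for a modeled cone-beam CT PSF."""
          ax = np.arange(size) - size // 2
          xx, yy = np.meshgrid(ax, ax)
          psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
          return psf / psf.sum()

      def wiener_deconvolve(blurred, psf, nsr=0.01):
          """Wiener filter: F_hat = H* G / (|H|^2 + NSR)."""
          # Zero-pad the PSF to the image size and shift its centre to the origin.
          pad = np.zeros_like(blurred, dtype=float)
          pad[:psf.shape[0], :psf.shape[1]] = psf
          pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
          H = np.fft.fft2(pad)
          G = np.fft.fft2(blurred)
          F_hat = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
          return np.real(np.fft.ifft2(F_hat))

      # Hypothetical usage: blur a synthetic projection, then restore it.
      rng = np.random.default_rng(1)
      image = rng.random((256, 256))
      psf = gaussian_psf(15, 2.0)
      blurred = fftconvolve(image, psf, mode="same")
      restored = wiener_deconvolve(blurred, psf, nsr=0.005)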

  15. Enlarging the STEM pipeline working with youth-serving organizations

    NASA Astrophysics Data System (ADS)

    Porro, I.

    2005-12-01

    The After-School Astronomy Project (ASAP) is a comprehensive initiative to promote the pursuit of science learning among underrepresented youth. To this end, ASAP specifically aims at building the capacity of urban community-based centers to deliver innovative out-of-school science programming to their youth. ASAP uses a modular curriculum consisting of a combination of hands-on activities and youth-led explorations of the night sky using MicroObservatory. Through project-based investigations, students reinforce their learning in astronomy and develop an understanding of science as inquiry, while also developing communication and computer skills. Through MicroObservatory, students gain access to a network of educational telescopes that they control over the Internet, software analysis tools, and an online community of users. An integral part of ASAP is providing professional development opportunities for after-school workers. This promotes a self-sustaining, long-term implementation of ASAP and fosters the creation of a cadre of after-school professionals dedicated to facilitating science-based programs.

  16. Detection and Characterisation of Meteors as a Big Data Citizen Science project

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.

    2017-12-01

    Out of a total of around 50,000 meteorites currently known to science, the atmospheric passage has been recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while observations of meteors add thousands of new entries per month to existing databases, they only very rarely lead to a meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades, technological advances in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to broaden its experimental and theoretical horizons and to pursue more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaboration with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and should tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps build the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvement and highlight an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth network (http://bigskyearth.eu/).

  17. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting the increasingly data-intensive Earth science (ES) studies needed to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) a lack of analytic functions and computing resources (e.g., analysis software, computing models, and high-performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove these two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community by 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability, and it greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness for building more capable computing infrastructure for ES studies that can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students in their formal or informal educational training, research projects, career development, and lifelong learning.
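
    As a hedged sketch of the interoperable geospatial Web service pattern the abstract describes (the endpoint URL and layer name are hypothetical, and this is not GeoBrain's actual API), the snippet below uses the OWSLib package to discover layers on an OGC Web Map Service and request a rendered map on demand.

      # Minimal sketch: discover and retrieve data from a hypothetical OGC WMS.
      from owslib.wms import WebMapService

      # Hypothetical endpoint; any OGC-compliant WMS is accessed the same way.
      wms = WebMapService("https://example.org/geodata/wms", version="1.3.0")

      # Discover what layers the remote service offers.
      for name, layer in wms.contents.items():
          print(name, "-", layer.title)

      # Request a rendered map of one layer over a bounding box (on-demand access).
      response = wms.getmap(
          layers=["land_surface_temperature"],   # hypothetical layer name
          styles=[""],
          srs="EPSG:4326",
          bbox=(-125.0, 24.0, -66.0, 50.0),
          size=(800, 400),
          format="image/png",
      )
      with open("lst.png", "wb") as f:
          f.write(response.read())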

  18. Principles for Integrating Mars Analog Science, Operations, and Technology Research

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, the scientific community and NASA used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. Human factors studies (Harrison, Clearwater, & McKay 1991; Stuster 1996) have focused on the effects of isolation in extreme environments. More recently, with the advent of wireless computing, we have prototyped advanced EVA technologies for navigation, scheduling, and science data logging (Clancey 2002b; Clancey et al., in press). Combining these interests in a single expedition enables tremendous synergy and authenticity, as pioneered by Pascal Lee's Haughton-Mars Project (Lee 2001; Clancey 2000a) and the Mars Society's research stations on a crater rim on Devon Island in the High Canadian Arctic (Clancey 2000b; 2001b) and in the Morrison Formation of southeast Utah (Clancey 2002a). Based on this experience, the following principles are proposed for conducting an integrated science, operations, and technology research program at analog sites: 1) Authentic work; 2) PI-based projects; 3) Unencumbered baseline studies; 4) Closed simulations; and 5) Observation and documentation. Following these principles, we have been integrating field science, operations research, and technology development at analog sites on Devon Island and in Utah over the past five years. Analytic methods include work practice simulation (Clancey 2002c; Sierhuis et al., 2000a,b), by which the interaction of human behavior, facilities, geography, tools, and procedures is formalized in computer models. These models are then converted into the runtime EVA system we call mobile agents (Clancey 2002b; Clancey et al., in press). Furthermore, we have found that the Apollo Lunar Surface Journal (Jones, 1999) provides a vast repository for understanding astronaut and CapCom interactions, serving as a baseline for Mars operations and quickly highlighting opportunities for computer automation (Clancey, in press).

  19. Industry/Postsecondary Education Partnership for Faculty Development.

    ERIC Educational Resources Information Center

    Zanville, Holly

    The project addressed the need for Oregon higher education faculty to receive state-of-the-art information from Oregon businesses and industries in computer science, business, and engineering areas. Planning for a statewide interactive Educational Television Network (ED-NET) has been underway in Oregon for several years. The network will involve…

  20. The Evolution of Untethered Communications.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    In response to a request from the Defense Advanced Research Projects Agency (DARPA), the Computer Science and Telecommunications Board (CSTB) of the National Research Council initiated a one-year study on untethered communications in July 1996. To carry out the study, the CSTB appointed a committee of 15 wireless-technology experts, including…
