Sample records for computational research projects

  1. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three-year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  2. Graphics supercomputer for computational fluid dynamics research

    NASA Astrophysics Data System (ADS)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room has been converted into a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  3. 78 FR 26626 - Applications for New Awards; National Institute on Disability and Rehabilitation Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... Rehabilitation Research--Disability and Rehabilitation Research Projects--Inclusive Cloud and Web Computing... Rehabilitation Research Projects (DRRPs)--Inclusive Cloud and Web Computing Notice inviting applications for new...#DRRP . Priorities: Priority 1--DRRP on Inclusive Cloud and Web Computing-- is from the notice of final...

  4. French Plans for Fifth Generation Computer Systems.

    DTIC Science & Technology

    1984-12-07

    ...centrally managed project in France that covers all the facets of the... Centre National de Recherche Scientifique (CNRS) Cooperative Research... systems, man-computer interaction, novel computer structures, knowledge-based computer systems... French industry in electronics, computers, software, and services and to make... of Japan's Fifth Generation Project, the French scientific and industrial com-... The National Projects... The French Ministry of Research and...

  5. Good enough practices in scientific computing.

    PubMed

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources, from our daily lives, and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.
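The project-organization practice described in this record (each project in its own directory, with subdirectories for documents, data, results, source, and external programs) can be sketched as a small script; the script and its README text are illustrative, not taken from the paper:

```python
# Sketch of a "good enough" project skeleton (doc/, data/, results/, src/, bin/).
# The helper name and README contents are illustrative assumptions.
from pathlib import Path

def init_project(root: str) -> list[str]:
    """Create the recommended skeleton and return the directories made."""
    layout = {
        "doc": "text documents associated with the project",
        "data": "raw data and metadata",
        "results": "files generated during cleanup and analysis",
        "src": "project source code",
        "bin": "external scripts and compiled programs",
    }
    created = []
    for name, purpose in layout.items():
        d = Path(root) / name
        d.mkdir(parents=True, exist_ok=True)
        # A short README in each directory records its role for collaborators.
        (d / "README.md").write_text(f"{purpose}\n")
        created.append(str(d))
    return created

print(init_project("my_project"))
```

Running the script once per new project keeps every analysis laid out the same way, which is the point of the practice: collaborators can find data, code, and results without asking.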

  6. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
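The booked-versus-used efficiency idea in this record can be sketched as a computation over job accounting records; the record fields and the metric below are illustrative assumptions, not the study's actual schema or formula:

```python
# Hypothetical sketch of a cluster-job efficiency metric: the fraction of
# booked core-hours a job actually used. Field names are assumptions, not
# the accounting schema used in the UPPMAX study.
from dataclasses import dataclass

@dataclass
class JobRecord:
    cores_booked: int       # cores reserved by the scheduler
    wall_hours: float       # elapsed wall-clock hours
    cpu_hours_used: float   # CPU hours actually consumed

def core_hour_efficiency(job: JobRecord) -> float:
    """Used CPU hours divided by booked core-hours (0.0 to 1.0)."""
    booked = job.cores_booked * job.wall_hours
    if booked == 0:
        return 0.0
    return min(job.cpu_hours_used / booked, 1.0)

def mean_efficiency(jobs: list[JobRecord]) -> float:
    """Average efficiency across a set of jobs, e.g. one project's jobs."""
    if not jobs:
        return 0.0
    return sum(core_hour_efficiency(j) for j in jobs) / len(jobs)

jobs = [
    JobRecord(cores_booked=16, wall_hours=2.0, cpu_hours_used=8.0),  # mostly idle booking
    JobRecord(cores_booked=4, wall_hours=1.0, cpu_hours_used=4.0),   # fully used
]
print(f"mean efficiency: {mean_efficiency(jobs):.2f}")
```

Aggregating such a metric per project is one way an e-infrastructure could spot the inefficient bookings the authors describe, and it is the kind of signal active monitoring can act on.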

  7. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Abstract Background Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  8. Argonne's Magellan Cloud Computing Research Project

    ScienceCinema

    Beckman, Pete

    2017-12-11

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  9. Argonne's Magellan Cloud Computing Research Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, Pete

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  10. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    ...project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables... possibilities that the cloud enables is computation outsourcing, where the client can utilize any necessary computing resources for its computational task... Security considerations, however, stand in the way of harnessing the benefits of cloud computing to the fullest extent and prevent clients from...

  11. Distributed and grid computing projects with research focus in human health.

    PubMed

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus in human health. Eleven active projects, each with more than 2000 Processing Units (PUs), were found and are presented. The research focus for most of them is molecular biology, specifically on understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not always explicitly stated, common target diseases include HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer, and influenza. Other diseases include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration on larger-scale projects is stressed, to raise citizens' awareness of participation and create a culture of internet volunteering altruism.

  12. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing: performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  13. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of student projects are reviewed, together with the BCI Research Award 2014 winner's case. The BCI-LAB design and project-based teaching philosophy are also explained. The review concludes with a summary of future teaching and research directions.

  14. Summary of Research 1997, Department of Computer Science.

    DTIC Science & Technology

    1999-01-01

    ...contains summaries of research projects in the Department of Computer Science. A list of recent publications is also included, which consists of conference... parallel programming. Recently, in a joint research project between NPS and the Russian Academy of Sciences Systems Programming Institute in Moscow...

  15. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    ...for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI... implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer...

  16. 10 CFR 961.11 - Text of the contract.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...

  17. 10 CFR 961.11 - Text of the contract.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...

  18. New European Training Network to Improve Young Scientists' Capabilities in Computational Wave Propagation

    NASA Astrophysics Data System (ADS)

    Igel, Heiner

    2004-07-01

    The European Commission recently funded a Marie-Curie Research Training Network (MCRTN) in the field of computational seismology within the 6th Framework Program. SPICE (Seismic wave Propagation and Imaging in Complex media: a European network) is coordinated by the computational seismology group of the Ludwig-Maximilians-Universität in Munich linking 14 European research institutions in total. The 4-year project will provide funding for 14 Ph.D. students (3-year projects) and 14 postdoctoral positions (2-year projects) within the various fields of computational seismology. These positions have been advertised and are currently being filled.

  19. The AIST Managed Cloud Environment

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2016-12-01

    ESTO is currently in the process of developing and implementing the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to ESTO-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs will allow them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE will facilitate infusion and end-user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  20. The AMCE (AIST Managed Cloud Environment)

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2017-12-01

    ESTO has developed and implemented the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to SMD-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs allows them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE facilitates infusion and end-user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  1. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
    This quarterly report documents technical progress on the project tasks for the period of October through December 2011.

  2. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
    This quarterly report documents technical progress on the project tasks for the period of January through March 2012.

  3. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 1 quarter 3 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-08-26

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions.
    This quarterly report documents technical progress on the project tasks for the period of April through June 2011.

  4. A Cloud-based Infrastructure and Architecture for Environmental System Research

    NASA Astrophysics Data System (ADS)

    Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.

    2016-12-01

    The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provides a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community. It will provide unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data, and even data submission workflows in a straightforward fashion. The infrastructure will minimize large disruptions to current project-based data submission workflows for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. It will also eliminate scalability problems with current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.

  5. Japanese supercomputer technology.

    PubMed

    Buzbee, B L; Ewald, R H; Worlton, W J

    1982-12-17

    Under the auspices of the Ministry for International Trade and Industry the Japanese have launched a National Superspeed Computer Project intended to produce high-performance computers for scientific computation and a Fifth-Generation Computer Project intended to incorporate and exploit concepts of artificial intelligence. If these projects are successful, which appears likely, advanced economic and military research in the United States may become dependent on access to supercomputers of foreign manufacture.

  6. Alliance for Computational Science Collaboration, HBCU Partnership at Alabama A&M University Final Performance Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Z.T.

    2001-11-15

    The objective of this project was to conduct high-performance computing research and teaching at AAMU and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways to attract top-quality minority students. With DOE support, through research, summer internship, collaboration, and scholarship programs, AAMU successfully provided research and educational opportunities to minority students in fields related to computational science.

  7. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  8. Infrastructure for Training and Partnershipes: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment, in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB; 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software; 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media, in addition to a 622 Mb/s Internet 2 connection; 4) network switches capable of 100 Mbps to 128 desktop workstations; 5) the Portable Batch System (PBS) computational task scheduler; and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  9. Improved Computer-Aided Instruction by the Use of Interfaced Random-Access Audio-Visual Equipment. Report on Research Project No. P/24/1.

    ERIC Educational Resources Information Center

    Bryce, C. F. A.; Stewart, A. M.

    A brief review of the characteristics of computer assisted instruction and the attributes of audiovisual media introduces this report on a project designed to improve the effectiveness of computer assisted learning through the incorporation of audiovisual materials. A discussion of the implications of research findings on the design and layout of…

  10. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nuclei acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycylic aromatic hydrocarbon compounds.

  11. A review of small canned computer programs for survey research and demographic analysis.

    PubMed

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, and CENTS-AID II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); and MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manuals, computer requirements, and procedures for obtaining manuals and programs are provided. This information is intended to facilitate and encourage the use of the computer by data processors in developing countries.
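
    As a toy illustration of the kind of arithmetic the population-projection packages automate, a geometric projection takes only a few lines. The helper below is hypothetical and is not drawn from any of the reviewed programs:

```python
def project_population(p0, growth_rate, years):
    """Geometric population projection P(t) = P0 * (1 + r)**t, the
    simplest of the projection models such packages implement."""
    return [round(p0 * (1.0 + growth_rate) ** t) for t in range(years + 1)]

# a population of 1000 growing at 2% per year, projected two years ahead
projected = project_population(1000, 0.02, 2)   # [1000, 1020, 1040]
```

    Real cohort-component programs such as FIVFIV additionally track age structure, fertility, mortality, and migration, but the compounding step above is the core of every variant.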

  12. HPCCP/CAS Workshop Proceedings 1998

    NASA Technical Reports Server (NTRS)

    Mata, Ellen (Editor); Schulbach, Catherine (Editor)

    1999-01-01

    This publication is a collection of extended abstracts of presentations given at the HPCCP/CAS (High Performance Computing and Communications Program/Computational Aerosciences Project) Workshop held on August 24-26, 1998, at NASA Ames Research Center, Moffett Field, California. The objective of the Workshop was to bring together the aerospace high performance computing community, consisting of airframe and propulsion companies, independent software vendors, university researchers, and government scientists and engineers. The Workshop was sponsored by the HPCCP Office at NASA Ames Research Center. The Workshop consisted of over 40 presentations, including an overview of NASA's High Performance Computing and Communications Program and the Computational Aerosciences Project; ten sessions of papers representative of the high performance computing research conducted within the Program by the aerospace industry, academia, NASA, and other government laboratories; two panel sessions; and a special presentation by Mr. James Bailey.

  13. Computer-Mediated Communication and a Cautionary Tale of Two Cities

    ERIC Educational Resources Information Center

    Sadler, Randall

    2007-01-01

    This paper describes an action research project that investigated the pedagogical applicability of computer-mediated communication (CMC) tools for collaborative projects. The research involved two groups of students studying to become ESL/EFL teachers, one group at a university located in the US Midwest and the other in the Catalan region of…

  14. Computer-assisted map projection research

    USGS Publications Warehouse

    Snyder, John Parr

    1985-01-01

    Computers have opened up areas of map projection research that were previously too complicated to pursue, for example, least-squares fits to very large numbers of points. One application has been the efficient transfer of data between maps on different projections. While the transfer of moderate amounts of data is satisfactorily accomplished using the analytical map projection formulas, polynomials are more efficient for massive transfers. Suitable coefficients for the polynomials may be determined more easily for general cases using least squares instead of Taylor series. A second area of research is the determination of a map projection fitting an unlabeled map, so that accurate data transfer can take place. The computer can test one projection after another, and include iteration where required. A third area is the use of least squares to fit a map projection with optimum parameters to the region being mapped, so that distortion is minimized. This can be accomplished for standard conformal, equal-area, or other types of projections. Even less distortion can result if complex transformations of conformal projections are utilized. This bulletin describes several recent applications of these principles, as well as historical usage and background.
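
    The polynomial-transfer idea can be sketched briefly: fit bivariate polynomials by least squares to sample points whose coordinates are known in both projections, then evaluate the polynomials for the massive transfer. The function names and the degree-3 default below are our own assumptions, not Snyder's formulation:

```python
import numpy as np

def _monomials(xy, degree):
    """Bivariate monomial design matrix: columns x**i * y**j, i + j <= degree."""
    x, y = xy[:, 0], xy[:, 1]
    cols = [x**i * y**j for i in range(degree + 1)
                        for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def fit_transfer_polynomial(src, dst, degree=3):
    """Least-squares coefficients mapping one projection's (x, y)
    coordinates to another's -- a fast substitute for the exact
    projection formulas when transferring massive numbers of points."""
    coef, *_ = np.linalg.lstsq(_monomials(src, degree), dst, rcond=None)
    return coef

def apply_transfer(coef, xy, degree=3):
    """Evaluate the fitted polynomials at new points."""
    return _monomials(xy, degree) @ coef

# demo: recover a known affine map, a trivially easy target for the fit
rng = np.random.default_rng(1)
src = rng.uniform(-1.0, 1.0, size=(100, 2))
dst = np.column_stack([2.0 * src[:, 0] + 1.0, src[:, 1] - 3.0])
coef = fit_transfer_polynomial(src, dst)
```

    In practice the sample points would come from evaluating both projections' analytical formulas on a grid; the polynomial is then cheap to apply to millions of data points.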

  15. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risk of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.

  16. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of its greatest advantages for scientific research is that it removes the need to have access to a large cyberinfrastructure in order to fund or perform a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections, and the two are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.
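
    The cost trade-off the abstract analyzes can be illustrated with a back-of-envelope model. Every figure and the simple amortization scheme below are hypothetical, not numbers from the paper:

```python
def compute_costs(core_hours, cloud_rate, cluster_capex,
                  cluster_cores, life_years, utilization=0.7):
    """Back-of-envelope comparison: on-demand cloud cost vs. the
    amortized cost of the same core-hours on an owned cluster
    (all figures hypothetical)."""
    cloud = core_hours * cloud_rate
    # core-hours the cluster delivers over its life at the given utilization
    delivered = cluster_cores * 8760.0 * life_years * utilization
    owned = cluster_capex * core_hours / delivered
    return cloud, owned

# e.g. 1M core-hours at a notional $0.05/core-hour, vs. a $500k,
# 1000-core cluster amortized over five years
cloud_cost, owned_cost = compute_costs(1_000_000, 0.05, 500_000.0, 1000, 5)
```

    Real comparisons must also fold in staff, power, storage, egress fees, and per-instance performance differences; the point of such a model is only to make the amortization assumptions explicit.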

  17. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (third year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second aspect of the research project is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for visualization, analysis and archiving of flow data. We report on our work on the development of numerical methods for tangent curve computation first.
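
    The first aspect, computing tangent curves, amounts to integrating the vector field as an ordinary differential equation. The report does not specify its numerical method; the sketch below uses classical fourth-order Runge-Kutta as one common choice:

```python
import numpy as np

def tangent_curve(field, p0, h=0.01, steps=100):
    """Trace a tangent curve (streamline) of a vector field from p0 with
    classical fourth-order Runge-Kutta integration."""
    p = np.asarray(p0, dtype=float)
    curve = [p.copy()]
    for _ in range(steps):
        k1 = field(p)
        k2 = field(p + 0.5 * h * k1)
        k3 = field(p + 0.5 * h * k2)
        k4 = field(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        curve.append(p.copy())
    return np.array(curve)

# rotational field v = (-y, x): its tangent curves are circles about the origin
circle = tangent_curve(lambda p: np.array([-p[1], p[0]]), (1.0, 0.0))
```

    For flow data on curvilinear grids, `field` would interpolate the stored vector samples rather than evaluate a closed form, but the integration step is the same.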

  18. Developing Teachers' Computational Thinking Beliefs and Engineering Practices through Game Design and Robotics

    ERIC Educational Resources Information Center

    Leonard, Jacqueline; Barnes-Johnson, Joy; Mitchell, Monica; Unertl, Adrienne; Stubbe, Christopher R.; Ingraham, Latanya

    2017-01-01

    This research report presents the final year results of a three-year research project on computational thinking (CT). The project, funded by the National Science Foundation, involved training teachers in grades four through six to implement Scalable Game Design and LEGO® EV3 robotics during afterschool clubs. Thirty teachers and 531 students took…

  19. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  20. BNCI systems as a potential assistive technology: ethical issues and participatory research in the BrainAble project.

    PubMed

    Carmichael, Clare; Carmichael, Patrick

    2014-01-01

    This paper highlights aspects related to current research and thinking about ethical issues in relation to Brain Computer Interface (BCI) and Brain-Neuronal Computer Interfaces (BNCI) research through the experience of one particular project, BrainAble, which is exploring and developing the potential of these technologies to enable people with complex disabilities to control computers. It describes how ethical practice has been developed both within the multidisciplinary research team and with participants. The paper presents findings in which participants shared their views of the project prototypes, of the potential of BCI/BNCI systems as an assistive technology, and of their other possible applications. This draws attention to the importance of ethical practice in projects where high expectations of technologies, and representations of "ideal types" of disabled users may reinforce stereotypes or drown out participant "voices". Ethical frameworks for research and development in emergent areas such as BCI/BNCI systems should be based on broad notions of a "duty of care" while being sufficiently flexible that researchers can adapt project procedures according to participant needs. They need to be frequently revisited, not only in the light of experience, but also to ensure they reflect new research findings and ever more complex and powerful technologies.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  2. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  3. Investigating the Mechanism of Action and the Identification of Breast Carcinogens by Computational Analysis of Female Rodent Carcinogens

    DTIC Science & Technology

    2006-08-01

    preparing a COBRE Molecular Targets Project with a goal to extend the computational work of the Specific Aims of this project to the discovery of novel... million Center of Biomedical Research Excellence (COBRE) grant from the National Center for Research Resources at the National Institutes of Health... three-year COBRE-funded project in Molecular Targets. My recruitment to the University of Louisville's Brown Cancer Center and my proposed COBRE

  4. Bioconductor: open software development for computational biology and bioinformatics

    PubMed Central

    Gentleman, Robert C; Carey, Vincent J; Bates, Douglas M; Bolstad, Ben; Dettling, Marcel; Dudoit, Sandrine; Ellis, Byron; Gautier, Laurent; Ge, Yongchao; Gentry, Jeff; Hornik, Kurt; Hothorn, Torsten; Huber, Wolfgang; Iacus, Stefano; Irizarry, Rafael; Leisch, Friedrich; Li, Cheng; Maechler, Martin; Rossini, Anthony J; Sawitzki, Gunther; Smith, Colin; Smyth, Gordon; Tierney, Luke; Yang, Jean YH; Zhang, Jianhua

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples. PMID:15461798

  5. 75 FR 29818 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-27

    ... computers used with the particular project are available to authorized personnel only. Records on... Research and Development Project Records--VA'' (34VA12) as set forth in the Federal Register 40 FR 38095... include electronic or other databases containing research information developed during a research project...

  6. NASA high performance computing and communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1993-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects as well as summaries of individual research and development programs within each project.

  7. 78 FR 2919 - Proposed Priority-National Institute on Disability and Rehabilitation Research-Disability and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-15

    ... Rehabilitation Research--Disability and Rehabilitation Research Project--Inclusive Cloud and Web Computing CFDA... inclusive Cloud and Web computing. The Assistant Secretary may use this priority for competitions in fiscal... Priority for Inclusive Cloud and Web Computing'' in the subject line of your electronic message. FOR...

  8. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    NASA Astrophysics Data System (ADS)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article, the support of scientific projects throughout their lifecycle in a computer center is considered in all its aspects. The Configuration Management system plays a connecting role in the processes related to provisioning and supporting the services of a computer center. Given the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which places higher requirements on the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed, and the development of the corresponding elements of the system is described in the present paper.

  9. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This report covers the Project's third and final year of implementation, 2007-2008, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM), Inc., in an effort to promote research and research training programs in computational science and scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DOE) and the nation. The breadth and depth of computational science and scientific visualization, and the magnitude of resources available, are enormous, permitting a variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these enormous resources.

  10. Computer Information Project for Monographs at the Medical Research Library of Brooklyn

    PubMed Central

    Koch, Michael S.; Kovacs, Helen

    1973-01-01

    The article describes a resource library's computer-based project that provides cataloging and other bibliographic services and promotes greater use of the book collection. A few studies are cited to show the significance of monographic literature in medical libraries. The educational role of the Medical Research Library of Brooklyn is discussed, both with regard to the parent institution and to smaller medical libraries in the same geographic area. Types of aid given to smaller libraries are enumerated. Information is given on methods for providing machine-produced catalog cards, current awareness notes, and bibliographic lists. Actualities and potentialities of the computer project are discussed. PMID:4579767

  11. The World through Glass: Developing Novel Methods with Wearable Computing for Urban Videographic Research

    ERIC Educational Resources Information Center

    Paterson, Mark; Glass, Michael R.

    2015-01-01

    Google Glass was deployed in an Urban Studies field course to gather videographic data for team-based student research projects. We evaluate the potential for wearable computing technology such as Glass, in combination with other mobile computing devices, to enhance reflexive research skills, and videography in particular, during field research.…

  12. Increasing the trustworthiness of research results: the role of computers in qualitative text analysis

    Treesearch

    Lynne M. Westphal

    2000-01-01

    By using computer packages designed for qualitative data analysis, a researcher can increase the trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social…

  13. Storage and network bandwidth requirements through the year 2000 for the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen

    1996-01-01

    The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.
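
    One back-of-envelope check behind such capacity projections is whether a given network link can move an archive of a given size in acceptable time. The helper below is illustrative; the 60% sustained-efficiency default is our own assumption:

```python
def migration_days(archive_tb, link_mbps, efficiency=0.6):
    """Days needed to move an archive over a network link at a given
    sustained efficiency -- a quick feasibility check for mass-storage
    planning."""
    bits = archive_tb * 8e12                        # decimal TB -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400.0

# e.g. a 1 TB archive over a 155 Mb/s (OC-3-class) link
one_tb = migration_days(1, 155)
```

    At the default efficiency, the 1 TB example works out to roughly a day of sustained transfer, which is why archive growth and network bandwidth have to be projected together.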

  14. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault-tolerant systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  15. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    ERIC Educational Resources Information Center

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  16. Approaches for Measuring the Management Effectiveness of Software Projects

    DTIC Science & Technology

    2008-04-01

    John S. Osmundson, Research Assoc. Professor of... and Department of Computer Science, Dean of Research... caused otherwise good projects to grind to a halt." [RO]. Various other studies, researchers and practitioners report similar issues regarding the

  17. Alliance for Computational Science Collaboration: HBCU Partnership at Alabama A&M University Continuing High Performance Computing Research and Education at AAMU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Xiaoqing; Deng, Z. T.

    2009-11-10

    This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU has successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU: (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational-science-related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Lab (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for the electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students. As the Accreditation Board for Engineering and Technology (ABET) noted in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the proceedings of the 2007, 2008, and 2009 ASEE Annual Conferences, respectively, and were presented at those conferences. It is very critical to continue these research and education activities.

  18. The Computer Science Technical Report (CS-TR) Project: A Pioneering Digital Library Project Viewed from a Library Perspective.

    ERIC Educational Resources Information Center

    Anderson, Greg; And Others

    1996-01-01

    Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries which pioneered multiinstitutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)

  19. Project IMPACT Software Documentation: Overview of the Computer-Administered Instruction Subsystem.

    ERIC Educational Resources Information Center

    Stelzer, John; Garneau, Jean

    Research in Project IMPACT, prototypes of computerized training for Army personnel, is documented in an overview of the IMPACT computer software system for computer-administered instruction, exclusive of instructional software. The overview description provides a basis for an understanding of the rationale and motivation for the development of the…

  20. Reflections on Component Computing from the Boxer Project's Perspective

    ERIC Educational Resources Information Center

    diSessa, Andrea A.

    2004-01-01

    The Boxer Project conducted the research that led to the synthetic review "Issues in Component Computing." This brief essay provides a platform from which to develop our general perspective on educational computing and how it relates to components. The two most important lines of our thinking are (1) the goal to open technology's creative…

  1. The UK Human Genome Mapping Project online computing service.

    PubMed

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability can be obtained by approaching the UK HGMP-RC directly.

  2. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the architectural aspects of Grid Computing for the e-Science community, for scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high speed internet, safe authentication, and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require enormous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems, and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it helps realize the dream of a global village for the benefit of the e-Science community across the globe.

  3. Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community, as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  4. Spin-Off Successes of SETI Research at Berkeley

    NASA Astrophysics Data System (ADS)

    Douglas, K. A.; Anderson, D. P.; Bankay, R.; Chen, H.; Cobb, J.; Korpela, E. J.; Lebofsky, M.; Parsons, A.; von Korff, J.; Werthimer, D.

    2009-12-01

    Our group contributes to the Search for Extra-Terrestrial Intelligence (SETI) by developing and using world-class signal processing computers to analyze data collected on the Arecibo telescope. Although no patterned signal of extra-terrestrial origin has yet been detected, and the immediate prospects for making such a detection are highly uncertain, the SETI@home project has nonetheless proven the value of pursuing such research through its impact on the fields of distributed computing, real-time signal processing, and radio astronomy. The SETI@home project has spun off the Center for Astronomy Signal Processing and Electronics Research (CASPER) and the Berkeley Open Infrastructure for Networked Computing (BOINC), both of which are responsible for catalyzing a smorgasbord of new research in scientific disciplines in countries around the world. Furthermore, the data collected and archived for the SETI@home project are proving valuable in data-mining experiments for mapping neutral galactic hydrogen and for detecting black-hole evaporation.

  5. The CP-PACS Project and Lattice QCD Results

    NASA Astrophysics Data System (ADS)

    Iwasaki, Y.

    The aim of the CP-PACS project was to develop a massively parallel computer for numerical research in computational physics, with primary emphasis on lattice QCD. The CP-PACS computer, with a peak speed of 614 GFLOPS on 2048 processors, was completed in September 1996 and has been in full operation since October 1996. We present an overview of the CP-PACS project and describe the characteristics of the CP-PACS computer. The CP-PACS has been used mainly for hadron spectroscopy studies in lattice QCD. The main results of the lattice QCD simulations are given.

  6. Ten quick tips for machine learning in computational biology.

    PubMed

    Chicco, Davide

    2017-01-01

    Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and may therefore follow incorrect practices that lead to common mistakes or over-optimistic results. With this review, we present ten quick tips for taking advantage of machine learning in any computational biology context, avoiding some common errors that we have observed hundreds of times across multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner carry out a successful project in computational biology and related sciences.
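
    A recurring source of the over-optimistic results mentioned above is letting test data leak into training. As a hedged illustration (ours, not the paper's code), here is a toy 1-D nearest-centroid classifier evaluated with k-fold cross-validation in plain Python; note that the held-out fold never reaches the fitting step:

```python
# Illustrative sketch only: honest evaluation of a trivial classifier with
# k-fold cross-validation, using nothing beyond the standard library.
import random
from statistics import mean

def nearest_centroid_fit(xs, ys):
    """Return the per-class mean feature value (a trivial 1-D 'model')."""
    return {label: mean(x for x, y in zip(xs, ys) if y == label)
            for label in set(ys)}

def predict(centroids, x):
    """Assign x to the class whose centroid is closest."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

def cross_val_accuracy(xs, ys, k=5, seed=0):
    """Fit on k-1 folds, score only on the held-out fold, and average."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    scores = []
    for fold in folds:
        train = [i for i in idx if i not in fold]  # test fold excluded here
        model = nearest_centroid_fit([xs[i] for i in train],
                                     [ys[i] for i in train])
        scores.append(mean(predict(model, xs[i]) == ys[i] for i in fold))
    return mean(scores)

# Two well-separated 1-D classes: cross-validated accuracy is high.
xs = [0.1, 0.2, 0.3, 0.4, 0.5, 5.1, 5.2, 5.3, 5.4, 5.5]
ys = ["a"] * 5 + ["b"] * 5
print(cross_val_accuracy(xs, ys))
```

    Scoring the model on its own training points instead would report an equally high number even for a classifier that merely memorized noise; the held-out fold is what makes the estimate honest.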

  7. NASA High Performance Computing and Communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1994-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.

  8. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  9. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  10. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  11. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  12. Computers in the Classroom: The School of the Future, The Future of the School.

    ERIC Educational Resources Information Center

    Tapia, Ivan, Ed.

    1995-01-01

    Computer uses in the classroom is the theme topic of this journal issue. Contents include: "Emo Welzl: 1995 Leibniz Laureate" (Hartmut Wewetzer); "Learning to Read with the Aid of a Computer: Research Project with Children Starting School" (Horst Meermann); "The Multimedia School: The Comenius Pilot Project" (Tom Sperlich); "A Very Useful Piece of…

  13. SAGE as a Source for Undergraduate Research Projects

    ERIC Educational Resources Information Center

    Hutz, Benjamin

    2017-01-01

    This article examines the use of the computer algebra system SAGE for undergraduate student research projects. After reading this article, the reader should understand the benefits of using SAGE as a source of research projects and how to commence working with SAGE. The author proposes a tiered working group model to allow maximum benefit to the…
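
    SAGE (now SageMath) sessions are built on Python, so the flavor of an entry-level exploration can be shown in plain Python. The following is a hypothetical starter exercise of the kind a tiered working group might begin with, gathering computational evidence for Fermat's two-squares theorem (our example, not the article's):

```python
# Illustrative sketch in plain Python (Sage syntax is Python-based): probe
# experimentally which primes can be written as a sum of two squares.
from math import isqrt

def is_prime(n):
    """Trial division, adequate for small experiments."""
    if n < 2:
        return False
    return all(n % d for d in range(2, isqrt(n) + 1))

def sum_of_two_squares(n):
    """Return (a, b) with a*a + b*b == n, or None if no such pair exists."""
    for a in range(isqrt(n) + 1):
        b2 = n - a * a
        b = isqrt(b2)
        if b * b == b2:
            return (a, b)
    return None

# Collect evidence, as one would interactively in a Sage notebook; the
# pattern that emerges is 2 together with the primes p with p % 4 == 1.
hits = [p for p in range(2, 100) if is_prime(p) and sum_of_two_squares(p)]
print(hits)
```

    The point of such an exercise is that students can generate and inspect data before any proof is attempted, which is precisely the workflow a computer algebra system supports.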

  14. A Review of New Brunswick's Dedicated Notebook Research Project: One-to-One Computing--A Compelling Classroom-Change Intervention

    ERIC Educational Resources Information Center

    Milton, Penny

    2008-01-01

    The Canadian Education Association (CEA) was commissioned by Hewlett-Packard Canada to create a case study describing the development, implementation and outcomes of New Brunswick's Dedicated Notebook Research Project. The New Brunswick Department of Education designed its research project to assess impacts on teaching and learning of dedicated…

  15. Projecting Grammatical Features in Nominals: Cognitive Processing Theory & Computational Implementation

    DTIC Science & Technology

    2010-03-01

    functionality and plausibility distinguishes this research from most research in computational linguistics and computational psycholinguistics . The... Psycholinguistic Theory There is extensive psycholinguistic evidence that human language processing is essentially incremental and interactive...challenges of psycholinguistic research is to explain how humans can process language effortlessly and accurately given the complexity and ambiguity that is

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  17. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, the specification of a graphical language for expressing distributed computations, and the specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  18. Innovative architectures for dense multi-microprocessor computers

    NASA Technical Reports Server (NTRS)

    Donaldson, Thomas; Doty, Karl; Engle, Steven W.; Larson, Robert E.; O'Reilly, John G.

    1988-01-01

    The results of a Phase I Small Business Innovative Research (SBIR) project performed for the NASA Langley Computational Structural Mechanics Group are described. The project resulted in the identification of a family of chordal-ring interconnection architectures with excellent potential to serve as the basis for new multimicroprocessor (MMP) computers. The paper presents examples of how computational algorithms from structural mechanics can be efficiently implemented on the chordal-ring architecture.

  19. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN IV subroutine package running on a PDP 11/40 minicomputer as the primary computation processor, with the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  20. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  1. Financial Analysis for R&D Decisions.

    ERIC Educational Resources Information Center

    Carter, Robert

    1997-01-01

    Using personal computer spreadsheet software, standard corporate financial analysis can help university research administrators communicate the value of research and development to sponsors and other stakeholders; balance projects, technologies, or categories of research; and continually assess the value of investing in ongoing projects. It also…

  2. The School in Its Relations with the Community. Research Projects EUDISED 1975-1977.

    ERIC Educational Resources Information Center

    Documentation Centre for Education in Europe, Strasbourg (France).

    The document presents abstracts of 40 research projects dealing with the relationship between school and community in Europe. These have been compiled by the European Documentation and Information System for Education (EUDISED). The aim of the EUDISED project is to create a computer-based network of national agencies dealing with…

  3. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  4. 78 FR 42976 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Heterogeneous...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... Computer Science and Engineering, Seoul, REPUBLIC OF KOREA; Missouri University of Science and Technology, Rolla, MO; Industrial Technology Research Institute of Taiwan, Chutung, Hsinchu, TAIWAN, Northeastern... activity of the group research project. Membership in this group research project remains open, and HSA...

  5. Systems and Software Producibility Collaboration and Experimental Environment (SPRUCE)

    DTIC Science & Technology

    2009-04-23

    Research Manhattan Project Like Research – Transition timeframe needed • Current generation programs – DoD acquisitions over next 1-5 years • Next...Specific Computing Plant B a s i c Transformational Research Manhattan Project Like Research B a s i c 16 • Sponsored by Lockheed Martin

  6. Problems and Prospects in Foreign Language Computing.

    ERIC Educational Resources Information Center

    Pusack, James P.

    The problems and prospects of the field of foreign language computing are profiled through a survey of typical implementation, development, and research projects that language teachers may undertake. Basic concepts in instructional design, hardware, and software are first clarified. Implementation projects involving courseware evaluation, textbook…

  7. Computers in Schools: Implementing for Sustainability. Why the Truth Is Rarely Pure and Never Simple

    ERIC Educational Resources Information Center

    Thomas, H.; Cronje, J.

    2007-01-01

    This study investigates influences on the sustainability of a computers-in-schools project during the implementation phase thereof. The Computer Assisted Learning in Schools (CALIS) Project (1992-1996) is the unit of analysis. A qualitative case study research design is used to elicit data, in the form of participant narratives, from people who…

  8. Summary of Research Report

    NASA Technical Reports Server (NTRS)

    Long, Lyle N.

    1999-01-01

    This report describes a project to predict ducted fan noise using massively parallel computers. The investigators are part of a larger team of researchers, most of whom are working at NASA Langley Research Center. The portion of the project described below not only stands alone as an individual research project, it also complements the NASA Langley work. The write-up included in this report is relatively brief, since the details are described in technical papers.

  9. Overview of the Cranked-Arrow Wing Aerodynamics Project International

    NASA Technical Reports Server (NTRS)

    Obara, Clifford J.; Lamar, John E.

    2008-01-01

    This paper provides a brief history of the F-16XL-1 aircraft, its role in the High Speed Research program, and how it was morphed into the Cranked Arrow Wing Aerodynamics Project. Various flight, wind-tunnel, and Computational Fluid Dynamics data sets were generated as part of the project. These unique and open flight datasets for surface pressures, boundary-layer profiles, and skin-friction distributions, along with surface flow data, are described and sample data comparisons given. This is followed by a description of how the project became internationalized, to be known as Cranked Arrow Wing Aerodynamics Project International, and concludes with an introduction to the results of a four-year computational predictive study of data collected at flight conditions by participating researchers.

  10. Student Research in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Blondin, J. M.

    1999-12-01

    Computational physics can shorten the long road from freshman physics major to independent research by providing students with powerful tools to deal with the complexities of modern research problems. At North Carolina State University we have introduced dozens of students to astrophysics research using the tools of computational fluid dynamics. We have used several formats for working with students, including the traditional approach of one-on-one mentoring, a more group-oriented format in which several students work together on one or more related projects, and a novel attempt to involve an entire class in a coordinated semester research project. The advantages and disadvantages of these formats will be discussed at length, but the single most important influence has been peer support. Having students work in teams or learn the tools of research together but tackle different problems has led to more positive experiences than a lone student diving into solo research. This work is supported by an NSF CAREER Award.

  11. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system.

    PubMed

    Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by needing to accurately and efficiently collect and store multiple streams of the data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.
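
    The central engineering problem described here, capturing concurrent streams of validated, timestamped, individual-keyed observations on a portable device, can be sketched generically. A hypothetical minimal version follows; the field names and the line-per-record JSON storage format are our assumptions for illustration, not Mongoose 2000's actual schema:

```python
# Hypothetical sketch of field-data capture: each observation is validated at
# entry time, stamped, and appended to durable storage for later syncing.
import json
from datetime import datetime, timezone

REQUIRED = {"individual_id", "stream", "value"}

def make_record(individual_id, stream, value, when=None):
    """Build one observation record; reject incomplete data at capture time."""
    record = {"individual_id": individual_id, "stream": stream, "value": value}
    missing = sorted(k for k in REQUIRED if record[k] in (None, ""))
    if missing:
        raise ValueError(f"missing fields: {missing}")
    record["timestamp"] = (when or datetime.now(timezone.utc)).isoformat()
    return record

def append_record(path, record):
    """Append-only storage: one JSON object per line, easy to merge later."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

rec = make_record("BM-042", "weight_g", 1480,
                  when=datetime(2018, 1, 15, 7, 30, tzinfo=timezone.utc))
print(rec["timestamp"])
```

    Validating at the moment of capture, rather than months later at analysis time, is what lets a long-term project keep multiple concurrent data streams accurate and consistent.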

  12. Research in nonlinear structural and solid mechanics

    NASA Technical Reports Server (NTRS)

    Mccomb, H. G., Jr. (Compiler); Noor, A. K. (Compiler)

    1981-01-01

    Recent and projected advances in applied mechanics, numerical analysis, computer hardware and engineering software, and their impact on modeling and solution techniques in nonlinear structural and solid mechanics are discussed. The fields covered are rapidly changing and are strongly impacted by current and projected advances in computer hardware. To foster effective development of the technology, perceptions of computing systems and nonlinear analysis software systems are presented.

  13. Composite structural materials

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1981-01-01

    The composite aircraft program component (CAPCOMP) is a graduate level project conducted in parallel with a composite structures program. The composite aircraft program glider (CAPGLIDE) is an undergraduate demonstration project which has as its objectives the design, fabrication, and testing of a foot-launched ultralight glider using composite structures. The objective of the computer aided design (COMPAD) portion of the composites project is to provide computer tools for the analysis and design of composite structures. The major thrust of COMPAD is in the finite element area, with effort directed at implementing finite element analysis capabilities and developing interactive graphics preprocessing and postprocessing capabilities. The criteria for selecting research projects to be conducted under the innovative and supporting research (INSURE) program are described.

  14. A Computer Supported Teamwork Project for People with a Visual Impairment.

    ERIC Educational Resources Information Center

    Hale, Greg

    2000-01-01

    Discussion of the use of computer supported teamwork (CSTW) in team-based organizations focuses on problems that visually impaired people have reading graphical user interface software via screen reader software. Describes a project that successfully used email for CSTW, and suggests issues needing further research. (LRW)

  15. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  16. IPAD: A unique approach to government/industry cooperation for technology development and transfer

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.; Salley, George C.

    1985-01-01

    A key element of improved industry productivity is effective management of Computer Aided Design / Computer Aided Manufacturing (CAD/CAM) information. To stimulate advancement, a unique joint government/industry project designated Integrated Programs for Aerospace-Vehicle Design (IPAD) was carried out from 1971 to 1984. The goal was to raise aerospace industry productivity through advancement of computer-based technology to integrate and manage the information involved in the design and manufacturing process. IPAD research was guided by an Industry Technical Advisory Board (ITAB) composed of over 100 representatives from aerospace and computer companies. The project complemented traditional NASA/DOD research to develop aerospace design technology and the Air Force's Integrated Computer Aided Manufacturing (ICAM) program to advance CAM technology. IPAD had unprecedented industry support and involvement and served as a unique approach to government/industry cooperation in the development and transfer of advanced technology. The IPAD project background, approach, accomplishments, industry involvement, technology transfer mechanisms, and lessons learned are summarized.

  17. SpaceScience@Home: Authentic Research Projects that Use Citizen Scientists

    NASA Astrophysics Data System (ADS)

    Méndez, B. J. H.

    2008-06-01

    In recent years, several space science research projects have enlisted the help of large numbers of non-professional volunteers, "citizen scientists", to aid in performing tasks that are critical to a project but require more person-time (or computing time) than a small professional research team can practically provide. Examples of such projects include SETI@home, which uses time on volunteers' computers to process radio-telescope observations looking for signals originating from extra-terrestrial intelligences; Clickworkers, which asks volunteers to review images of the surface of Mars to identify craters; Spacewatch, which used volunteers to review astronomical telescopic images of the sky to identify streaks made by possible Near Earth Asteroids; and Stardust@home, which asks volunteers to review "focus movies" of the Stardust interstellar dust aerogel collector to search for possible impacts from interstellar dust particles. We describe these and other similar projects and discuss lessons learned from carrying them out, including the educational opportunities they create.

  18. The Sunrise project: An R&D project for a national information infrastructure prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Juhnyoung

    1995-02-01

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) to develop common information-enabling tools for advanced scientific research and its applications to industry; (2) to enhance the capabilities of important research programs at the Laboratory; and (3) to define a new way of collaboration between computer science and industrially relevant research.

  19. Use of PL/1 in a Bibliographic Information Retrieval System.

    ERIC Educational Resources Information Center

    Schipma, Peter B.; And Others

    The Information Sciences section of IIT Research Institute (IITRI) has developed a Computer Search Center and is currently conducting a research project to explore computer searching of a variety of machine-readable data bases. The Center provides Selective Dissemination of Information services to academic, industrial and research organizations…

  20. Enabling BOINC in infrastructure as a service cloud system

    NASA Astrophysics Data System (ADS)

    Montes, Diego; Añel, Juan A.; Pena, Tomás F.; Uhe, Peter; Wallom, David C. H.

    2017-02-01

    Volunteer or crowd computing is becoming increasingly popular for solving complex research problems from an increasingly diverse range of areas. The majority of these projects have been built using the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which provides a range of services to manage all computational aspects of a project. The BOINC system is ideal in those cases where the research community involved not only needs low-cost access to massive computing resources but where there is also significant public interest in the research being done. We discuss the ways in which cloud services can help BOINC-based projects deliver results in a fast, on-demand manner. This is difficult to achieve using volunteers alone, and at the same time, using scalable cloud resources for short on-demand projects can optimize the use of the available resources. We show how this design can be used as an efficient distributed computing platform within the cloud, and outline new approaches that could open up new possibilities in this field, using Climateprediction.net (http://www.climateprediction.net/) as a case study.
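The elastic use of cloud resources alongside volunteers that this entry describes can be illustrated with a small scheduling sketch. The function and its parameters are hypothetical illustrations, not part of the BOINC API; a real deployment would make this decision from project-server statistics and IaaS tooling.

```python
import math

def workers_needed(pending_units, secs_per_unit, deadline_secs, volunteer_rate):
    """Estimate how many on-demand cloud workers must supplement the
    volunteer fleet to finish pending_units before the deadline.
    volunteer_rate is the volunteer fleet's completion rate in units/sec."""
    required_rate = pending_units / deadline_secs   # units/sec needed overall
    shortfall = required_rate - volunteer_rate      # rate the cloud must cover
    if shortfall <= 0:
        return 0                                    # volunteers alone suffice
    per_worker_rate = 1.0 / secs_per_unit           # one worker, one unit at a time
    return math.ceil(shortfall / per_worker_rate)

# 1000 one-hour work units due in 24 h, with volunteers completing
# 0.005 units/sec: 24 extra cloud workers are needed.
print(workers_needed(1000, 3600, 86400, 0.005))  # 24
```

The point of the sketch is the trade-off the abstract names: cloud workers are provisioned only for the shortfall, so short on-demand campaigns do not idle resources that volunteers would otherwise cover.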

  1. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  2. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC. Quarterly report January through March 2011. Year 1 Quarter 2 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S. A.; Kulak, R. F.; Bojanowski, C.

    2011-05-19

    This project was established under a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at the Turner-Fairbank Highway Research Center for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risk of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport onto bridge girders to address the suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of January through March 2011.

  3. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
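The data-parallel style of work such clusters host can be shown in miniature with Python's standard library. A real Beowulf application would distribute tiles across nodes with MPI rather than processes on one machine, and the terrain kernel here is purely illustrative:

```python
from multiprocessing import Pool

def classify_cell(elevation_m):
    """Toy per-cell kernel standing in for a geographic-science
    computation, e.g. classifying raster cells by elevation."""
    return "upland" if elevation_m >= 500 else "lowland"

if __name__ == "__main__":
    tile = [120, 640, 505, 88, 930]        # elevations for one raster tile
    with Pool(processes=4) as pool:        # worker processes ~ cluster nodes
        labels = pool.map(classify_cell, tile)
    print(labels)  # ['lowland', 'upland', 'upland', 'lowland', 'upland']
```

Because each cell is independent, the map scales out naturally; on a cluster the same pattern appears as scatter/compute/gather across nodes.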

  4. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  5. Linguistic analysis of project ownership for undergraduate research experiences.

    PubMed

    Hanauer, D I; Frederick, J; Fotinakes, B; Strobel, S A

    2012-01-01

    We used computational linguistic and content analyses to explore the concept of project ownership for undergraduate research. We used linguistic analysis of student interview data to develop a quantitative methodology for assessing project ownership and applied this method to measure degrees of project ownership expressed by students in relation to different types of educational research experiences. The results of the study suggest that the design of a research experience significantly influences the degree of project ownership expressed by students when they describe those experiences. The analysis identified both positive and negative aspects of project ownership and provided a working definition for how a student experiences his or her research opportunity. These elements suggest several features that could be incorporated into an undergraduate research experience to foster a student's sense of project ownership.

  6. Overview of the NASA Dryden Flight Research Facility aeronautical flight projects

    NASA Technical Reports Server (NTRS)

    Meyer, Robert R., Jr.

    1992-01-01

    Several principal aerodynamics flight projects of the NASA Dryden Flight Research Facility are discussed. Key vehicle technology areas from a wide range of flight vehicles are highlighted. These areas include flight research data obtained for ground facility and computation correlation, applied research in areas not well suited to ground facilities (wind tunnels), and concept demonstration.

  7. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  8. Development of Distributed Research Center for monitoring and projecting regional climatic and environmental changes: first results

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Shiklomanov, Alexander; Okladinikov, Igor; Prusevich, Alex; Titov, Alexander

    2016-04-01

    Description and first results are reported of the cooperative project "Development of Distributed Research Center for monitoring and projecting of regional climatic and environmental changes", recently started by SCERT IMCES and ESRC UNH. The project is aimed at developing a prototype hardware and software platform for a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes over areas of mutual interest, and at demonstrating the benefits of collaboration that complements skills and regional knowledge across the northern extratropics. In the framework of the project, innovative approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platforms of two leading U.S. and Russian institutions involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of thematic international virtual research centers focused on interdisciplinary environmental studies by international research teams. The DRC under development will combine the best features and functionality of the information-computational systems RIMS (http://rims.unh.edu) and CLIMATE (http://climate.scert.ru/), developed earlier by the cooperating teams and widely used in studies of the Northern Eurasia environment. The project includes several major directions of research (tasks), listed below. 1. Development of the architecture and definition of the major hardware and software components of the DRC for monitoring and projecting regional environmental changes. 2. Development of an information database and computing software suite for distributed processing and analysis of large geospatial data hosted at ESRC and IMCES SB RAS. 3. Development of a geoportal, thematic web client, and web services providing international research teams with access to "cloud" computing resources at the DRC; two options will be executed: access through a basic graphical web browser and through geographic information systems (GIS). 4. Using the output of the first three tasks, compilation of the DRC prototype, its validation, and testing of the DRC's feasibility for analyses of recent regional environmental changes over Northern Eurasia and North America. Results of the first stage of the project implementation are presented. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement № 14.613.21.0037.

  9. Spacecraft crew procedures from paper to computers

    NASA Technical Reports Server (NTRS)

    Oneal, Michael; Manahan, Meera

    1991-01-01

    Described here is a research project that uses human factors and computer systems knowledge to explore and help guide the design and creation of an effective Human-Computer Interface (HCI) for spacecraft crew procedures. By putting a computer system behind the user interface, it is possible to provide increased procedure automation, related system monitoring, and personalized annotation and help facilities. The research project includes the development of computer-based procedure system HCI prototypes and a testbed for experiments that measure the effectiveness of HCI alternatives in order to make design recommendations. The testbed will include a system for procedure authoring, editing, training, and execution. Progress on developing HCI prototypes for a middeck experiment performed on Space Shuttle Mission STS-34 and for upcoming medical experiments is discussed, as is the status of the experimental testbed.

  10. Research-oriented teaching in optical design course and its function in education

    NASA Astrophysics Data System (ADS)

    Cen, Zhaofeng; Li, Xiaotong; Liu, Xiangdong; Deng, Shitao

    2008-03-01

    The principles and operation of research-oriented teaching in a course on computer-aided optical design are presented, with emphasis on the mode of research used in the practice course. The program includes a contract definition phase, project organization and execution, and post-project evaluation and discussion. Modes of academic organization are used in the practice portion of the course: students complete their design projects in research teams through an autonomous group approach and cooperative exploration. In this research process they experience the interpersonal relationships of modern society, the importance of cooperation within a team, the functions of each individual, the relationships between team members, and the competition and cooperation within one academic group and with other groups, and come to know themselves objectively. The design practice draws on knowledge from many academic fields, including applied optics, computer programming, and engineering software. This interdisciplinary character is very useful for academic research and prepares students for innovation by integrating knowledge across fields. Practice has shown that this teaching mode plays an important role in developing students' abilities in engineering, cooperation, high-level assimilation of knowledge, and problem analysis and solving.

  11. Longitudinal Evaluation of the Computer Assisted Instruction, Title I Project, 1979-82.

    ERIC Educational Resources Information Center

    Lavin, Richard J.; Sanders, Jean E.

    The Computer-Assisted Instruction (CAI) Project is an alternative, supplementary approach to providing reading, mathematics, and language arts instruction in schools in six northeastern Massachusetts communities. The CAI activities are provided as a supplement to instruction in Title I/Chapter I programs. Beginning in 1979, a 3-year research study…

  12. Classroom Talk and Computational Thinking

    ERIC Educational Resources Information Center

    Jenkins, Craig W.

    2017-01-01

    This paper is part of a wider action research project taking place at a secondary school in South Wales, UK. The overarching aim of the project is to examine the potential for aspects of literacy and computational thinking to be developed using extensible 'build your own block' programming activities. This paper examines classroom talk at an…

  13. The Lower Manhattan Project: A New Approach to Computer-Assisted Learning in History Classrooms.

    ERIC Educational Resources Information Center

    Crozier, William; Gaffield, Chad

    1990-01-01

    The Lower Manhattan Project, a computer-assisted undergraduate course in U.S. history, enhances student appreciation of the historical process through research and writing. Focuses on the late nineteenth and early twentieth centuries emphasizing massive immigration, rapid industrialization, and the growth of cities. Includes a reading list and…

  14. The Application of Statistics Education Research in My Classroom

    ERIC Educational Resources Information Center

    Jordan, Joy

    2007-01-01

    A collaborative, statistics education research project (Lovett, 2001) is discussed. Some results of the project were applied in the computer lab sessions of my elementary statistics course. I detail the process of applying these research results, as well as the use of knowledge surveys. Furthermore, I give general suggestions to teachers who want…

  15. 76 FR 43347 - Notice Pursuant to the National Cooperative Research and Production Act of 1993; Network Centric...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... circumstances. Specifically, Wakelight Technologies, Inc., Honolulu, HI; LinQuest Corporation, Los Angeles, CA; and Computer Sciences Corporation, Rockville, MD, have withdrawn as parties to this venture. In... activity of the group research project. Membership in this group research project remains open, and NCOIC...

  16. 76 FR 20010 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-DVD Copy Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... Computer Corp., Taipei, TAIWAN; Dongguan ChuDong Electronic Technology Co., Ltd., Dongguan City, Guangdong... have been made in either the membership or planned activity of the group research project. Membership in this group research project remains open, and DVD CCA intends to file additional written...

  17. Using the Microcomputer for Advertising Research Presentations.

    ERIC Educational Resources Information Center

    Larkin, Ernest F.

    A midwestern university is testing a program that uses the Apple II computer to help students in an advertising research course develop their skills in preparing and presenting research reports using computer generated graphics for both oral and written presentations. One of the course requirements is the preparation of a final project, including…

  18. Carbon dioxide in the atmosphere. [and other research projects

    NASA Technical Reports Server (NTRS)

    Johnson, F. S.

    1974-01-01

    Research projects for the period ending September 15, 1973 are reported as follows: (1) the abundances of carbon dioxide in the atmosphere, and the processes by which it is released from carbonate deposits in the earth and then transferred to organic material by photosynthesis; the pathways for movement of carbon and oxygen through the atmosphere; (2) space science computation assistance by PDP computer; the performance characteristics and user instances; (3) OGO-6 data analysis studies of the variations of nighttime ion temperature in the upper atmosphere.

  19. Marc Henry de Frahan | NREL

    Science.gov Websites

    Computing Project, Marc develops high-fidelity turbulence models to enhance simulation accuracy, and efficient numerical algorithms for future high-performance computing hardware architectures. Research interests: high-performance computing; high-order numerical methods for computational fluid dynamics; fluid…

  20. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply affects human wellbeing. Significant resources have been invested in building supercomputers capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking mediums for effectively educating the public about climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a climate model, which scientists use to predict climate change. Traditionally, only supercomputers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data-processing tasks, the investment in new supercomputers, the energy they consume, and the carbon they release are all reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported; the end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Climate models are defined on the server side.
Climate scientists configure model parameters through the portal user interface. After model configuration, scientists launch the computing task. Next, data are atomized and distributed to the computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the portal's visualization modules. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built as a proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects of the project. A Facebook account has been set up to distribute messages via the most popular social networking platform, and new threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as the forums, blogs, and computing resource monitoring.
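The server-side workflow this abstract describes (a model run is atomized into small work units, and per-unit results are merged back for the scientists) might look like the following in outline. The function names and the result format are hypothetical, not taken from the actual Climate@Home platform:

```python
def atomize(parameter_grid, chunk_size):
    """Split a model parameter sweep into work units small enough
    for a volunteer PC's computing engine to process."""
    return [parameter_grid[i:i + chunk_size]
            for i in range(0, len(parameter_grid), chunk_size)]

def collect(unit_results):
    """Merge per-unit results; here each result is a dict mapping
    a parameter setting to a simulated output value."""
    merged = {}
    for result in unit_results:
        merged.update(result)
    return merged

# Usage: a 6-point CO2 sweep atomized into units of at most 4 points.
sweep = [("co2", ppm) for ppm in (350, 400, 450, 500, 550, 600)]
units = atomize(sweep, 4)
print(len(units))  # 2 work units: four points, then the remaining two
```

Keeping work units small is what lets heterogeneous, intermittently available volunteer machines contribute; the merge step is where the platform reassembles a coherent model result.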

  1. Internet: road to heaven or hell for the clinical laboratory?

    PubMed

    Chou, D

    1996-05-01

    The Internet started as a research project by the Department of Defense Advanced Research Projects Agency for networking computers. Ironically, the networking project now predominantly supports human rather than computer communications. The Internet's growth, estimated at 20% per month, has been fueled by the commercial and public perception that it will become an important medium for merchandising, marketing, and advertising. For the clinical laboratory, the Internet provides high-speed communications through e-mail and allows the retrieval of important information held in repositories. All this capability comes at a price, including the need to manage a complex technology and the risk of intrusions on patient privacy.

  2. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk; Maynard, Robert

    The XVis project brings together the key elements of research needed to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes both in the software applications used in scientific computation and in the ways scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from the predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.

  3. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations, including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in the computational performance of the K computer, we could greatly increase the number of MJO events simulated numerically, in addition to extending the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  4. University of Tennessee Center for Space Transportation and Applied Research (CSTAR)

    NASA Astrophysics Data System (ADS)

    1995-10-01

    The Center for Space Transportation and Applied Research had projects with space applications in six major areas: laser materials processing, artificial intelligence/expert systems, space transportation, computational methods, chemical propulsion, and electric propulsion. The closeout status of all these projects is addressed.

  5. University of Tennessee Center for Space Transportation and Applied Research (CSTAR)

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Center for Space Transportation and Applied Research had projects with space applications in six major areas: laser materials processing, artificial intelligence/expert systems, space transportation, computational methods, chemical propulsion, and electric propulsion. The closeout status of all these projects is addressed.

  6. Network and computing infrastructure for scientific applications in Georgia

    NASA Astrophysics Data System (ADS)

    Kvatadze, R.; Modebadze, Z.

    2016-09-01

    The status of the network and computing infrastructure and the services available to the research and education community of Georgia is presented. The Research and Educational Networking Association (GRENA) provides the following network services: Internet connectivity, network services, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid Infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA is participating.

  7. Transportation Research and Analysis Computing Center (TRACC) Year 6 Quarter 4 Progress Report

    DOT National Transportation Integrated Search

    2013-03-01

    Argonne National Laboratory initiated a FY2006-FY2009 multi-year program with the US Department of Transportation (USDOT) on October 1, 2006, to establish the Transportation Research and Analysis Computing Center (TRACC). As part of the TRACC project...

  8. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system

    PubMed Central

    Griffiths, David J.; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G. F.; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L.; Thompson, Faye J.; Vitikainen, Emma I. K.; Cant, Michael A.

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by the need to accurately and efficiently collect and store multiple streams of data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets), and of the applications that run on them, has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed, called Mongoose 2000, that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available, and we detail how it might be used to aid data collection and storage in other long-term individual-based projects. PMID:29315317

  9. Project JOVE. [microgravity experiments and applications

    NASA Technical Reports Server (NTRS)

    Lyell, M. J.

    1994-01-01

    The goal of this project is to investigate new areas of research pertaining to free-surface and interface fluid mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD) and discusses some applications. Computational methods for solving free-surface flow problems are also presented in detail. Both have diverse applications in industry and in microgravity fluids research. Three different modeling schemes for FHD flows are addressed, and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free-surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free-surface flows is elucidated; in particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed, and copies of the publications produced under the JOVE Project are included.

  10. Computer Ratio and Student Achievement in Reading and Math in a North Carolina School District

    ERIC Educational Resources Information Center

    Preswood, Erica

    2017-01-01

    This longitudinal research project explored the relationship between a 1:1 computing initiative and student achievement on the North Carolina End of Grade Reading Comprehension and Math tests in the study school district. The purpose of this research study was to determine if the implementation of a 1:1 computing initiative impacted student…

  11. Terrestrial implications of mathematical modeling developed for space biomedical research

    NASA Technical Reports Server (NTRS)

    Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini

    1988-01-01

    This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.

  12. NASA/Army Rotorcraft Transmission Research, a Review of Recent Significant Accomplishments

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    1994-01-01

    A joint helicopter transmission research program between NASA Lewis Research Center and the U.S. Army Research Lab has existed since 1970. Research goals are to reduce weight and noise while increasing life, reliability, and safety. These research goals are achieved by the NASA/Army Mechanical Systems Technology Branch through both in-house research and cooperative research projects with university and industry partners. Some recent significant technical accomplishments produced by this cooperative research are reviewed. The following research projects are reviewed: oil-off survivability of tapered roller bearings, design and evaluation of high contact ratio gearing, finite element analysis of spiral bevel gears, computer numerical control grinding of spiral bevel gears, gear dynamics code validation, computer program for life and reliability of helicopter transmissions, planetary gear train efficiency study, and the Advanced Rotorcraft Transmission (ART) program.

  13. Pervasive healthcare as a scientific discipline.

    PubMed

    Bardram, J E

    2008-01-01

    The OECD countries are facing a set of core challenges: an increasing elderly population; an increasing number of chronic and lifestyle-related diseases; an expanding scope of what medicine can do; and an increasing shortage of medical professionals. Pervasive healthcare asks how pervasive computing technology can be designed to meet these challenges. The objective of this paper is to discuss 'pervasive healthcare' as a research field and to establish how novel and distinct it is compared to related work within biomedical engineering, medical informatics, and ubiquitous computing. The paper presents the research questions, approach, technologies, and methods of pervasive healthcare and discusses these in comparison to those of other related scientific disciplines. A set of central research themes is presented: monitoring and body sensor networks; pervasive assistive technologies; pervasive computing for hospitals; and preventive and persuasive technologies. Two projects illustrate the kind of research being done in pervasive healthcare. The first is targeted at home-based monitoring of hypertension; the second is designing context-aware technologies for hospitals. Both projects approach the healthcare challenges in a new way, apply a new type of research method, and arrive at new kinds of technological solutions. 'Clinical proof-of-concept' is recommended as a new method for pervasive healthcare research; it helps design and test pervasive healthcare technologies and ascertain their clinical potential before large-scale clinical tests are needed. The paper concludes that pervasive healthcare as a research field and agenda is novel: it addresses new emerging research questions, represents a novel approach, designs new types of technologies, and applies a new kind of research method.

  14. DOE EPSCoR Initiative in Structural and computational Biology/Bioinformatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, Susan S.

    2008-02-21

    The overall goal of the DOE EPSCoR Initiative in Structural and Computational Biology was to enhance the competitiveness of Vermont research in these scientific areas. To develop self-sustaining infrastructure, we increased the critical mass of faculty, developed shared resources that made junior researchers more competitive for federal research grants, implemented programs to train graduate and undergraduate students who participated in these research areas, and provided seed money for research projects. During the time period funded by this DOE initiative: (1) four new faculty were recruited to the University of Vermont using DOE resources, three in Computational Biology and one in Structural Biology; (2) technical support was provided for the Computational and Structural Biology facilities; (3) twenty-two graduate students were directly funded by fellowships; (4) fifteen undergraduate students were supported during the summer; and (5) twenty-eight pilot projects were supported. Taken together, these funds resulted in a plethora of published papers, many in high-profile journals in the fields, and directly impacted competitive extramural funding based on structural or computational biology, resulting in 49 million dollars awarded in grants (Appendix I), a 600% return on the investment by DOE, the State, and the University.

  15. Technology in the Service of Creativity: Computer Assisted Writing Project--Stetson Middle School, Philadelphia, Pennsylvania. Final Report.

    ERIC Educational Resources Information Center

    Bender, Evelyn

    The American Library Association's Carroll Preston Baber Research Award supported this project on the use, impact and feasibility of a computer assisted writing facility located in the library of Stetson Middle School in Philadelphia, an inner city school with a population of minority, "at risk" students. The writing facility consisted…

  16. Un projet de logiciels d'assistance a l'apprentissage de la lecture en FLE (An Interdisciplinary Research Project Oriented toward Computer Programs for Reading Instruction in French as a Second Language).

    ERIC Educational Resources Information Center

    Challe, Odile; And Others

    1985-01-01

    Describes a French project entitled "Lecticiel," jointly undertaken by specialists in reading, computer programming, and second language instruction to integrate these disciplines and provide assistance for students learning to read French as a foreign language. (MSE)

  17. Computational Everyday Life Human Behavior Model as Serviceable Knowledge

    NASA Astrophysics Data System (ADS)

    Motomura, Yoichi; Nishida, Yoshifumi

    A project called 'Open life matrix' is not only a research activity but also real-world problem solving in the form of action research. This concept is realized through large-scale data collection, construction of a probabilistic causal structure model, and provision of an information service that uses the model. One concrete outcome of this project is a childhood injury prevention activity carried out by a new team consisting of a hospital, government, and researchers from many fields. The main result of the project is a general methodology for applying probabilistic causal structure models as serviceable knowledge for action research. In this paper, we summarize the project and discuss future directions that emphasize action research driven by artificial intelligence technology.

  18. Research use of the AIDA www.2aida.org diabetes software simulation program: a review--part 2. Generating simulated blood glucose data for prototype validation.

    PubMed

    Lehmann, Eldon D

    2003-01-01

    The purpose of this review is to describe research applications of the AIDA diabetes software simulator. AIDA is a computer program that permits the interactive simulation of insulin and glucose profiles for teaching, demonstration, and self-learning purposes. Since March/April 1996 it has been made freely available on the Internet as a noncommercial contribution to continuing diabetes education. Up to May 2003 well over 320,000 visits have been logged at the main AIDA Website--www.2aida.org--and over 65,000 copies of the AIDA program have been downloaded free-of-charge. This review (the second of two parts) overviews research projects and ventures, undertaken for the most part by other research workers in the diabetes computing field, that have made use of the freeware AIDA program. As with Part 1 of the review (Diabetes Technol Ther 2003;5:425-438) relevant research work was identified in three main ways: (i) by personal (e-mail/written) communications from researchers, (ii) via the ISI Web of Science citation database to identify published articles which referred to AIDA-related papers, and (iii) via searches on the Internet. Also, in a number of cases research students who had sought advice about AIDA, and diabetes computing in general, provided copies of their research dissertations/theses upon the completion of their projects. Part 2 of this review highlights some more of the research projects that have made use of the AIDA diabetes simulation program to date. A wide variety of diabetes computing topics are addressed. These range from learning about parameter interactions using simulated blood glucose data, to considerations of dietary assessments, developing new diabetes models, and performance monitoring of closed-loop insulin delivery devices. 
Other topics include evaluation/validation research usage of such software, applying simulated blood glucose data for prototype training/validation, and other research uses of placing technical information on the Web. This review confirms an unexpected but useful benefit of distributing a medical program, like AIDA, for free via the Internet--demonstrating how it is possible to have a synergistic benefit with other researchers--facilitating their own research projects in related medical fields. A common theme that emerges from the research ventures that have been reviewed is the use of simulated blood glucose data from the AIDA software for preliminary computer lab-based testing of other decision support prototypes. Issues surrounding such use of simulated data for separate computer prototype testing are considered further.
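    The kind of simulated blood glucose data discussed above can be sketched, in highly simplified form, with a Bergman-style minimal model. This sketch is an assumption for illustration only: AIDA's actual physiological model is considerably more detailed, and the parameter values below are illustrative, not clinical.

```python
# Toy Bergman-style glucose-insulin dynamics (illustration only; this is
# NOT AIDA's model, and parameter values are not clinically meaningful).

def simulate_glucose(minutes=480, dt=1.0, Gb=5.0, Ib=10.0,
                     p1=0.03, p2=0.02, p3=1e-5, insulin=10.0):
    """Euler-integrate plasma glucose G (mmol/L) under constant plasma insulin.

    Gb, Ib are basal glucose and insulin; X is the 'remote' insulin action.
    """
    G, X = Gb + 5.0, 0.0  # start 5 mmol/L above basal, no insulin action yet
    trace = [G]
    for _ in range(int(minutes / dt)):
        dG = -p1 * (G - Gb) - X * G          # glucose effectiveness + insulin action
        dX = -p2 * X + p3 * (insulin - Ib)   # remote insulin compartment
        G += dG * dt
        X += dX * dt
        trace.append(G)
    return trace
```

Even a toy model like this reproduces the qualitative behavior that makes simulated data useful for prototype testing: glucose relaxes toward its basal value, and raising the insulin level accelerates the fall.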

  19. Project Photofly: New 3D Modeling Online Web Service (Case Studies and Assessments)

    NASA Astrophysics Data System (ADS)

    Abate, D.; Furini, G.; Migliori, S.; Pierattini, S.

    2011-09-01

    During summer 2010, Autodesk released a still-ongoing project called Project Photofly, freely downloadable from the Autodesk Labs web site until August 1, 2011. Project Photofly, based on computer vision and photogrammetric principles and exploiting the power of cloud computing, is a web service able to convert collections of photographs into 3D models. The aim of our research was to evaluate Project Photofly, through different case studies, for 3D modeling of cultural heritage monuments and objects, mostly to identify the goals and objects for which it is suitable. The automatic approach is analyzed in particular.

  20. The Human Genome Project: Information access, management, and regulation. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McInerney, J.D.; Micikas, L.B.

    The Human Genome Project is a large, internationally coordinated effort in biological research directed at creating a detailed map of human DNA. This report describes information access, management, and regulation in the project. The project led to the development of an instructional module titled The Human Genome Project: Biology, Computers, and Privacy, designed for use in high school biology classes. The module consists of print materials and both Macintosh and Windows versions of related computer software; Appendix A contains a copy of the print materials and discs containing the two versions of the software.

  1. An intelligent multi-media human-computer dialogue system

    NASA Technical Reports Server (NTRS)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  2. Computer Applications in Health Care. NCHSR Research Report Series.

    ERIC Educational Resources Information Center

    Medical Information Systems Cluster, Rockville, MD.

    This NCHSR research program in the application of computers in health care--conducted over the ten year span 1968-1978--identified two areas of application research, an inpatient care support system, and an outpatient care support system. Both of these systems were conceived as conceptual frameworks for a related network of projects and ideas that…

  3. A CAS Project Ten Years On

    ERIC Educational Resources Information Center

    Garner, Sue; Pierce, Robyn

    2016-01-01

    Although research shows that Computer Algebra Systems offer pedagogical opportunities, more than a decade later some teachers are reluctant to change established practices. In 2002, the University of Melbourne in Australia launched a research project to investigate implementation of a senior mathematics course in which students could use a…

  4. Study of Local Radon Occurrence as an Interdisciplinary Undergraduate Research Project.

    ERIC Educational Resources Information Center

    Purdom, William Berlin; And Others

    1990-01-01

    Described is an undergraduate interdisciplinary project encompassing physics, computer science, and geology and involving a number of students from several academic departments. The project used the topic of the occurrence of in-home radon. Student projects, radon sampling, and results are discussed. (CW)

  5. A Multi-Class, Interdisciplinary Project Using Elementary Statistics

    ERIC Educational Resources Information Center

    Reese, Margaret

    2012-01-01

    This article describes a multi-class project that employs statistical computing and writing in a statistics class. Three courses, General Ecology, Meteorology, and Introductory Statistics, cooperated on a project for the EPA's Student Design Competition. The continuing investigation has also spawned several undergraduate research projects in…

  6. Bayesian Research at the NASA Ames Research Center,Computational Sciences Division

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.

    2003-01-01

    NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation-related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology, and the Computational Sciences Division has expanded rapidly as a result. In this article, I give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these and the other Division projects.

  7. Cryogenic Memories based on Spin-Singlet and Spin-Triplet Ferromagnetic Josephson Junctions

    NASA Astrophysics Data System (ADS)

    Gingrich, Eric

    The last several decades have seen an explosion in the use and size of computers for scientific applications. The US Department of Energy has set an ExaScale computing goal for high performance computing that is projected to be unattainable by current CMOS computing designs. This has led to a renewed interest in superconducting computing as a means of beating these projections. One of the primary requirements of this thrust is the development of an efficient cryogenic memory. Estimates of the power consumption of early Rapid Single Flux Quantum (RSFQ) memory designs are on the order of MW, far too steep for any real application; therefore, other memory concepts are required. S/F/S Josephson junctions, a class of device in which two superconductors (S) are separated by one or more ferromagnetic layers (F), have shown promise as memory elements. Several different systems have been proposed utilizing either the spin-singlet or spin-triplet superconducting states. This talk will discuss the concepts underpinning these devices and the recent work done to demonstrate their feasibility. This research is supported in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via U.S. Army Research Office Contract W911NF-14-C-0115.

  8. Molecular electronics: The technology of sixth generation computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, M.T.; Miller, R.K.

    1987-01-01

    In February 1986, Japan began the 6th Generation project. At the 1987 Economic Summit in Venice, Prime Minister Yasuhiro Nakasone opened the project to world collaboration. A project director suggests that the 6th Generation ''may just be a turning point for human society.'' The major rationale for building molecular electronic devices is to achieve advances in computational densities and speeds. Proposed chromophore chains for molecular-scale chips, for example, could be spaced closer than today's silicon elements by a factor of almost 100. This book describes the research and proposed designs for molecular electronic devices and computers. It examines specific potential applications and the relationship of molecular electronics to silicon technology, and presents the first published survey of experts on research issues, applications, and forecasts of future developments; it also includes a market forecast. An interesting suggestion of the survey is that the chemical industry may become a significant factor in the computer industry as the sixth generation unfolds.

  9. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  10. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we have also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change.
Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in the utilization, availability, and number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  11. Laboratory directed research and development program FY 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-03-01

    This report compiles the annual reports of Laboratory Directed Research and Development projects supported by the Berkeley Lab. Projects are arranged under the following topical sections: (1) Accelerator and fusion research division; (2) Chemical sciences division; (3) Computing sciences; (4) Earth sciences division; (5) Environmental energy technologies division; (6) Life sciences division; (7) Materials sciences division; (8) Nuclear science division; (9) Physics division; and (11) Cross-divisional. A total of 66 projects are summarized.

  12. Building a Propulsion Experiment Project Management Environment

    NASA Technical Reports Server (NTRS)

    Keiser, Ken; Tanner, Steve; Hatcher, Danny; Graves, Sara

    2004-01-01

    What do you get when you cross rocket scientists with computer geeks? An interactive, distributed computing web of tools and services providing a more productive environment for propulsion research and development. The Rocket Engine Advancement Program 2 (REAP2) project involves researchers at several institutions collaborating on propulsion experiments and modeling. In an effort to facilitate these collaborations among researchers at different locations and with different specializations, researchers at the Information Technology and Systems Center, University of Alabama in Huntsville, are creating a prototype web-based interactive information system in support of propulsion research. This system, based on experience gained in creating similar systems for NASA Earth science field experiment campaigns such as the Convection and Moisture Experiments (CAMEX), will assist in the planning and analysis of model and experiment results across REAP2 participants. The initial version of the Propulsion Experiment Project Management Environment (PExPM) consists of a controlled-access web portal facilitating the drafting and sharing of working documents and publications. Interactive tools for building and searching an annotated bibliography of publications related to REAP2 research topics have been created to help organize and maintain the results of literature searches. Work is also underway, with some initial prototypes in place, on interactive project management tools allowing project managers to schedule experiment activities, track status, and report on results. This paper describes current successes, plans, and expected challenges for this project.

  13. Organization and Management of Project Athena.

    ERIC Educational Resources Information Center

    Champine, George A.

    1991-01-01

    Project Athena is a $100 million, eight-year project to install a large network of high performance computer work stations for education and research at the Massachusetts Institute of Technology (MIT). Organizational, legal, and administrative aspects of the project allow two competitors (Digital Equipment Corporation and IBM) to work together…

  14. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  15. The Natural Revegetation of a Pitheap: An Extended Sixth Form Research Project.

    ERIC Educational Resources Information Center

    Sanderson, Phil

    1987-01-01

    Describes a five year research project in plant ecology by successive teams of students. The aim was to identify environmental factors which were determining the distribution of the vegetation on a pitheap. Includes descriptions of vegetation, soil properties, and computer assisted analysis of results. (Author/CW)

  16. Endangered Species & Biodiversity: A Classroom Project & Theme

    ERIC Educational Resources Information Center

    Lauro, Brook

    2012-01-01

    Students discover the factors contributing to species losses worldwide by conducting a project about endangered species as a component of a larger classroom theme of biodiversity. Groups conduct research using online endangered- species databases and present results to the class using PowerPoint. Students will improve computer research abilities…

  17. Updates on EPA’s High-Throughput Exposure Forecast (ExpoCast) Research Project (CPCP)

    EPA Science Inventory

    Recent research advances by the ORD ExpoCast project (CSS Rapid Exposure and Dosimetry) are presented to the computational toxicology community in the context of prioritizing chemicals on a risk-basis using joint ExpoCast and ToxCast predictions. Recent publications by Wambaugh e...

  18. Learning Strategies and Motivation among Procrastinators of Various English Proficiency Levels

    ERIC Educational Resources Information Center

    Goda, Yoshiko; Yamada, Masanori; Matsuda, Takeshi; Kato, Hiroshi; Saito, Yutaka; Miyagawa, Hiroyuki

    2014-01-01

    Our research project focuses on learning strategies and motivation among academic procrastinators in computer assisted language learning (CALL) settings. In this study, we aim to compare them according to students' levels of English proficiency. One hundred and fourteen university students participated in this research project. Sixty-four students…

  19. Combining Instructionist and Constructionist Learning in a Virtual Biotech Lab.

    ERIC Educational Resources Information Center

    Dawabi, Peter; Wessner, Martin

    The background of this paper is an internal research project at the German National Research Center for Information Technology, Integrated Publication and Information Systems Institute, (GMD-IPSI) dealing with software engineering, computer-supported cooperative learning (CSCL) and practical biotech knowledge. The project goal is to develop a…

  20. Synesthetic art through 3-D projection: The requirements of a computer-based supermedium

    NASA Technical Reports Server (NTRS)

    Mallary, Robert

    1989-01-01

    A computer-based form of multimedia art is proposed that uses the computer to fuse aspects of painting, sculpture, dance, music, film, and other media into a one-to-one synesthesia of image and sound for spatially synchronous 3-D projection. Called synesthetic art, this conversion of many varied media into an aesthetically unitary experience determines the character and requirements of the system and its software. During the start-up phase, computer stereographic systems are unsuitable for software development. Eventually, a new type of illusory-projective supermedium will be required to achieve the needed combination of large-format projection and convincing real-life presence, and to handle the vast amount of 3-D visual and acoustic information required. The influence of the concept on the author's research and creative work is illustrated through two examples.

  1. Computer Modeling and Research in the Classroom

    ERIC Educational Resources Information Center

    Ramos, Maria Joao; Fernandes, Pedro Alexandrino

    2005-01-01

    We report on a computational chemistry course for undergraduate students that successfully incorporated a research project on the design of new contrast agents for magnetic resonance imaging and shift reagents for in vivo NMR. Course outcomes were positive: students were quite motivated during the whole year--they learned what was required of…

  2. eXascale PRogramming Environment and System Software (XPRESS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Gabriel, Edgar

    Exascale systems, with a thousand times the compute capacity of today's leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future; B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems; and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael Allen; Marker, Bryan

This report summarizes the progress made as part of a one-year lab-directed research and development (LDRD) project to fund the research efforts of Bryan Marker at the University of Texas at Austin. The goal of the project was to develop new techniques for automatically tuning the performance of dense linear algebra kernels. These kernels often represent the majority of computational time in an application. The primary outcome from this work is a demonstration of the value of model-driven engineering as an approach to accurately predict and study performance trade-offs for dense linear algebra computations.

  4. Continued multidisciplinary project-based learning - implementation in health informatics.

    PubMed

    Wessel, C; Spreckelsen, C

    2009-01-01

Problem- and project-based learning are approved methods to train students, graduates and post-graduates in scientific and other professional skills. The students are trained on realistic scenarios in a broader context. For students specializing in health informatics we introduced continued multidisciplinary project-based learning (CM-PBL) at a department of medical informatics. The training approach addresses both students of medicine and students of computer science. The students are full members of an ongoing research project and develop a project-related application or module, or explore or evaluate a sub-project. Two teachers guide and review the students' work. The training on scientific work follows a workflow with defined milestones. The team acts as a peer group. By participating in the research team's work the students are trained in professional skills. A research project on a web-based information system on hospitals provided the scenario for the realistic context. The research team consisted of up to 14 active members at a time, who were scientists and students of computer science and medicine. The well-communicated educational approach and team policy fostered the participation of the students. Formative assessment and evaluation showed a considerable improvement of the students' skills and high participant satisfaction. Alternative education approaches such as project-based learning empower students to acquire scientific knowledge and professional skills, especially the ability to engage in life-long learning, multidisciplinary teamwork and social responsibility.

  5. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets

    PubMed Central

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, a portal and associated middleware that provide a single entry point and a single sign-on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852
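The per-sample figures quoted in this record translate directly into cluster-scale requirements. A hedged back-of-envelope illustration (the 100-sample cohort size is an invented figure for the example, not from the paper):

```python
# Rough resource estimate for the alignment step described above.
# Per-sample figures come from the abstract (8 CPUs for ~12 h,
# 5-10 GB of BAM output); the 100-sample cohort is hypothetical.
samples = 100
cpu_hours = samples * 8 * 12              # CPU-hours for alignment alone
bam_gb = (samples * 5, samples * 10)      # BAM storage range in GB
print(f"{cpu_hours} CPU-hours, {bam_gb[0]}-{bam_gb[1]} GB of BAM files")
```

Estimates of this kind are what motivate hosting the data and compute together in a cloud platform rather than having each group download and process the datasets locally.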

  6. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Treesearch

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  7. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    PubMed

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.

  8. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Mathematics (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem and challenged to research and develop their own solutions. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development.
This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  9. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    DTIC Science & Technology

    2011-01-01

area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA ...Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center at...atomic and molecular level, he said. He noted that “every general would like to have” a Star Trek-like holodeck, where holographic avatars could

  10. Innovative architectures for dense multi-microprocessor computers

    NASA Technical Reports Server (NTRS)

    Larson, Robert E.

    1989-01-01

    The purpose is to summarize a Phase 1 SBIR project performed for the NASA/Langley Computational Structural Mechanics Group. The project was performed from February to August 1987. The main objectives of the project were to: (1) expand upon previous research into the application of chordal ring architectures to the general problem of designing multi-microcomputer architectures, (2) attempt to identify a family of chordal rings such that each chordal ring can be simply expanded to produce the next member of the family, (3) perform a preliminary, high-level design of an expandable multi-microprocessor computer based upon chordal rings, (4) analyze the potential use of chordal ring based multi-microprocessors for sparse matrix problems and other applications arising in computational structural mechanics.
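The chordal rings studied in this project are rings augmented with fixed-stride "chord" links, and the family property in objective (2) means a member can be grown into the next by enlarging the ring while keeping the chord stride. A minimal sketch (the stride and node counts are illustrative choices, not parameters from the SBIR report):

```python
# Hedged sketch of a chordal ring: a ring of n nodes plus chord links of a
# fixed stride w. Sizes and stride are invented for illustration.
from collections import deque

def chordal_ring(n, w):
    """Adjacency sets for a ring of n nodes with extra chords of stride w."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in (i + 1, i + w):        # one ring edge, one chord edge
            adj[i].add(j % n)
            adj[j % n].add(i)
    return adj

def diameter(adj):
    """Worst-case hop count between any two nodes, via BFS from every node."""
    worst = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        worst = max(worst, max(dist.values()))
    return worst

# "Expanding" a family member: double the ring while keeping the stride,
# and watch how the communication diameter grows.
for n in (8, 16, 32):
    print(n, diameter(chordal_ring(n, 3)))
```

The slow growth of the diameter relative to the node count is the property that makes such topologies attractive for dense multi-microprocessor designs.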

  11. Productivity enhancement planning using participative management concepts

    NASA Technical Reports Server (NTRS)

    White, M. E.; Kukla, J. C.

    1985-01-01

A productivity enhancement project which used participative management for both planning and implementation is described. The process and results associated with using participative management to plan and implement a computer terminal upgrade project, where the computer terminals are used by research and development (R&D) personnel, are reported. The upgrade improved the productivity of R&D personnel substantially, and their commitment to the implementation is high. Successful utilization of participative management for this project has laid a foundation for a continued style shift toward participation within the organization.

  12. High-Performance Computing and Four-Dimensional Data Assimilation: The Impact on Future and Current Problems

    NASA Technical Reports Server (NTRS)

    Makivic, Miloje S.

    1996-01-01

    This is the final technical report for the project entitled: "High-Performance Computing and Four-Dimensional Data Assimilation: The Impact on Future and Current Problems", funded at NPAC by the DAO at NASA/GSFC. First, the motivation for the project is given in the introductory section, followed by the executive summary of major accomplishments and the list of project-related publications. Detailed analysis and description of research results is given in subsequent chapters and in the Appendix.

  13. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  14. The Unified Medical Language System: an informatics research collaboration.

    PubMed

    Humphreys, B L; Lindberg, D A; Schoolman, H M; Barnett, G O

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago.

  15. ADP Analysis project for the Human Resources Management Division

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1993-01-01

    The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.

  16. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    PubMed Central

    Caudill, Lester; Hill, April; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics. PMID:20810953

  17. Impact of Interdisciplinary Undergraduate Research in mathematics and biology on the development of a new course integrating five STEM disciplines.

    PubMed

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics.

  18. Computer-aided drug discovery research at a global contract research organization

    NASA Astrophysics Data System (ADS)

    Kitchen, Douglas B.

    2017-03-01

Computer-aided drug discovery started at Albany Molecular Research, Inc. in 1997. Over nearly 20 years the role of cheminformatics and computational chemistry has grown throughout the pharmaceutical industry and at AMRI. This paper describes the infrastructure and roles of CADD throughout drug discovery and some of the lessons learned regarding the success of several methods. Various contributions provided by computational chemistry and cheminformatics in chemical library design, hit triage, hit-to-lead and lead optimization are discussed. Some frequently used computational chemistry techniques are described. The ways in which they may contribute to discovery projects are presented based on a few examples from recent publications.

  19. Computer-aided drug discovery research at a global contract research organization.

    PubMed

    Kitchen, Douglas B

    2017-03-01

Computer-aided drug discovery started at Albany Molecular Research, Inc. in 1997. Over nearly 20 years the role of cheminformatics and computational chemistry has grown throughout the pharmaceutical industry and at AMRI. This paper describes the infrastructure and roles of CADD throughout drug discovery and some of the lessons learned regarding the success of several methods. Various contributions provided by computational chemistry and cheminformatics in chemical library design, hit triage, hit-to-lead and lead optimization are discussed. Some frequently used computational chemistry techniques are described. The ways in which they may contribute to discovery projects are presented based on a few examples from recent publications.

  20. BridgeUP: STEM. Creating Opportunities for Women through Tiered Mentorship

    NASA Astrophysics Data System (ADS)

    Secunda, Amy; Cornelis, Juliette; Ferreira, Denelis; Gomez, Anay; Khan, Ariba; Li, Anna; Soo, Audrey; Mac Low, Mordecai

    2018-01-01

BridgeUP: STEM is an ambitious and exciting initiative responding to the extensive gender and opportunity gaps that exist in the STEM pipeline for women, girls, and under-resourced youth. BridgeUP: STEM has developed a distinct identity in the landscape of computer science education by embedding programming in the context of scientific research. One of the ways in which this is accomplished is through a tiered mentorship program. Five Helen Fellows are chosen from a pool of female, postbaccalaureate applicants to be mentored by researchers at the American Museum of Natural History in a computational research project. The Helen Fellows then act as mentors to six high school women (Brown Scholars), guiding them through a computational project aligned with their own research. This year, three of the Helen Fellows, and by extension, eighteen Brown Scholars, are performing computational astrophysics research. This poster presents one example of a tiered mentorship working on modeling the migration of stellar-mass black holes (BH) in active galactic nucleus (AGN) disks. Drawing an analogy from the well-studied migration and formation of planets in protoplanetary disks to the newer field of migration and formation of binary BH in AGN disks, the Helen Fellow is working with her mentors to adapt an N-body code, carrying migration torques over from the protoplanetary disk case to the AGN disk case, to model how binary BH form. The aim is to better understand and make predictions for gravitational wave observations from the Laser Interferometer Gravitational-Wave Observatory (LIGO). The Brown Scholars then implement the Helen Fellow’s code for a variety of different distributions of initial stellar-mass BH populations that they generate using Python, and produce visualizations of the output to be used in a published paper.
Over the course of the project, students will develop a basic understanding of the physics related to their project and develop their practical computational skills.
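The migration-torque idea described in this record can be caricatured in a few lines: a body orbiting a central mass feels an extra velocity-proportional drag standing in for the disk torque, so its orbit slowly shrinks. The force law, timescale, and units below are illustrative assumptions, not the project's actual N-body code:

```python
# Toy inspiral under a drag force that stands in for a disk migration torque.
# GM, TAU, and the initial orbit are invented code-unit choices.
import math

GM = 1.0      # central black-hole mass, code units
TAU = 200.0   # hypothetical migration timescale, code units

def semi_major_axis(x, y, vx, vy):
    """Semi-major axis recovered from the two-body orbital energy."""
    r = math.hypot(x, y)
    energy = 0.5 * (vx * vx + vy * vy) - GM / r
    return -GM / (2.0 * energy)

x, y, vx, vy = 1.0, 0.0, 0.0, 1.0   # circular orbit at r = 1
dt = 0.01
for _ in range(5000):                # integrate for 50 time units
    r = math.hypot(x, y)
    ax = -GM * x / r**3 - vx / TAU   # central gravity plus torque stand-in
    ay = -GM * y / r**3 - vy / TAU
    vx += ax * dt                    # semi-implicit (Euler-Cromer) update
    vy += ay * dt
    x += vx * dt
    y += vy * dt

print(semi_major_axis(x, y, vx, vy))  # orbit has shrunk from a = 1.0
```

For a near-circular orbit this drag gives roughly exponential decay of the semi-major axis on the timescale TAU/2, which is the qualitative behavior a migration torque imposes.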

  1. Computer-assisted instruction

    NASA Technical Reports Server (NTRS)

    Atkinson, R. C.

    1974-01-01

The results are presented of a research and development project on strategies for optimizing the instructional process, and on disseminating information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.

  2. Final report to the Florida Department of Transportation Systems Planning Office on project "Improvements and enhancements to LOSPLAN 2007".

    DOT National Transportation Integrated Search

    2011-03-01

    This project addressed several aspects of the LOSPLAN software, primarily with respect to incorporating : new FDOT and NCHRP research project results. In addition, some existing computational methodology : aspects were refined to provide more accurat...

  3. Small business innovation research. Abstracts of 1988 phase 1 awards

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Non-proprietary proposal abstracts of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA are presented. Projects in the fields of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robots, computer sciences, information systems, data processing, spacecraft propulsion, bioastronautics, satellite communication, and space processing are covered.

  4. China Brain Project: Basic Neuroscience, Brain Diseases, and Brain-Inspired Computing.

    PubMed

    Poo, Mu-Ming; Du, Jiu-Lin; Ip, Nancy Y; Xiong, Zhi-Qi; Xu, Bo; Tan, Tieniu

    2016-11-02

The China Brain Project covers both basic research on neural mechanisms underlying cognition and translational research for the diagnosis and intervention of brain diseases as well as for brain-inspired intelligence technology. We discuss some emerging themes, with emphasis on unique aspects.

  5. Malaysian Education Index (MEI): An Online Indexing and Repository System

    ERIC Educational Resources Information Center

    Kabilan, Muhammad Kamarul; Ismail, Hairul Nizam; Yaakub, Rohizani; Yusof, Najeemah Mohd; Idros, Sharifah Noraidah Syed; Umar, Irfan Naufal; Arshad, Muhammad Rafie Mohd.; Idrus, Rosnah; Rahman, Habsah Abdul

    2010-01-01

    This "Project Sheet" describes an on-going project that is being carried out by a group of educational researchers, computer science researchers and librarians from Universiti Sains Malaysia, Penang. The Malaysian Education Index (MEI) has two main functions--(1) Online Indexing System, and (2) Online Repository System. In this brief…

  6. Parallel aeroelastic computations for wing and wing-body configurations

    NASA Technical Reports Server (NTRS)

    Byun, Chansup

    1994-01-01

The objective of this research is to develop computationally efficient methods for solving fluid-structural interaction problems by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures on parallel computers. This capability will significantly impact many aerospace projects of national importance, such as the Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical in the transonic regime. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.
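The coupling idea in this record, separate fluid and structural solvers exchanging interface data until they agree, can be sketched with one-degree-of-freedom stand-ins. The toy load and stiffness models and the relaxation factor below are invented for illustration; they are not the project's Euler/Navier-Stokes or finite element solvers:

```python
# Minimal partitioned fluid-structure coupling sketch: iterate between a
# "fluid" solve (load from deflection) and a "structure" solve (deflection
# from load), with under-relaxation, until the interface state is consistent.

def fluid_solve(d):
    """Toy aerodynamic load: pressure drops as the surface deflects."""
    return 100.0 - 40.0 * d

def structure_solve(p):
    """Toy linear structure: deflection proportional to load (stiffness 50)."""
    return p / 50.0

def coupled_solve(omega=0.5, tol=1e-10, max_iter=100):
    d = 0.0
    for _ in range(max_iter):
        p = fluid_solve(d)
        d_new = structure_solve(p)
        if abs(d_new - d) < tol:
            return d_new, p
        d = (1 - omega) * d + omega * d_new   # under-relaxation for stability
    raise RuntimeError("coupling iteration did not converge")

d, p = coupled_solve()
print(d, p)   # fixed point satisfies d = (100 - 40*d) / 50
```

Without the relaxation factor the exchange can diverge when the fluid load is strongly sensitive to deflection, which is exactly the regime (transonic, small stability margin) the record highlights.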

  7. A Brain-Computer Interface Project Applied in Computer Engineering

    ERIC Educational Resources Information Center

    Katona, Jozsef; Kovari, Attila

    2016-01-01

    Keeping up with novel methods and keeping abreast of new applications are crucial issues in engineering education. In brain research, one of the most significant research areas in recent decades, many developments have application in both modern engineering technology and education. New measurement methods in the observation of brain activity open…

  8. A Primer on Infectious Disease Bacterial Genomics

    PubMed Central

    Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary

    2016-01-01

SUMMARY The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics project from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts needed to develop a workflow that will meet the objectives and goals of HTS projects. PMID:28590251

  9. Notification: Audit of EPA's Cloud Computer Initiative

    EPA Pesticide Factsheets

    Project #OA-FY13-0095, December 17, 2012. The U.S. Environmental Protection Agency (EPA) Office of Inspector General plans to begin preliminary research on the audit of EPA’s cloud computer initiative.

  10. Promoting the safe and strategic use of technology for victims of intimate partner violence: evaluation of the technology safety project.

    PubMed

    Finn, Jerry; Atkinson, Teresa

    2009-11-01

    The Technology Safety Project of the Washington State Coalition Against Domestic Violence was designed to increase awareness and knowledge of technology safety issues for domestic violence victims, survivors, and advocacy staff. The project used a "train-the-trainer" model and provided computer and Internet resources to domestic violence service providers to (a) increase safe computer and Internet access for domestic violence survivors in Washington, (b) reduce the risk posed by abusers by educating survivors about technology safety and privacy, and (c) increase the ability of survivors to help themselves and their children through information technology. Evaluation of the project suggests that the program is needed, useful, and effective. Consumer satisfaction was high, and there was perceived improvement in computer confidence and knowledge of computer safety. Areas for future program development and further research are discussed.

  11. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste

    2016-08-03

Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy” (CDD Project), led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling, and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  12. Development of iterative techniques for the solution of unsteady compressible viscous flows

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi; Hixon, Duane

    1993-01-01

    The work done under this project was documented in detail as the Ph. D. dissertation of Dr. Duane Hixon. The objectives of the research project were evaluation of the generalized minimum residual method (GMRES) as a tool for accelerating 2-D and 3-D unsteady flows and evaluation of the suitability of the GMRES algorithm for unsteady flows, computed on parallel computer architectures.
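GMRES, the method evaluated in this project, builds an orthonormal Krylov basis with the Arnoldi process and picks the iterate that minimizes the residual over that basis via a small least-squares solve. A hedged, textbook-style sketch (the 4x4 test system is an invented illustration, not a flow Jacobian from the dissertation):

```python
# Minimal GMRES sketch: Arnoldi with modified Gram-Schmidt, plus a small
# least-squares solve for the residual-minimizing correction.
import numpy as np

def gmres(A, b, x0, m=20, tol=1e-10):
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta < tol:
        return x0
    n = len(b)
    Q = np.zeros((n, m + 1))   # orthonormal Krylov basis
    H = np.zeros((m + 1, m))   # upper Hessenberg projection of A
    Q[:, 0] = r0 / beta
    for j in range(m):
        w = A @ Q[:, j]                       # expand the Krylov space
        for i in range(j + 1):                # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        e1 = np.zeros(j + 2)
        e1[0] = beta
        # Minimize ||beta*e1 - H y|| over the basis built so far.
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        converged = np.linalg.norm(e1 - H[:j + 2, :j + 1] @ y) < tol
        if converged or H[j + 1, j] < 1e-14:  # done, or "happy breakdown"
            return x0 + Q[:, :j + 1] @ y
        Q[:, j + 1] = w / H[j + 1, j]
    return x0 + Q[:, :m] @ y

A = np.array([[4.0, 1, 0, 0], [1, 4, 1, 0], [0, 1, 4, 1], [0, 0, 1, 4]])
b = np.ones(4)
x = gmres(A, b, np.zeros(4))
print(np.allclose(A @ x, b))
```

In the unsteady-flow setting studied here, a system of this form must be solved at every time step, which is why accelerating the linear solve pays off so directly.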

  13. Teachers' Views about the Use of Tablet Computers Distributed in Schools as Part of the Fatih Project

    ERIC Educational Resources Information Center

    Gökmen, Ömer Faruk; Duman, Ibrahim; Akgün, Özcan Erkan

    2018-01-01

    The purpose of this study is to investigate teachers' views about the use of tablet computers distributed as a part of the FATIH (Movement for Enhancing Opportunities and Improving Technology) Project. In this study, the case study method, one of the qualitative research methods, was used. The participants were 20 teachers from various fields…

  14. Nbody Simulations and Weak Gravitational Lensing using new HPC-Grid resources: the PI2S2 project

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Antonuccio-Delogu, V.; Costa, A.; Comparato, M.

    2008-08-01

We present the main project of the new grid infrastructure and the research activities that have already been started in Sicily and will be completed by next year. The PI2S2 project of the COMETA consortium is funded by the Italian Ministry of University and Research and will be completed in 2009. Funds are from the European Union Structural Funds for Objective 1 regions. The project, together with a similar project called Trinacria GRID Virtual Laboratory (Trigrid VL), aims to create in Sicily a computational grid for e-science and e-commerce applications, with the main goal of increasing the technological innovation of local enterprises and their competition on the global market. The PI2S2 project aims to build and develop an e-Infrastructure in Sicily, based on the grid paradigm, mainly for research activity using the grid environment and High Performance Computing systems. As an example we present the first results of a new grid version of FLY, a tree N-body code developed by the INAF Astrophysical Observatory of Catania and already published in the CPC Program Library, which will be used in the weak gravitational lensing field.

  15. Research use of the AIDA www.2aida.org diabetes software simulation program: a review-part 1. decision support testing and neural network training.

    PubMed

    Lehmann, Eldon D

    2003-01-01

    The purpose of this two-part review is to overview research use of the AIDA diabetes software simulator. AIDA is a diabetes computer program that permits the interactive simulation of plasma insulin and blood glucose profiles for teaching, demonstration, and self-learning purposes. It has been made freely available, without charge, on the Internet as a noncommercial contribution to continuing diabetes education. Since its launch in 1996, over 300,000 visits have been logged at the main AIDA Website (www.2aida.org) and over 60,000 copies of the AIDA program have been downloaded free of charge. This review describes research projects and ventures, undertaken for the most part by other research workers in the diabetes computing field, that have made use of the freeware AIDA software. Relevant research work was identified in three main ways: (i) by personal (e-mail/written) communications from researchers, (ii) via the ISI Web of Science citation database to identify published articles that referred to AIDA-related papers, and (iii) via searches on the Internet. In a number of cases, research students who had sought advice about AIDA, and diabetes computing in general, provided copies of their research dissertations/theses upon the completion of their projects. The two reviews highlight some of the many and varied research projects that have made use of the AIDA diabetes simulation software to date. A wide variety of diabetes computing topics have been addressed. In Part 1 of the review, these range from testing decision support prototypes to training artificial neural networks. In Part 2 of the review, issues surrounding dietary assessments, developing new diabetes models, and performance monitoring of closed-loop insulin delivery devices are considered. Overall, research projects making use of AIDA have been identified in Australia, Italy, South Korea, the United Kingdom, and the United States. These reviews confirm an unexpected but useful benefit of distributing medical software like AIDA for free via the Internet: it demonstrates how it is possible to have a synergistic benefit with other researchers, facilitating their own research projects in related medical fields. The reviews highlight a variety of these projects that have benefited from the free availability of the AIDA diabetes software simulator. In a number of cases, these other research projects simply would not have been possible without unrestricted access to the AIDA software and/or technical descriptions of its workings. In addition, some specific common themes begin to emerge from the research ventures that have been reviewed. These include the use of simulated blood glucose data from the AIDA program for preliminary computer-lab-based testing of other decision support prototypes. Issues surrounding such use of simulated data for separate prototype testing are discussed further in Part 2 of the review.

  16. "I Like Computers but My Favourite Is Playing outside with My Friends": Young Children's Beliefs about Computers

    ERIC Educational Resources Information Center

    Hatzigianni, Maria

    2014-01-01

    This exploratory study investigated young children's beliefs and opinions about computers by actively involving children in the research project. Fifty-two children, four to six years old, were asked a set of questions about their favourite activities and their opinions on computer use before and after a seven-month computer intervention…

  17. Research and Development Annual Report, 1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 42 additional JSC projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  18. The JSC Research and Development Annual Report 1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 47 additional projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  19. Secondary Computer-Based Instruction in Microeconomics: Cognitive and Affective Issues.

    ERIC Educational Resources Information Center

    Lasnik, Vincent E.

    This paper describes the general rationale, hypotheses, methodology, findings and implications of a recent dissertation research project conducted in the Columbus, Ohio, public schools. The computer-based study investigated the simultaneous relationship between achievement in microeconomics and attitude toward economics, level of computer anxiety,…

  20. Sandia National Laboratories: Research: Materials Science

    Science.gov Websites

    Our research uses Sandia's experimental, theoretical, and computational capabilities to…

  1. Transportable, university-level educational programs in interactive information storage and retrieval systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.; Roquemore, Leroy

    1984-01-01

    Pursuant to the specifications of a research contract entered into in December 1983 with NASA, the Computer Science Departments of the University of Southwestern Louisiana and Southern University will be working jointly to address a variety of research and educational issues relating to the use, by non-computer professionals, of some of the largest and most sophisticated interactive information storage and retrieval systems available. Over the projected 6- to 8-year life of the project, in addition to NASA/RECON, the following systems will be examined: Lockheed DIALOG, DOE/RECON, DOD/DTIC, EPA/CSIN, and LLNL/TIS.

  2. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.

    PubMed

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, a portal and associated middleware that provides a single entry point and a single sign-on for the various Bionimbus resources, and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
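The per-sample figures quoted in the record (eight CPUs for about 12 h, 5-10 GB per BAM) lend themselves to a quick sizing estimate. A back-of-the-envelope sketch; the cohort size and cluster core count below are hypothetical, not taken from the record:

```python
# Sizing estimate for an alignment pipeline like the one described:
# 8 CPUs x 12 h per sample, 5-10 GB BAM output per sample.
samples = 100                          # hypothetical cohort size
cpu_hours_per_sample = 8 * 12          # 96 CPU-hours, as stated per sample
cluster_cores = 64                     # assumed size of a cloud core pool

total_cpu_hours = samples * cpu_hours_per_sample
wall_clock_days = total_cpu_hours / cluster_cores / 24
storage_gb = (samples * 5, samples * 10)   # BAM sizes ranged 5-10 GB

print(total_cpu_hours)                 # 9600 CPU-hours for the cohort
print(round(wall_clock_days, 2))       # 6.25 days at full utilization
print(storage_gb)                      # (500, 1000) GB of BAM output
```

Estimates like this (ignoring I/O contention and scheduling overhead, which on-demand platforms such as Bionimbus are designed to absorb) are a common first step when deciding how many virtual machines to request.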

  3. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Astrophysics Data System (ADS)

    Cezzar, Ruknet

    1993-08-01

    The project's main contributions have been in the area of student support. Throughout the project, at least one, and in some cases two, undergraduate students have been supported. By working with the project, these students gained valuable knowledge of the conduct of a scientific research project, including the not-so-pleasant reporting requirements to the funding agencies. The other important contribution was toward the establishment of a graduate program in computer science at Hampton University. Primarily, the PAPER project has served as the main research basis in seeking funds from other agencies, such as the National Science Foundation, for establishing a research infrastructure in the department. In technical areas, especially in the first phase, we believe the trip to the Jet Propulsion Laboratory, and the gathering together of all the pertinent information on experimental computer architectures aimed at planetary exploration, was very helpful. Indeed, if this effort is to be revived in the future due to congressional funding for planetary exploration, say an unmanned mission to Mars, our interim report will be an important starting point. In other technical areas, our simulator has pinpointed and highlighted several important performance issues related to the design of operating system kernels for MIMD machines. In particular, the critical issue of how the kernel itself will run in parallel on a multiple-processor system has been addressed through the various ready-list organization and access policies. In the area of neural computing, our main contribution was an introductory tutorial package to familiarize the researchers at NASA with this new and promising field. Finally, we have introduced the notion of reversibility in programming systems, which may find applications in various areas of space research.

  4. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1993-01-01

    The project's main contributions have been in the area of student support. Throughout the project, at least one, and in some cases two, undergraduate students have been supported. By working with the project, these students gained valuable knowledge of the conduct of a scientific research project, including the not-so-pleasant reporting requirements to the funding agencies. The other important contribution was toward the establishment of a graduate program in computer science at Hampton University. Primarily, the PAPER project has served as the main research basis in seeking funds from other agencies, such as the National Science Foundation, for establishing a research infrastructure in the department. In technical areas, especially in the first phase, we believe the trip to the Jet Propulsion Laboratory, and the gathering together of all the pertinent information on experimental computer architectures aimed at planetary exploration, was very helpful. Indeed, if this effort is to be revived in the future due to congressional funding for planetary exploration, say an unmanned mission to Mars, our interim report will be an important starting point. In other technical areas, our simulator has pinpointed and highlighted several important performance issues related to the design of operating system kernels for MIMD machines. In particular, the critical issue of how the kernel itself will run in parallel on a multiple-processor system has been addressed through the various ready-list organization and access policies. In the area of neural computing, our main contribution was an introductory tutorial package to familiarize the researchers at NASA with this new and promising field. Finally, we have introduced the notion of reversibility in programming systems, which may find applications in various areas of space research.

  5. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  6. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  7. Laboratory-directed research and development: FY 1996 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1997-05-01

    This report summarizes the FY 1996 goals and accomplishments of Laboratory-Directed Research and Development (LDRD) projects. It gives an overview of the LDRD program, summarizes work done on individual research projects, and provides an index to the projects' principal investigators. Projects are grouped by their LDRD component: Individual Projects, Competency Development, and Program Development. Within each component, they are further divided into nine technical disciplines: (1) materials science, (2) engineering and base technologies, (3) plasmas, fluids, and particle beams, (4) chemistry, (5) mathematics and computational sciences, (6) atomic and molecular physics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) biosciences.

  8. The Research Path to the Virtual Class. ZIFF Papiere 105.

    ERIC Educational Resources Information Center

    Rajasingham, Lalita

    This paper describes a project conducted in 1991-92, based on research conducted in 1986-87 that demonstrated the need for a telecommunications system with the capacity of integrated services digital networks (ISDN) that would allow for sound, vision, and integrated computer services. Called the Tri-Centre Project, it set out to explore, from the…

  9. Becoming Little Scientists: Technologically-Enhanced Project-Based Language Learning

    ERIC Educational Resources Information Center

    Dooly, Melinda; Sadler, Randall

    2016-01-01

    This article outlines research into innovative language teaching practices that make optimal use of technology and Computer-Mediated Communication (CMC) for an integrated approach to Project-Based Learning. It is based on data compiled during a 10- week language project that employed videoconferencing and "machinima" (short video clips…

  10. Design Experimentation with Multiple Perspectives: The GenScope Assessment Project.

    ERIC Educational Resources Information Center

    Hickey, Daniel T.; Kruger, Ann Cale; Frederick, Laura D.; Schafer, Nancy Jo; Zuiker, Steven

    The GenScope Assessment Project is studying assessment in the context of a month-long computer-supported learning environment for introductory genetics. Across three annual iterations with multiple teachers, project researchers manipulated the materials, incentives, and contexts in which students were invited to use formative feedback on…

  11. Perseus Project: Interactive Teaching and Research Tools for Ancient Greek Civilization.

    ERIC Educational Resources Information Center

    Crane, Gregory; Harward, V. Judson

    1987-01-01

    Describes the Perseus Project, an educational program utilizing computer technology to study ancient Greek civilization. Including approximately 10 percent of all ancient literature and visual information on architecture, sculpture, ceramics, topography, and archaeology, the project spans a range of disciplines. States that Perseus fuels student…

  12. The CAI/Cooperative Learning Project. First Year Evaluation Report.

    ERIC Educational Resources Information Center

    Beyer, Francine S.

    This report presents a first year evaluation of the Computer Assisted Instruction (CAI)/ Cooperative Learning Project, a 3-year collaborative effort by two Pennsylvania school districts--the Pittston Area School District and the Hatboro-Horsham School District--and Research for Better Schools (RBS). The project proposed to integrate advanced…

  13. The Use of a Relational Database in Qualitative Research on Educational Computing.

    ERIC Educational Resources Information Center

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  14. A Multidisciplinary Research Team Approach to Computer-Aided Drafting (CAD) System Selection. Final Report.

    ERIC Educational Resources Information Center

    Franken, Ken; And Others

    A multidisciplinary research team was assembled to review existing computer-aided drafting (CAD) systems for the purpose of enabling staff in the Design Drafting Department at Linn Technical College (Missouri) to select the best system out of the many CAD systems in existence. During the initial stage of the evaluation project, researchers…

  15. Planning and Scheduling of Software Manufacturing Projects

    DTIC Science & Technology

    1991-03-01

    based on the previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and the traditional approaches to planning in artificial intelligence, and extends the techniques that have been developed by them…

  16. The Unified Medical Language System

    PubMed Central

    Humphreys, Betsy L.; Lindberg, Donald A. B.; Schoolman, Harold M.; Barnett, G. Octo

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago. PMID:9452981

  17. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  18. Tutor Training in Computer Science: Tutor Opinions and Student Results.

    ERIC Educational Resources Information Center

    Carbone, Angela; Mitchell, Ian

    Edproj, a project team of faculty from the departments of computer science, software development and education at Monash University (Australia) investigated the quality of teaching and student learning and understanding in the computer science and software development departments. Edproj's research led to the development of a training program to…

  19. Learning with Ubiquitous Computing

    ERIC Educational Resources Information Center

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  20. Efficient computational methods to study new and innovative signal detection techniques in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.

    1991-01-01

    The purpose of the research reported here is to provide a rapid computational method for computing various statistical parameters associated with overlapped Hann spectra. These results are important for the Targeted Search part of the Search for Extraterrestrial Intelligence (SETI) Microwave Observing Project.
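For readers unfamiliar with the construct: overlapped Hann spectra are Welch-style power spectra computed from Hann-windowed, partially overlapping segments of a time series. A small NumPy sketch; the segment length and 50% overlap are conventional choices, not parameters taken from the report:

```python
import numpy as np

def overlapped_hann_spectra(x, nseg=64):
    """Power spectra of Hann-windowed segments with 50% overlap
    (Welch's method without the final averaging step)."""
    hop = nseg // 2                       # 50% overlap between segments
    w = np.hanning(nseg)
    segs = [x[i:i + nseg] * w
            for i in range(0, len(x) - nseg + 1, hop)]
    return np.array([np.abs(np.fft.rfft(s)) ** 2 for s in segs])

# Sinusoid at 0.125 cycles/sample buried in light noise.
rng = np.random.default_rng(0)
t = np.arange(1024)
x = np.sin(2 * np.pi * 0.125 * t) + 0.1 * rng.standard_normal(1024)
spectra = overlapped_hann_spectra(x)
mean_spec = spectra.mean(axis=0)
print(spectra.shape)                      # (31, 33): 31 overlapped spectra
print(int(mean_spec.argmax()))            # peak at bin 8 = 0.125 * 64
```

The statistics of interest in such a scheme come from the overlap: adjacent windowed segments share half their samples, so the resulting spectral estimates are correlated rather than independent, which is what a rapid method for the associated statistical parameters must account for.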

  1. A Project to Computerize Performance Objectives and Criterion-Referenced Measures in Occupational Education for Research and Determination of Applicability to Handicapped Learners. Final Report.

    ERIC Educational Resources Information Center

    Lee, Connie W.; Hinson, Tony M.

    This publication is the final report of a 21-month project designed to (1) expand and refine the computer capabilities of the Vocational-Technical Education Consortium of States (V-TECS) to ensure rapid data access for generating routine and special occupational data-based reports; (2) develop and implement a computer storage and retrieval system…

  2. User Interface on the World Wide Web: How to Implement a Multi-Level Program Online

    NASA Technical Reports Server (NTRS)

    Cranford, Jonathan W.

    1995-01-01

    The objective of this Langley Aerospace Research Summer Scholars (LARSS) research project was to write a user interface that utilizes current World Wide Web (WWW) technologies for an existing computer program written in C, entitled LaRCRisk. The project entailed researching data presentation and script execution on the WWW and then writing input/output procedures for the database management portion of LaRCRisk.

  3. Structures and Dynamics Division: Research and technology plans for FY 1983 and accomplishments for FY 1982

    NASA Technical Reports Server (NTRS)

    Bales, K. S.

    1983-01-01

    The objectives, expected results, approach, and milestones for research projects of the IPAD Project Office and the impact dynamics, structural mechanics, and structural dynamics branches of the Structures and Dynamics Division are presented. Research facilities are described. Topics covered include computer-aided design; general aviation/transport crash dynamics; aircraft ground performance; composite structures; failure analysis; space vehicle dynamics; and large space structures.

  4. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  5. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  6. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada-based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX minicomputers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  7. Nondestructive pavement evaluation using ILLI-PAVE based artificial neural network models.

    DOT National Transportation Integrated Search

    2008-09-01

    The overall objective in this research project is to develop advanced pavement structural analysis models for more accurate solutions with fast computation schemes. Soft computing and modeling approaches, specifically the Artificial Neural Network (A...
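For context, ILLI-PAVE-based models of this kind are feedforward neural networks trained to map measured pavement response (e.g., surface deflections) to structural parameters. The forward pass of such a network can be sketched as follows; the architecture and every weight below are hypothetical placeholders for illustration, not values from the study.

```python
import math

def ann_forward(x, W1, b1, W2, b2):
    # One-hidden-layer feedforward network: tanh hidden units, linear output
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2

# Hypothetical toy weights: two deflection inputs -> two hidden units -> one output
W1 = [[0.8, -0.4], [-0.2, 0.6]]
b1 = [0.1, -0.1]
W2 = [1.5, -0.7]
b2 = 0.05
modulus_estimate = ann_forward([0.35, 0.12], W1, b1, W2, b2)
```

In practice the weights would be fitted by backpropagation against a database of ILLI-PAVE finite-element solutions; the sketch only shows why inference is fast once training is done.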

  8. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  9. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
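The data-parallel pattern behind this kind of Beowulf work (scatter a large raster into tiles, process the tiles on independent workers, gather the results) can be sketched on a single machine with Python's multiprocessing. The tile size and the per-tile operation here are hypothetical stand-ins for the USGS image-processing jobs described above.

```python
from multiprocessing import Pool

def process_tile(tile):
    # Hypothetical per-tile work: rescale raw sensor values to 8-bit
    lo, hi = min(tile), max(tile)
    span = (hi - lo) or 1
    return [round(255 * (v - lo) / span) for v in tile]

def split_tiles(raster, tile_size):
    # Scatter step: cut a flat raster into fixed-size tiles
    return [raster[i:i + tile_size] for i in range(0, len(raster), tile_size)]

def parallel_map_tiles(raster, tile_size, workers=4):
    # Scatter / process / gather, one tile per worker task
    tiles = split_tiles(raster, tile_size)
    with Pool(workers) as pool:
        processed = pool.map(process_tile, tiles)
    return [v for tile in processed for v in tile]

if __name__ == "__main__":
    print(parallel_map_tiles(list(range(1000)), tile_size=250)[:4])
```

On an actual cluster the scatter/gather step would go over the network (e.g., via MPI) rather than through a local process pool, but the decomposition logic is the same.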

  10. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core-hours of science. The research conducted at this leadership-class facility touched our lives in both minute and massive ways, whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision of acting as the forefront computational center for extending science frontiers by solving pressing problems for our nation. Their success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, which awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid, the ALCF's 557-teraflops IBM Blue Gene/P supercomputer, enabled astounding scientific solutions and discoveries, and went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF also obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012, which will allow ever more pressing problems to be resolved expeditiously through breakthrough science in the years to come.

  11. The Virtual Test Bed Project

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis C.

    2002-01-01

    This is a report of my activities as a NASA Fellow during the summer of 2002 at the NASA Kennedy Space Center (KSC). The core of these activities is the assigned project: the Virtual Test Bed (VTB) from the Spaceport Engineering and Technology Directorate. The VTB Project has its foundations in the NASA Ames Research Center (ARC) Intelligent Launch & Range Operations program. The objective of the VTB project is to develop a new and unique collaborative computing environment where simulation models can be hosted and integrated in a seamless fashion. This collaborative computing environment will be used to build a Virtual Range as well as a Virtual Spaceport. The project will serve as a technology pipeline to research, develop, test, and validate R&D efforts against real-time operations without interfering with actual operations or consuming the operational personnel's time. This report also addresses the systems issues involved in conceptualizing and giving form to a systems architecture capable of handling these different demands.

  12. Emerging Roles of Combat Communication Squadrons in Cyber Warfare as Related to Computer Network Attack, Defense and Exploitation

    DTIC Science & Technology

    2011-06-01

    Graduate research project presented to the Faculty ... Education and Training Command, in partial fulfillment of the requirements for the degree of Master of Cyber Warfare. Michael J. Myers, Major, USAF, June 2011.

  13. Optimization of knowledge-based systems and expert system building tools

    NASA Technical Reports Server (NTRS)

    Yasuda, Phyllis; Mckellar, Donald

    1993-01-01

    The objectives of the NASA-Ames Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators. Research studies concentrated on artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.

  14. Monogamy relations of quantum entanglement for partially coherently superposed states

    NASA Astrophysics Data System (ADS)

    Shi, Xian

    2017-12-01

    Project partially supported by the National Key Research and Development Program of China (Grant No. 2016YFB1000902), the National Natural Science Foundation of China (Grant Nos. 61232015, 61472412, and 61621003), the Beijing Science and Technology Project (2016), the Tsinghua-Tencent-AMSS Joint Project (2016), and the Key Laboratory of Mathematics Mechanization Project: Quantum Computing and Quantum Information Processing.

  15. Asset Management of Roadway Signs Through Advanced Technology

    DOT National Transportation Integrated Search

    2003-06-01

    This research project aims to ease the process of Roadway Sign asset management. The project utilized handheld computer and global positioning system (GPS) technology to capture sign location data along with a timestamp. This data collection effort w...

  16. Compute-to-Learn: Authentic Learning via Development of Interactive Computer Demonstrations within a Peer-Led Studio Environment

    ERIC Educational Resources Information Center

    Jafari, Mina; Welden, Alicia Rae; Williams, Kyle L.; Winograd, Blair; Mulvihill, Ellen; Hendrickson, Heidi P.; Lenard, Michael; Gottfried, Amy; Geva, Eitan

    2017-01-01

    In this paper, we report on the implementation of a novel compute-to-learn pedagogy, which is based upon the theories of situated cognition and meaningful learning. The "compute-to-learn" pedagogy is designed to simulate an authentic research experience as part of the undergraduate curriculum, including project development, teamwork,…

  17. Computer-Intensive School Environments and the Reorganization of Knowledge and Learning: A Qualitative Assessment of Apple Computer's Classroom of Tomorrow.

    ERIC Educational Resources Information Center

    Levine, Harold G.

    The Apple Classroom of Tomorrow (ACOT) project is an attempt to alter the instructional premises of a selected group of seven experimental classrooms in the United States by saturating them with computer technology. A recent proposal submitted to Apple Computer described STAR (Sensible Technology Assessment/Research), which includes both…

  18. Diminishing the Gap between University and High School Research Programs: Computational Physics

    ERIC Educational Resources Information Center

    Vondracek, Mark

    2007-01-01

    There are many schools (grades K-12) around the country that offer some sort of science research option for students to pursue. Often this option is a local science fair, where students do smaller projects that are then presented at poster sessions. Many times the top local projects can advance to some type of regional and, possibly, state science…

  19. Virtual University of Applied Sciences--German Flagship Project in the Field of E-Learning in Higher Education.

    ERIC Educational Resources Information Center

    Granow, Rolf; Bischoff, Michael

    In 1997, the German Federal Ministry of Education and Research started an initiative to promote e-learning in Germany by installing an extensive research program. The Virtual University of Applied Sciences in Engineering, Computer Science and Economic Engineering is the most prominent and best-funded of the more than 100 projects in the field…

  20. Small business innovation research. Abstracts of completed 1987 phase 1 projects

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Non-proprietary summaries of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA in the 1987 program year are given. Work in the areas of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robotics, computer sciences, information systems, spacecraft systems, spacecraft power supplies, spacecraft propulsion, bioastronautics, satellite communication, and space processing is covered.

  1. A Meta-Analysis of National Research: Effects of Teaching Strategies on Student Achievement in Science in the United States

    ERIC Educational Resources Information Center

    Schroeder, Carolyn M.; Scott, Timothy P.; Tolson, Homer; Huang, Tse-Yang; Lee, Yi-Hsuan

    2007-01-01

    This project consisted of a meta-analysis of U.S. research published from 1980 to 2004 on the effect of specific science teaching strategies on student achievement. The six phases of the project included study acquisition, study coding, determination of intercoder objectivity, establishing criteria for inclusion of studies, computation of effect…

  2. Los Alamos Space Weather Summer School: Institutional Computing 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowee, Misa

    During the summer school, students carry out independent research projects on a range of topics related to space weather. In 2016, one student used the LANL Institutional Computing resources. Results of this project were the first to demonstrate that the magnitude of radial diffusion agrees well with early observations of the Earth's radiation belts, indicating that this effect should be included in community models of the radiation belts.

  3. Teaching Formative Assessment Strategies to Preservice Teachers: Exploring the Use of Handheld Computing to Facilitate the Action Research Process

    ERIC Educational Resources Information Center

    Bennett, Kristin Redington; Cunningham, Ann C.

    2009-01-01

    Appropriate classroom assessment now tends to utilize formative measures with greater frequency, especially in the early grades and with learner groups at risk of not passing state-mandated standardized tests. Within the authentic context of an action research project, teacher candidates were given handheld computers equipped with data-collection…

  4. Impact of Collaborative Project-Based Learning on Self-Efficacy of Urban Minority Students in Engineering

    ERIC Educational Resources Information Center

    Chen, Pearl; Hernandez, Anthony; Dong, Jane

    2015-01-01

    This paper presents an interdisciplinary research project that studies the impact of collaborative project-based learning (CPBL) on the development of self-efficacy of students from various ethnic groups in an undergraduate senior-level computer networking class. Grounded in social constructivist and situated theories of learning, the study…

  5. Using R-Project for Free Statistical Analysis in Extension Research

    ERIC Educational Resources Information Center

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
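The kind of simple analysis the article has in mind (say, comparing outcomes between two treatments) is a one-liner in R itself, roughly `t.test(control, treated)`. For consistency with the other examples in this collection, here is the same Welch two-sample t statistic computed from scratch in Python; the yield numbers are made up for illustration.

```python
import math

def welch_t(a, b):
    # Welch's two-sample t statistic (does not assume equal variances)
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    var_a = sum((x - mean_a) ** 2 for x in a) / (len(a) - 1)  # sample variance
    var_b = sum((x - mean_b) ** 2 for x in b) / (len(b) - 1)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

# Hypothetical plot yields under two treatments
control = [12, 15, 14, 10, 13]
treated = [22, 25, 24, 20, 23]
print(round(welch_t(control, treated), 2))  # -8.22
```

A full test would also compute the Welch-Satterthwaite degrees of freedom and a p-value, which is exactly the bookkeeping R's `t.test` handles automatically.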

  6. A Systematic Approach for Understanding Slater-Gaussian Functions in Computational Chemistry

    ERIC Educational Resources Information Center

    Stewart, Brianna; Hylton, Derrick J.; Ravi, Natarajan

    2013-01-01

    A systematic way to understand the intricacies of quantum mechanical computations done by a software package known as "Gaussian" is undertaken via an undergraduate research project. These computations involve the evaluation of key parameters in a fitting procedure to express a Slater-type orbital (STO) function in terms of the linear…
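The fitting procedure alluded to above expresses a Slater-type orbital as a fixed linear combination of Gaussian primitives. As a minimal sketch (independent of the Gaussian package itself), the code below evaluates the standard STO-3G contraction for a 1s orbital with zeta = 1, using the exponents and contraction coefficients as tabulated in standard references, and compares it with the exact Slater function; the fit is close except near the nuclear cusp at r = 0.

```python
import math

# Standard STO-3G 1s contraction for zeta = 1 (tabulated values)
EXPONENTS = [2.227660584, 0.4057711562, 0.1098175104]
COEFFS = [0.1543289673, 0.5353281423, 0.4446345422]

def slater_1s(r, zeta=1.0):
    # Normalized 1s Slater-type orbital
    return math.sqrt(zeta**3 / math.pi) * math.exp(-zeta * r)

def gaussian_1s(r, alpha):
    # Normalized s-type Gaussian primitive
    return (2 * alpha / math.pi) ** 0.75 * math.exp(-alpha * r * r)

def sto3g_1s(r):
    # STO-3G contraction: fixed linear combination of three Gaussians
    return sum(c * gaussian_1s(r, a) for c, a in zip(COEFFS, EXPONENTS))

for r in (0.0, 0.5, 1.0, 2.0):
    print(r, round(slater_1s(r), 4), round(sto3g_1s(r), 4))
```

Running the loop shows agreement to within about 1-2% for r between 0.5 and 2 bohr, while at r = 0 the Gaussian expansion undershoots noticeably, since no finite sum of Gaussians can reproduce the Slater cusp.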

  7. A Study of Young Children's Metaknowing Talk: Learning Experiences with Computers

    ERIC Educational Resources Information Center

    Choi, Ji-Young

    2010-01-01

    This research project was undertaken in a time of increasing emphasis on the exploration of young children's learning and thinking at computers. The purpose of this study was to describe and interpret the characteristics of metaknowing talk that occurred during learning experiences with computers in a kindergarten community of learners. This…

  8. The Quality of Talk in Children's Joint Activity at the Computer.

    ERIC Educational Resources Information Center

    Mercer, Neil

    1994-01-01

    Describes findings of the Spoken Language and New Technology (SLANT) research project which studied the talk of primary school children in the United Kingdom who were working in small groups at computers with various kinds of software. Improvements in the quality of talk and collaboration during computer-based activities are suggested. (Contains…

  9. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    ERIC Educational Resources Information Center

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  10. Computer Habits and Behaviours among Young Children in Singapore

    ERIC Educational Resources Information Center

    Karuppiah, Nirmala

    2015-01-01

    This exploratory research project was aimed at developing baseline data on computer habits and behaviours among preschool children in Singapore. Three sets of data were collected from teachers, parents and children which are (1) why and how young children use computers; (2) what are the key physical, social and health habits and behaviours of…

  11. Scientific Grid activities and PKI deployment in the Cybermedia Center, Osaka University.

    PubMed

    Akiyama, Toyokazu; Teranishi, Yuuichi; Nozaki, Kazunori; Kato, Seiichi; Shimojo, Shinji; Peltier, Steven T; Lin, Abel; Molina, Tomas; Yang, George; Lee, David; Ellisman, Mark; Naito, Sei; Koike, Atsushi; Matsumoto, Shuichi; Yoshida, Kiyokazu; Mori, Hirotaro

    2005-10-01

    The Cybermedia Center (CMC), Osaka University, is a research institution that offers knowledge and technology resources obtained from advanced research in the areas of large-scale computation, information and communication, multimedia content, and education. Currently, CMC is involved in Japanese national Grid projects such as JGN II (Japan Gigabit Network), NAREGI, and BioGrid. Not limited to Japan, CMC also actively takes part in international activities such as PRAGMA. In these projects and international collaborations, CMC has developed a Grid system that allows scientists to perform their analyses by remote-controlling the world's largest ultra-high-voltage electron microscope, located at Osaka University. In another undertaking, CMC has assumed a leadership role in BioGrid by sharing its experience and knowledge of system development for the field of biology. In this paper, we give an overview of the BioGrid project and introduce the progress of the Telescience unit, which collaborates with the Telescience Project led by the National Center for Microscopy and Imaging Research (NCMIR). Furthermore, CMC collaborates with seven computing centers in Japan, NAREGI, and the National Institute of Informatics to deploy a PKI-based authentication infrastructure. The current status of this project and future collaboration with Grid projects are also delineated.

  12. Astrolabe: Curating, Linking, and Computing Astronomy’s Dark Data

    NASA Astrophysics Data System (ADS)

    Heidorn, P. Bryan; Stahlman, Gretchen R.; Steffen, Julie

    2018-05-01

    Where appropriate repositories are not available to support all relevant astronomical data products, data can fall into darkness: unseen and unavailable for future reference and reuse. Some data in this category are legacy or old data, but newer data sets are also often uncurated and could remain dark. This paper provides a description of the design motivation and development of Astrolabe, a cyberinfrastructure project that addresses a set of community recommendations for locating and ensuring the long-term curation of dark or otherwise at-risk data and integrated computing. This paper also describes the outcomes of the series of community workshops that informed creation of Astrolabe. According to participants in these workshops, much astronomical dark data currently exist that are not curated elsewhere, as well as software that can only be executed by a few individuals and therefore becomes unusable because of changes in computing platforms. Astronomical research questions and challenges would be better addressed with integrated data and computational resources that fall outside the scope of existing observatory and space mission projects. As a solution, the design of the Astrolabe system is aimed at developing new resources for management of astronomical data. The project is based in CyVerse cyberinfrastructure technology and is a collaboration between the University of Arizona and the American Astronomical Society. Overall, the project aims to support open access to research data by leveraging existing cyberinfrastructure resources and promoting scientific discovery by making potentially useful data available to the astronomical community, in a computable format.

  13. ORCA Project: Research on high-performance parallel computer programming environments. Final report, 1 Apr-31 Mar 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, L.; Notkin, D.; Adams, L.

    1990-03-31

    This task relates to research on programming massively parallel computers. Previous work on the Ensemble concept of programming was extended, and an investigation into nonshared-memory models of parallel computation was undertaken. The Ensemble concept defines a set of programming abstractions that organize the programming task into three distinct levels: composition of machine instructions, composition of processes, and composition of phases. It had previously been applied to shared-memory models of computation. During the present research period, these concepts were extended to nonshared-memory models, one Ph.D. thesis was completed, and one book chapter and six conference proceedings were published.

  14. Management of CAD/CAM information: Key to improved manufacturing productivity

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Brainin, J.

    1984-01-01

    A key element of improved industry productivity is effective management of CAD/CAM information. To stimulate advancements in this area, a joint NASA/Navy/industry project designated Integrated Programs for Aerospace-Vehicle Design (IPAD) is underway with the goal of raising aerospace industry productivity through advancement of technology to integrate and manage information involved in the design and manufacturing process. The project complements traditional NASA/DOD research to develop aerospace design technology and the Air Force's Integrated Computer-Aided Manufacturing (ICAM) program to advance CAM technology. IPAD research is guided by an Industry Technical Advisory Board (ITAB) composed of over 100 representatives from aerospace and computer companies.

  15. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. In addition to achieving its goal of collecting and characterizing computing requirements, the workshop revealed several key points. The chief findings: (1) science teams need access to a significant increase in computational resources to meet their research goals; (2) research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) science teams need guidance and support to implement their codes on future architectures; and (4) projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case-study format, summarizing their science goals, methods of solution, current and three-to-five-year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section describing efforts already underway or planned at NERSC that address the requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  16. Perceptual factors that influence use of computer enhanced visual displays

    NASA Technical Reports Server (NTRS)

    Littman, David; Boehm-Davis, Debbie

    1993-01-01

    This document is the final report for the NASA/Langley contract entitled 'Perceptual Factors that Influence Use of Computer Enhanced Visual Displays.' The document consists of two parts. The first part contains a discussion of the problem to which the grant was addressed, a brief discussion of work performed under the grant, and several issues suggested for follow-on work. The second part, presented as Appendix I, contains the annual report produced by Dr. Ann Fulop, the Postdoctoral Research Associate who worked on-site in this project. The main focus of this project was to investigate perceptual factors that might affect a pilot's ability to use computer generated information that is projected into the same visual space that contains information about real world objects. For example, computer generated visual information can identify the type of an attacking aircraft, or its likely trajectory. Such computer generated information must not be so bright that it adversely affects a pilot's ability to perceive other potential threats in the same volume of space. Or, perceptual attributes of computer generated and real display components should not contradict each other in ways that lead to problems of accommodation and, thus, distance judgments. The purpose of the research carried out under this contract was to begin to explore the perceptual factors that contribute to effective use of these displays.

  17. Bioinformatics for Exploration

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy A.

    2006-01-01

    For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.

  18. Improving brain computer interface research through user involvement - The transformative potential of integrating civil society organisations in research projects

    PubMed Central

    Wakunuma, Kutoma; Rainey, Stephen; Hansen, Christian

    2017-01-01

    Research on Brain Computer Interfaces (BCI) often aims to provide solutions for vulnerable populations, such as individuals with diseases, conditions or disabilities that keep them from using traditional interfaces. Such research thereby contributes to the public good. This contribution to the public good corresponds to a broader drive of research and funding policy that focuses on promoting beneficial societal impact. One way of achieving this is to engage with the public. In practical terms this can be done by integrating civil society organisations (CSOs) in research. The open question at the heart of this paper is whether and how such CSO integration can transform the research and contribute to the public good. To answer this question the paper describes five detailed qualitative case studies of research projects including CSOs. The paper finds that transformative impact of CSO integration is possible but by no means assured. It provides recommendations on how transformative impact can be promoted. PMID:28207882

  19. Applied Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  20. Evaluation of data requirements for computerized constructability analysis of pavement rehabilitation projects.

    DOT National Transportation Integrated Search

    2013-08-01

    This research aimed to evaluate the data requirements for computer assisted construction planning : and staging methods that can be implemented in pavement rehabilitation projects in the state of : Georgia. Results showed that two main issues for the...

  1. New Insights into Teaching Apparel Design.

    ERIC Educational Resources Information Center

    Capjack, Linda

    1993-01-01

    Describes projects intended to integrate competitive strategies, develop research skills, increase problem-solving ability, and foster a closer link with the apparel industry: the design of children's wear using computer-aided design technology and a project using the Functional Design Process. (Author/JOW)

  2. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  3. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  4. Computational fluid dynamics at NASA Ames and the numerical aerodynamic simulation program

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.

    1985-01-01

    Computers are playing an increasingly important role in the field of aerodynamics, such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. The four main areas of computational aerodynamics research at NASA Ames Research Center which are directed toward extending the state of the art are identified and discussed. Example results obtained from approximate forms of the governing equations are presented and discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to programs of practical importance. Finally, the Numerical Aerodynamic Simulation Program--with its 1988 target of achieving a sustained computational rate of 1 billion floating-point operations per second--is discussed in terms of its goals, status, and its projected effect on the future of computational aerodynamics.

  5. The Naval Postgraduate School SECURE ARCHIVAL STORAGE SYSTEM. Part II. Segment and Process Management Implementation.

    DTIC Science & Technology

    1981-03-01

    By Lyle A. Cox, Roger R. Schell, and Sonja L. Perdue; released by William M. Tolles, Dean of Research. Subject terms: computer networks, operating systems, computer security. [The remainder of the record is illegible OCR of the report documentation page.]

  6. Applications of hybrid and digital computation methods in aerospace-related sciences and engineering. [problem solving methods at the University of Houston

    NASA Technical Reports Server (NTRS)

    Huang, C. J.; Motard, R. L.

    1978-01-01

    The computing equipment in the engineering systems simulation laboratory of the University of Houston Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research in psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. The research supports programs in acoustics, energy technology, systems engineering, and environmental management as well as aerospace engineering.

  7. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes toward the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.

  8. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  9. Investigation of chemically-reacting supersonic internal flows

    NASA Technical Reports Server (NTRS)

    Chitsomboon, T.; Tiwari, S. N.

    1985-01-01

    This report covers work done on the research project Analysis and Computation of Internal Flow Field in a Scramjet Engine. The work is supported by the NASA Langley Research Center (Computational Methods Branch of the High-Speed Aerodynamics Division) through research grant NAG1-423. The governing equations of two-dimensional chemically-reacting flows are presented together with the global two-step chemistry model. The finite-difference algorithm used is illustrated and the method of circumventing the stiffness is discussed. The computer program developed is used to solve two model problems of a premixed chemically-reacting flow. The results obtained are physically reasonable.

  10. Through Kazan ASPERA to Modern Projects

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha

    The European Union is now forming the Sixth Framework Programme, one of whose objectives is opening up national research and training programmes. Russian PhD students and young astronomers face administrative and financial difficulties in accessing modern databases and astronomical projects, and so have not been included in the European overview of priorities. Modern requirements for the organization of observation projects on powerful telescopes assume painstaking computational preparation of the application; stiff competition for observing time requires preliminary computer modeling of the target object if an application is to succeed. Kazan AstroGeoPhysics Partnership

  11. All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation

    DTIC Science & Technology

    2014-04-01

    All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation... Prepared by: George Giroux, Computer Applications Specialist, Modis, 155 Queen Street, Suite 1206, Ottawa, ON K1P 6L1. Contract # THS 2335474-2. Project ... Under a Canadian Safety and Security Program (CSSP) targeted investigation (TI) project (CSSP-2012-TI-1108), Defence Research and Development

  12. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one in a master's degree program and two in doctoral degree programs).

  13. Reassessing the English Course Offered to Computer Engineering Students at the National School of Applied Sciences of Al-Hoceima in Morocco: An Action Research Project

    ERIC Educational Resources Information Center

    Dahbi, M.

    2015-01-01

    In computer engineering education, specific English language practices are needed to enable computer engineering students to succeed in professional settings. This study was conducted for two purposes. First, it aimed at investigating to what extent the English courses offered to computer engineering students at the National School of Applied…

  14. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  15. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    The results are summarised of a project aimed at the design and implementation of computer languages to aid in expressing problem solving procedures in several areas of artificial intelligence including automatic programming, theorem proving, and robot planning. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  16. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Nowadays Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project's objective is to popularize the use of this technology in the atmospheric sciences. To this end, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs, which should simplify access to the DCIs for researchers and free them from the technical and computational aspects of using these DCIs.
Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis, CMIP5 models, or seasonal simulations. WRF4G is being used to run WRF simulations that are contributing to the CORDEX initiative and other projects such as SPECS and EUPORIAS. This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864).

  17. Research into Queueing Network Theory.

    DTIC Science & Technology

    1977-09-01

    and Zeigler, B. (1975) "Equilibrium properties of arbitrarily interconnected queueing networks," Tech. Report 75-4, Computer and Communication...Associate. The project was extremely fortunate to secure the services of Dr. Wendel. Dr. Wendel was a project member for one month in the summer of

  18. MU-SPIN Project Update

    NASA Technical Reports Server (NTRS)

    Harrington, James L., Jr.

    2000-01-01

    The Minority University-Space Interdisciplinary Network (MU-SPIN) project is a comprehensive outreach and education initiative that focuses on the transfer of advanced computer networking technologies and relevant science to Historically Black Colleges and Universities (HBCUs) and Other Minority Universities (OMUs) for supporting multi-disciplinary education and research.

  19. SELDI PROTEINCHIP-BASED LIVER BIOMARKERS IN FUNGICIDE EXPOSED ZEBRAFISH

    EPA Science Inventory

    The research presented here is part of a three-phased small fish computational toxicology project using a combination of 1) whole organism endpoints, 2) genomic, proteomic, and metabolomic approaches, and 3) computational modeling to (a) identify new molecular biomarkers of expos...

  20. National Educational Computing Conference Proceedings (9th, Dallas, Texas, June 15-17, 1988).

    ERIC Educational Resources Information Center

    Ryan, William C., Ed.

    The more than 200 papers and panel, project, and special session reports represented in this collection focus on innovations, trends, and research on the use of computers in a variety of educational settings. Of these, the full text is provided for 37 presentations and abstracts for 182. The topics discussed include: computer applications in…

  1. Computer Literacy for UK Shipping Management Ashore and Afloat. A Summary. FEU/PICKUP Project Report.

    ERIC Educational Resources Information Center

    Moreby, D. H.

    A study assessed the need of various levels of management in the shipping industry of the United Kingdom for computer literacy training. During the study, researchers interviewed managers in eight shipping companies identified as using computers, spoke with managers and consultants from five companies actively engaged in designing and installing…

  2. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud computing is being projected by the major cloud service providers, such as IBM, Google, Yahoo, Amazon and others, as the fifth utility, where clients have access to processing for applications and software projects that need very high processing speed for compute-intensive work and huge data capacity, for scientific and engineering research problems as well as e-business and data content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high-bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. It also examines the vital aspects of the security risks identified by IT industry experts and cloud clients, and highlights the cloud providers' responses to those risks.

  3. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    PubMed

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers in a way that makes them mobile and easily transported by the users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the cluster's performance and energy-efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.
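    The performance and energy-efficiency comparison described in this abstract reduces to two numbers per configuration: the GFLOPS rate reported by High Performance Linpack (HPL) and that rate divided by measured power draw. A minimal sketch follows, assuming the conventional HPL operation count of (2/3)n^3 + 2n^2 flops for problem size n; the problem size, runtime and power figures are illustrative placeholders, not results from the paper.

    ```python
    def hpl_gflops(n, seconds):
        """Performance for an HPL run of problem size n completing in `seconds`.

        Uses the operation count HPL conventionally reports,
        (2/3)*n^3 + 2*n^2 floating-point operations.
        """
        flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
        return flops / seconds / 1e9

    def gflops_per_watt(gflops, watts):
        """Energy-efficiency metric used to compare cluster configurations."""
        return gflops / watts

    # Hypothetical numbers for a small single-board-computer cluster:
    # a 10,000x10,000 problem solved in 20 minutes at 40 W total draw.
    perf = hpl_gflops(n=10_000, seconds=1200.0)
    print(round(perf, 3), "GFLOPS")                       # ~0.556 GFLOPS
    print(round(gflops_per_watt(perf, 40.0), 4), "GFLOPS/W")
    ```

    Repeating the measurement with different node counts and memory fractions, as the paper does, then shows where the efficiency sweet spot of such a cluster lies.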

  4. Inventory of Federal energy-related environment and safety research for FY 1979. Volume II. Project listings and indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This volume contains summaries of FY 1979 government-sponsored environment and safety research related to energy arranged by log number, which groups the projects by reporting agency. The log number is a unique number assigned to each project from a block of numbers set aside for each contributing agency. Information elements included in the summary listings are project title, principal investigators, research organization, project number, contract number, supporting organization, funding level, related energy sources with numbers indicating percentages of effort devoted to each, and R and D categories. A brief description of each project is given, and this is followed by subject index terms that were assigned for computer searching and for generating the printed subject index in the back of this volume.

  5. Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments

    NASA Technical Reports Server (NTRS)

    Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.

    2001-01-01

    Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.

  6. Improved dense trajectories for action recognition based on random projection and Fisher vectors

    NASA Astrophysics Data System (ADS)

    Ai, Shihui; Lu, Tongwei; Xiong, Yudian

    2018-03-01

    As an important application of intelligent monitoring systems, action recognition in video has become a very important research area of computer vision. In order to improve the accuracy of action recognition in video with improved dense trajectories, an advanced encoding method is introduced that combines Fisher vectors with random projection. The method reduces the trajectory features by projecting the high-dimensional trajectory descriptors into a low-dimensional subspace via random projection, after defining and analyzing a Gaussian mixture model (GMM). A GMM-FV hybrid model is introduced to encode the trajectory feature vectors and reduce their dimension; the computational complexity is lowered because random projection shrinks the Fisher coding vectors. Finally, a linear SVM classifier is used to predict labels. We tested the algorithm on the UCF101 and KTH datasets. Compared with some existing algorithms, the results showed that the method not only reduces the computational complexity but also improves the accuracy of action recognition.
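    The pipeline sketched in this abstract (random projection of high-dimensional trajectory descriptors, followed by Fisher-vector encoding against a diagonal-covariance GMM) can be illustrated as below. All sizes, the mixture parameters and the data are placeholders, not values from the paper; a real system would train the GMM on a large descriptor corpus and feed the resulting vectors to a linear SVM.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical trajectory descriptors for one video clip (sizes illustrative).
    T, D_HIGH, D_LOW, K = 200, 128, 32, 4
    descriptors = rng.normal(size=(T, D_HIGH))

    # 1) Random projection: a Gaussian matrix scaled by 1/sqrt(d) approximately
    #    preserves pairwise distances (Johnson-Lindenstrauss lemma).
    R = rng.normal(size=(D_HIGH, D_LOW)) / np.sqrt(D_LOW)
    low = descriptors @ R                                  # (T, D_LOW)

    # 2) Fisher-vector encoding against a diagonal-covariance GMM.
    #    These parameters stand in for a mixture trained offline.
    weights = np.full(K, 1.0 / K)
    means = rng.normal(size=(K, D_LOW))
    sigmas = np.ones((K, D_LOW))

    def fisher_vector(x, w, mu, sigma):
        """Gradients of the GMM log-likelihood w.r.t. means and variances."""
        diff = x[:, None, :] - mu[None, :, :]              # (T, K, D)
        log_p = -0.5 * np.sum((diff / sigma) ** 2
                              + np.log(2 * np.pi * sigma ** 2), axis=2)
        log_p += np.log(w)
        log_p -= log_p.max(axis=1, keepdims=True)          # numerical stability
        gamma = np.exp(log_p)
        gamma /= gamma.sum(axis=1, keepdims=True)          # responsibilities

        t = x.shape[0]
        g_mu = (gamma[:, :, None] * diff / sigma).sum(0) / (t * np.sqrt(w)[:, None])
        g_sig = (gamma[:, :, None] * ((diff / sigma) ** 2 - 1)).sum(0) \
                / (t * np.sqrt(2 * w)[:, None])
        fv = np.concatenate([g_mu.ravel(), g_sig.ravel()])
        # Power- and L2-normalization, standard for Fisher vectors.
        fv = np.sign(fv) * np.sqrt(np.abs(fv))
        return fv / np.linalg.norm(fv)

    fv = fisher_vector(low, weights, means, sigmas)
    print(low.shape, fv.shape)   # clip encoded as one fixed-length vector
    ```

    The resulting 2*K*D_LOW-dimensional vector per clip would then be classified with a linear SVM; projecting before encoding is what keeps that vector (and hence the SVM) small.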

  7. Heterogeneous Distributed Computing for Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  8. The Quality in Quantity - Enhancing Text-based Research -

    NASA Astrophysics Data System (ADS)

    Harms, Patrick; Smith, Kathleen; Aschenbrenner, Andreas; Pempe, Wolfgang; Hedges, Mark; Roberts, Angus; Ács, Bernie; Blanke, Tobias

    Computers are becoming more and more a tool for researchers in the humanities. There are already several projects which aim to implement environments and infrastructures to support research. However, they either address qualitative or quantitative research methods, and there has been less work considering support for both methodologies in one environment. This paper analyzes the difference between qualitative and quantitative research in the humanities, outlines some examples and respective projects, and states why the support for both methodologies needs to be combined and how it might be used to form an integrated research infrastructure for the humanities.

  9. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  10. From the genetic to the computer program: the historicity of 'data' and 'computation' in the investigations on the nematode worm C. elegans (1963-1998).

    PubMed

    García-Sancho, Miguel

    2012-03-01

    This paper argues that the history of the computer, of the practice of computation and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers, which transformed the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology (in-house and easily programmable minicomputers) which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Development of the virtual research environment for analysis, evaluation and prediction of global climate change impacts on the regional environment

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander

    2017-04-01

    A description and the first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at the development of an Internet-accessible computation and information environment providing specialists unskilled in numerical modelling and software design, decision-makers and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and instruments for detailed analysis, assessment and prediction of the impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the leading Russian institution involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The project includes several major directions of research listed below. 1. Preparation of geo-referenced data sets describing the dynamics of current and possible future climate and environmental changes in detail. 2. Improvement of methods of analysis of climate change. 3. Enhancing the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of an applicable detailed description of climate change in Western Siberia, and dissemination of the project results. Results of the first stage of the project implementation are presented. This work is supported by Russian Science Foundation grant No. 16-19-10257.

  12. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project), funded by the United States Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and the Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project successfully created a suite of software tools that a scientific community can use to automatically document its scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback on the toolkit, and on the value of such automatic workflow documentation to the scientific endeavor, was very positive.
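    The core idea behind such automatic workflow documentation can be illustrated with a short sketch in which each analysis step records its own metadata as a side effect of running. This is a minimal, hypothetical illustration (all names here are invented), not the MPO toolkit's actual API:

```python
import time
import uuid

# Hypothetical in-memory provenance store; a real system would persist
# entries to a catalog service.
provenance_log = []

def record_step(func):
    """Decorator that records each workflow step's name, inputs, and timing."""
    def wrapper(*args, **kwargs):
        entry = {
            "id": str(uuid.uuid4()),
            "step": func.__name__,
            "inputs": repr((args, kwargs)),
            "started": time.time(),
        }
        result = func(*args, **kwargs)
        entry["finished"] = time.time()
        provenance_log.append(entry)
        return result
    return wrapper

@record_step
def calibrate(raw):
    # Toy calibration step: scale the raw measurements.
    return [x * 2 for x in raw]

@record_step
def analyze(data):
    # Toy analysis step: aggregate the calibrated data.
    return sum(data)

total = analyze(calibrate([1, 2, 3]))
```

Because the documentation happens inside the decorator, the scientist's workflow code stays unchanged while every step leaves a queryable record.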

  13. Learning "Hands On."

    ERIC Educational Resources Information Center

    Ritter, Janice T.

    2001-01-01

    Discusses a computer teacher's incorporation of hand-held computer technology into her third- and fifth-grade students' study of acid rain. The project successfully brought two grade levels together for cross-grade research, provided an opportunity for classroom teachers and technology specialists to work collaboratively, and enhanced students'…

  14. Using Pedagogical Tools to Help Hispanics be Successful in Computer Science

    NASA Astrophysics Data System (ADS)

    Irish, Rodger

    Irish, Rodger, Using Pedagogical Tools to Help Hispanics Be Successful in Computer Science. Master of Science (MS), July 2017, 68 pp., 4 tables, 2 figures, references 48 titles. Computer science (CS) jobs are a growing field and pay a living wage, but Hispanics are underrepresented in the field. This project gives an overview of several factors contributing to this problem. It then explores some possible solutions and how a combination of tools (teaching methods) can create the best possible outcome. It is my belief that this approach can help Hispanic students succeed and fill needed jobs in the CS field. The project then tests this hypothesis. I discuss the tools used to measure progress in both the affective and the cognitive domains. I show how the decision to run a Computer Club was reached and present the results of the research. The conclusion summarizes the results and describes future research that remains to be done.

  15. Aerodynamic Characterization of a Modern Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Holland, Scott D.; Blevins, John A.

    2011-01-01

    A modern launch vehicle is by necessity an extremely integrated design. The accurate characterization of its aerodynamic characteristics is essential to determine design loads, to design flight control laws, and to establish performance. The NASA Ares Aerodynamics Panel has been responsible for technical planning, execution, and vetting of the aerodynamic characterization of the Ares I vehicle. An aerodynamics team supporting the Panel consists of wind tunnel engineers, computational engineers, database engineers, and other analysts that address topics such as uncertainty quantification. The team resides at three NASA centers: Langley Research Center, Marshall Space Flight Center, and Ames Research Center. The Panel has developed strategies to synergistically combine both the wind tunnel efforts and the computational efforts with the goal of validating the computations. Selected examples highlight key flow physics and, where possible, the fidelity of the comparisons between wind tunnel results and the computations. Lessons learned summarize what has been gleaned during the project and can be useful for other vehicle development projects.

  16. Computational methods in drug discovery

    PubMed Central

    Leelananda, Sumudu P

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341

  17. Computational methods in drug discovery.

    PubMed

    Leelananda, Sumudu P; Lindert, Steffen

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  18. Managing the technological edge: the UNESCO International Computation Centre and the limits to the transfer of computer technology, 1946-61.

    PubMed

    Nofre, David

    2014-07-01

    The spread of the modern computer is often assumed to have been a smooth process of technology transfer. This view relies on an assessment of the open circulation of knowledge ensured by the US and British governments in the early post-war years. This article presents new historical evidence that questions this view. At the centre of the article lies the ill-fated establishment of the UNESCO International Computation Centre. The project was initially conceived in 1946 to provide advanced computation capabilities to scientists of all nations. It soon became a prize sought by Western European countries such as the Netherlands and Italy, which were seeking to speed up their own national research programs. Nonetheless, as the article explains, the US government's limitations on the research function of the future centre resulted in the withdrawal of European support for the project. These limitations illustrate the extent to which US foreign science policy could operate as (stealth) industrial policy to secure a competitive technological advantage and the prospects of US manufacturers in a future European market.

  19. Louisiana: a model for advancing regional e-Research through cyberinfrastructure.

    PubMed

    Katz, Daniel S; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott

    2009-06-28

    Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date.

  20. European Scientific Notes. Volume 35, Number 5,

    DTIC Science & Technology

    1981-05-31

    ESN 35-5 (1981): the computer science coverage reports on a Computer Science Department project on computer graphics for purposes of teaching and research, and on text processing by computer; the energy coverage notes that studies of sea-water batteries, alkaline batteries, lead-acid systems, metal/air batteries, offshore winds, and lighter support structures will be carried out.

  1. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC verification and validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high-consequence decision-making.

  2. HPC AND GRID COMPUTING FOR INTEGRATIVE BIOMEDICAL RESEARCH

    PubMed Central

    Kurc, Tahsin; Hastings, Shannon; Kumar, Vijay; Langella, Stephen; Sharma, Ashish; Pan, Tony; Oster, Scott; Ervin, David; Permar, Justin; Narayanan, Sivaramakrishnan; Gil, Yolanda; Deelman, Ewa; Hall, Mary; Saltz, Joel

    2010-01-01

    Integrative biomedical research projects query, analyze, and integrate many different data types and make use of datasets obtained from measurements or simulations of structure and function at multiple biological scales. With the increasing availability of high-throughput and high-resolution instruments, integrative biomedical research imposes many challenging requirements on software middleware systems. In this paper, we examine some of these requirements using example research pattern templates. We then discuss how middleware systems that incorporate Grid and high-performance computing could be employed to address the requirements. PMID:20107625

  3. ENVIRONMENTAL BIOINFORMATICS AND COMPUTATIONAL TOXICOLOGY CENTER

    EPA Science Inventory

    The Center activities focused on integrating developmental efforts from the various research projects of the Center, and collaborative applications involving scientists from other institutions and EPA, to enhance research in critical areas. A representative sample of specif...

  4. Research and technology, 1984 report

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research and technology projects in the following areas are described: cryogenic engineering, hypergolic engineering, hazardous warning instrumentation, structures and mechanics, sensors and controls, computer sciences, communications, material analysis, biomedicine, meteorology, engineering management, logistics, training and maintenance aids, and technology applications.

  5. A summary of the research program in the broad field of electronics

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Summary reports of research projects covering solid state materials, semiconductors and devices, quantum electronics, plasmas, applied electromagnetics, electrical engineering systems to include control communication, computer and power systems, biomedical engineering and mathematical biosciences.

  6. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy's Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy's supercomputing resources and the Internet. Our project's success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have influenced data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.

  7. Security Approaches in Using Tablet Computers for Primary Data Collection in Clinical Research

    PubMed Central

    Wilcox, Adam B.; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project. PMID:25848559

  8. Security approaches in using tablet computers for primary data collection in clinical research.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  9. Stumping for the Project

    NASA Technical Reports Server (NTRS)

    Ginty, Carol

    2003-01-01

    Advocating research is a little trickier than selling other projects at NASA. You can point to a satellite. You can point to a rocket; you can see the Shuttle and the International Space Station. But it's different on the research side. How do you display computational fluid dynamics? How do you get someone to understand the value of composite materials, or of nanotubes that they can't even see without a microscope?

  10. Executable research compendia in geoscience research infrastructures

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel

    2017-04-01

    From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field of "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. the European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment.
ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and connect key stakeholders (scientists, publishers, librarians). They are demonstrated using developments by the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References: [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds.), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf
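    The "Coherence" integration described above, in which services validate an ERC's completeness, can be sketched as a simple check over a compendium's expected parts. The directory layout and file names below are hypothetical illustrations for this sketch, not the o2r project's actual ERC specification:

```python
import os
import tempfile

# Hypothetical required parts of a research compendium: metadata, the
# analysis script, input data, and a runtime environment specification.
REQUIRED_PARTS = ["metadata.json", "analysis.R", "data.csv", "runtime.txt"]

def check_completeness(compendium_dir):
    """Return the list of required parts missing from the compendium."""
    return [part for part in REQUIRED_PARTS
            if not os.path.exists(os.path.join(compendium_dir, part))]

# Build a toy compendium that is missing its runtime specification.
erc = tempfile.mkdtemp()
for part in ["metadata.json", "analysis.R", "data.csv"]:
    open(os.path.join(erc, part), "w").close()

missing = check_completeness(erc)
```

A real validation service would go further and also verify integrity (checksums) and results (re-execution), but the pattern is the same: the compendium is rejected or flagged until every declared part is present.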

  11. Computer Architecture for Energy Efficient SFQ

    DTIC Science & Technology

    2014-08-27

    IBM Corporation (T.J. Watson Research Laboratory), 1101 Kitchawan Road, Yorktown Heights, NY 10598. ABSTRACT: ...accomplished during this ARO-sponsored project at IBM Research to identify and model an energy-efficient SFQ-based computer architecture. The... IBM Windsor Blue (WB), illustrated schematically in Figure 2. The basic building block of WB is a "tile" comprised of a 64-bit arithmetic logic unit

  12. Bringing Computational Thinking into the High School Science and Math Classroom

    NASA Astrophysics Data System (ADS)

    Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; University CT-STEM Project, Northwestern; University CenterTalent Development, Northwestern

    2013-01-01

    Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.

  13. 1999 LDRD Laboratory Directed Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rita Spencer; Kyle Wheeler

    This is the FY 1999 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  14. Laboratory Directed Research and Development FY 1998 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Vigil; Kyle Wheeler

    This is the FY 1998 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  15. Laboratory directed research and development: FY 1997 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1998-05-01

    This is the FY 1997 Progress Report for the Laboratory Directed Research and Development (LDRD) program at Los Alamos National Laboratory. It gives an overview of the LDRD program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic and molecular physics and plasmas, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  16. Coherent Lidar Activities at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.; Amzajerdian, Farzin; Koch, Grady J.; Singh, Upendra N.; Yu, Jirong

    2007-01-01

    NASA Langley Research Center has been developing and using coherent lidar systems for many years. The current projects at LaRC are the Global Wind Observing Sounder (GWOS) mission preparation, the Laser Risk Reduction Program (LRRP), the Instrument Incubator Program (IIP) compact, rugged Doppler wind lidar project, the Autonomous precision Landing and Hazard detection and Avoidance Technology (ALHAT) project for lunar landing, and the Skywalker project to find and use thermals to extend UAV flight time. These five projects encompass coherent lidar technology development; characterization, validation, and calibration facilities; compact, rugged packaging; computer simulation; trade studies; data acquisition, processing, and display development; system demonstration; and space mission design. This paper will further discuss these activities at LaRC.

  17. Final Technical Report for Years 1-4 of the Early Career Research Project "Viscosity and equation of state of hot and dense QCD matter" - ARRA portion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molnar, Denes

    2014-04-14

    The sections below summarize research activities and achievements during the first four years of the PI's Early Career Research Project (ECRP). Two main areas have been advanced: i) 3 ↔ 2 radiative transport, via development of a new computer code, MPC/Grid, that solves the Boltzmann transport equation in full 6+1D (3X+3V+time) on both single-CPU and parallel computers; ii) development of a self-consistent framework to convert viscous fluids to particles, and application of this framework to relativistic heavy-ion collisions, in particular the determination of the shear viscosity. Year 5 of the ECRP is under a separate award number, and therefore has its own report document, 'Final Technical Report for Year 5 of the Early Career Research Project "Viscosity and equation of state of hot and dense QCD matter"' (award DE-SC0008028). The PI's group was also part of the DOE JET Topical Collaboration, a multi-institution project that overlapped significantly in time with the ECRP. Purdue achievements as part of the JET Topical Collaboration are described in a separate report, "Final Technical Report summarizing Purdue research activities as part of the DOE JET Topical Collaboration" (award DE-SC0004077).

  18. Computer Assisted Virtual Environment - CAVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Phillip; Podgorney, Robert; Weingartner,

    Research at the Center for Advanced Energy Studies is taking on another dimension with a 3-D device known as a Computer Assisted Virtual Environment. The CAVE uses projection to display high-end computer graphics on three walls and the floor. By wearing 3-D glasses to create depth perception and holding a wand to move and rotate images, users can delve into data.

  19. Computer Assisted Virtual Environment - CAVE

    ScienceCinema

    Erickson, Phillip; Podgorney, Robert; Weingartner,

    2018-05-30

    Research at the Center for Advanced Energy Studies is taking on another dimension with a 3-D device known as a Computer Assisted Virtual Environment. The CAVE uses projection to display high-end computer graphics on three walls and the floor. By wearing 3-D glasses to create depth perception and holding a wand to move and rotate images, users can delve into data.

  20. Introducing Computer Education into an Early Elementary Curriculum.

    ERIC Educational Resources Information Center

    Jaworski, Anne Porter; Brummel, Brenda

    In addition to reviewing the literature on the pros and cons of computer use in the schools, this document reports the results of a research project in which 13 pairs of first graders learned to use the LOGO computer language over a 10-week period as part of their classroom activities. The first two chapters discuss the overall question of…

  1. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  2. CAMCE: An Environment to Support Multimedia Courseware Projects.

    ERIC Educational Resources Information Center

    Barrese, R. M.; And Others

    1992-01-01

    Presents results of CAMCE (Computer-Aided Multimedia Courseware Engineering) project research concerned with definition of a methodology to describe a systematic approach for multimedia courseware development. Discussion covers the CAMCE methodology, requirements of an advanced authoring environment, use of an object-based model in the CAMCE…

  3. Bioinformatics by Example: From Sequence to Target

    NASA Astrophysics Data System (ADS)

    Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj

    2002-12-01

    With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The presence of these additional bioinformatics tools stirs great anxiety among experimental researchers (as well as pedagogues), since they are now faced with a wider and deeper knowledge base spanning differing disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets individuals who are interested in using computational methods in their teaching or research. By analyzing a real-life, pharmaceutical, multicomponent, target-based example, the reader will experience this fascinating new discipline.

  4. Particle-Based Simulations of Microscopic Thermal Properties of Confined Systems

    DTIC Science & Technology

    2014-11-01

    Figure caption fragment: velocity versus electric field in gallium arsenide (GaAs) computed with the original CMC table structure (squares) at temperature T = 150 K, and the new... Keywords: computer-aided design; Cellular Monte Carlo; Ensemble Monte Carlo; gallium arsenide; Heat Transport Equation; DARPA (Defense Advanced Research Projects Agency).

  5. Ballistic Deflection Transistors for THz Amplification

    DTIC Science & Technology

    2016-05-09

    Personnel list fragment: Rabi Sherstha (Electrical and Computer Engineering); Grahan Jensen; Fei Song. Rabi Sherstha spent entire summers working on ARO-related projects under the supervision of Prof. Sobolewski. They were supported by the UR Undergraduate Research Discover Program.

  6. Cloud Infrastructure & Applications - CloudIA

    NASA Astrophysics Data System (ADS)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and to make it available to small- and medium-sized enterprises, the Hochschule Furtwangen University established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.

  7. Computational mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  8. Chemical research projects office: An overview and bibliography, 1975-1980

    NASA Technical Reports Server (NTRS)

    Kourtides, D. A.; Heimbuch, A. H.; Parker, J. A.

    1980-01-01

    The activities of the Chemical Research Projects Office at Ames Research Center, Moffett Field, California are reported. The office conducts basic and applied research in the fields of polymer chemistry, computational chemistry, polymer physics, and physical and organic chemistry. It works to identify the chemical research and technology required for solutions to problems of national urgency, synchronous with the aeronautic and space effort. It conducts interdisciplinary research on chemical problems, mainly in areas of macromolecular science and fire research. The office also acts as liaison with the engineering community and assures that relevant technology is made available to other NASA centers, agencies, and industry. Recent accomplishments are listed in this report. Activities of the three research groups, Polymer Research, Aircraft Operating and Safety, and Engineering Testing, are summarized. A complete bibliography which lists all Chemical Research Projects Office publications, contracts, grants, patents, and presentations from 1975 to 1980 is included.

  9. XPRESS: eXascale PRogramming Environment and System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brightwell, Ron; Sterling, Thomas; Koniges, Alice

    The XPRESS Project is one of four major projects of the DOE Office of Science Advanced Scientific Computing Research X-stack Program, initiated in September 2012. The purpose of XPRESS is to devise an innovative system software stack to enable practical and useful exascale computing around the end of the decade, with near-term contributions to efficient and scalable operation of trans-petaflops performance systems in the next two to three years, both for DOE mission-critical applications. To this end, XPRESS directly addresses the critical computing challenges of efficiency, scalability, and programmability through introspective methods of dynamic adaptive resource management and task scheduling.

  10. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate, and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validation of the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  11. Technologies for Large Data Management in Scientific Computing

    NASA Astrophysics Data System (ADS)

    Pace, Alberto

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data, and for associated analyses that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required by a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  12. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    NASA Astrophysics Data System (ADS)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises project and research activity of students in the areas of computer science, physics, engineering, and biology, based on experience acquired in these fields. His pupils regularly win competitions and conferences at various levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013 A. Makarychev received the "Small Nobel Prize" in the Computer Science section and a special sponsor award from the company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany, and Austria have resulted in patents for inventions and certificates of registration with ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering, and medicine.

  13. Inventory of Federal energy-related environment and safety research for FY 1978. Volume II. Project listings and indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1979-12-01

    This volume contains summaries of FY-1978 government-sponsored environment and safety research related to energy. Project summaries were collected by Aerospace Corporation under contract with the Department of Energy, Office of Program Coordination, under the Assistant Secretary for Environment. Summaries are arranged by log number, which groups the projects by reporting agency. The log number is a unique number assigned to each project from a block of numbers set aside for each agency. Information about the projects is included in the summary listings. This includes the project title, principal investigators, research organization, project number, contract number, supporting organization, funding level if known, related energy sources with numbers indicating percentages of effort devoted to each, and R and D categories. A brief description of each project is given, and this is followed by subject index terms that were assigned for computer searching and for generating the printed subject index in Volume IV.

  14. Programs for attracting under-represented minority students to graduate school and research careers in computational science. Final report for period October 1, 1995 - September 30, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, James C. Jr.; Mason, Thomas; Guerrieri, Bruno

    1997-10-01

    Programs have been established at Florida A & M University to attract minority students to research careers in mathematics and computational science. The primary goal of the program was to increase the number of such students studying computational science via an interactive multimedia learning environment. One mechanism used for meeting this goal was the development of educational modules. This academic-year program, established within the mathematics department at Florida A&M University, introduced students to computational science projects using high-performance computers. Additional activities were conducted during the summer; these included workshops, meetings, and lectures. Through the exposure this program provides to scientific ideas and research in computational science, it is likely that these students will go on to apply tools from this interdisciplinary field successfully.

  15. The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology

    ERIC Educational Resources Information Center

    Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo

    2014-01-01

    The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…

  16. Final Report for "Implimentation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel-processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on a fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.

  17. Computational modeling of Radioisotope Thermoelectric Generators (RTG) for interplanetary and deep space travel

    NASA Astrophysics Data System (ADS)

    Nejat, Cyrus; Nejat, Narsis; Nejat, Najmeh

    2014-06-01

    This research project is part of Narsis Nejat's Master of Science thesis, carried out at Shiraz University. The goal of this research is to build a computer model that evaluates the thermal power, electrical power, emitted/absorbed dose, and emitted/absorbed dose rate for static Radioisotope Thermoelectric Generators (RTGs). The work includes a comprehensive study of the types of RTG systems, in particular RTG fuels derived from both natural and artificial isotopes; calculation of the permissible dose for the radioisotopes selected from the above; and conceptual design modeling with comparison between several NASA-built RTGs and the project's computer model, pointing out the strengths and weaknesses of using this model in the nuclear industry for simulation. Heat is converted to electricity in RTGs by two major methods: static conversion and dynamic conversion. The model created for this project covers RTGs in which heat is converted to electricity statically. The model gives good approximations when compared with the SNAP-3, SNAP-19, MHW, and GPHS RTGs in terms of electrical power, efficiency, specific power, mission type, and the fuel mass required to accomplish the mission.
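    For a static RTG, the power balance such a model evaluates reduces to radioactive decay of the fuel's thermal inventory followed by thermoelectric conversion. A minimal sketch; the beginning-of-life power, half-life, and efficiency figures below are illustrative assumptions (roughly Pu-238-like), not values from the thesis model:

```python
import math

def thermal_power(p0_watts, t_years, half_life_years):
    """Thermal power of the fuel after t years of radioactive decay."""
    return p0_watts * math.exp(-math.log(2) * t_years / half_life_years)

def electrical_power(p_thermal, efficiency):
    """Static (thermoelectric) conversion: a fixed fraction of heat becomes electricity."""
    return p_thermal * efficiency

# Illustrative figures for a Pu-238-fueled generator (assumed, not from the thesis)
p0 = 650.0          # beginning-of-life thermal power, W
half_life = 87.7    # Pu-238 half-life, years
eta = 0.06          # ~6% thermoelectric conversion efficiency

for t in (0, 10, 30):
    p_th = thermal_power(p0, t, half_life)
    p_el = electrical_power(p_th, eta)
    print(f"t = {t:2d} yr: thermal {p_th:6.1f} W, electrical {p_el:5.1f} W")
```

At a few percent conversion efficiency, only a small fraction of the decay heat becomes electricity, which is why the required fuel mass is driven by the electrical power demanded at the end of the mission.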

  18. Geomechanical/Geochemical Modeling Studies Conducted within the International DECOVALEX Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Rutqvist, J.; Sonnenthal, E.L.

    2005-10-19

    The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary interactive and cooperative research on modeling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.

  19. Geomechanical/Geochemical Modeling Studies Conducted Within the International DECOVALEX Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.T. Birkholzer; J. Rutqvist; E.L. Sonnenthal

    2006-02-01

    The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary interactive and cooperative research on modeling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.

  20. Improving student retention in computer engineering technology

    NASA Astrophysics Data System (ADS)

    Pierozinski, Russell Ivan

    The purpose of this research project was to improve student retention in the Computer Engineering Technology program at the Northern Alberta Institute of Technology by reducing the number of dropouts and increasing the graduation rate. This action research project utilized a mixed methods approach of a survey and face-to-face interviews. The participants were male and female, with a large majority ranging from 18 to 21 years of age. The research found that participants recognized their skills and capability, but their capacity to remain in the program was dependent on understanding and meeting the demanding pace and rigour of the program. The participants recognized that curriculum delivery along with instructor-student interaction had an impact on student retention. To be successful in the program, students required support in four domains: academic, learning management, career, and social.

  1. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the boom in internet and information technology services. The bioinformatics sector of India has been transforming rapidly, creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospecting, and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, conferences, and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  2. Radio Synthesis Imaging - A High Performance Computing and Communications Project

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard M.

    The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. 
The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.

  3. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional, and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing, and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies, and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  4. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Pitsianis, N

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of a prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g., regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes.
The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub-kernels. Conclusion: Composable projection operators constitute a versatile research tool which can greatly accelerate iterative registration algorithms and may be conducive to the clinical applicability of LIVE. National Institutes of Health Grant No. R01-CA184173; GPU donation by NVIDIA Corporation.
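    The static/dynamic separation described in this abstract can be sketched as follows. This is a hypothetical toy in which the precomputed geometry is a random sparse matrix, not the LIVE system's actual ray-casting or FDK operators; the sizes and weights are illustrative assumptions:

```python
import numpy as np
from scipy.sparse import csr_matrix

# --- Static part: depends only on the projection geometry (treatment plan). ---
# Precompute a sparse projection matrix A once; row i holds the interpolation
# weights with which voxels contribute to detector pixel i.
rng = np.random.default_rng(0)
n_voxels, n_pixels = 1000, 200
rows = np.repeat(np.arange(n_pixels), 4)             # each pixel sees 4 voxels
cols = rng.integers(0, n_voxels, size=4 * n_pixels)  # hypothetical ray/voxel hits
weights = rng.random(4 * n_pixels)
A = csr_matrix((weights, (rows, cols)), shape=(n_pixels, n_voxels))

# --- Dynamic part: runs every registration iteration on new volume data. ---
def forward_project(volume):
    return A @ volume            # cheap sparse mat-vec per iteration

def back_project(residual):
    return A.T @ residual        # adjoint reuses the same precomputed geometry

volume = rng.random(n_voxels)
measured = forward_project(volume)
update = back_project(measured - forward_project(0.9 * volume))
```

Because the static part corresponds to ray/voxel intersection geometry fixed by the treatment plan, it can be computed once and reused across iterations (and, as the abstract notes, across patients), leaving only fast matrix-vector products in the iterative loop.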

  5. Navy Career Education Diffusion Project: State of Oregon. Final Report.

    ERIC Educational Resources Information Center

    McDermott, Michael M.

    The final report describes a project to research, develop, and field test Navy occupational information for inclusion into the Oregon Career Information System (CIS), a computer-assisted career education program. Five sections include: (1) introductory information; (2) a discussion of the preparation of Navy occupational information and reviewing…

  6. Aeroelasticity of wing and wing-body configurations on parallel computers

    NASA Technical Reports Server (NTRS)

    Byun, Chansup

    1995-01-01

    The objective of this research is to develop computationally efficient methods for solving aeroelasticity problems on parallel computers. Both uncoupled and coupled methods are studied in this research. For the uncoupled approach, the conventional U-g method is used to determine the flutter boundary. The generalized aerodynamic forces required are obtained by the pulse transfer-function analysis method. For the coupled approach, the fluid-structure interaction is obtained by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures. This capability will significantly impact many aerospace projects of national importance, such as the Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical in the transonic region. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.

  7. Computational modeling in the optimization of corrosion control to reduce lead in drinking water

    EPA Science Inventory

    An international “proof-of-concept” research project (UK, US, CA) will present its findings during this presentation. An established computational modeling system developed in the UK is being calibrated and validated in U.S. and Canadian case studies. It predicts LCR survey resul...

  8. Computer-Assisted Learning in Language Arts

    ERIC Educational Resources Information Center

    Serwer, Blanche L.; Stolurow, Lawrence M.

    1970-01-01

    A description of computer program segments in the feasibility and development phase of Operationally Relevant Activities for Children's Language Experience (Project ORACLE); original form of this paper was prepared by Serwer for presentation to annual meeting of New England Research Association (1st, Boston College, June 5-6, 1969). (Authors/RD)

  9. Commentary: New Technologies on the Horizon for Teaching

    ERIC Educational Resources Information Center

    Parslow, Graham R.

    2013-01-01

    A well-researched report has listed the technologies that should increasingly feature in teaching. It is projected that in the coming year there will be increased use of cloud computing, mobile applications, social exchanges, and tablet computing. The New Media Consortium (NMC) that produced the report is an international association of…

  10. Computers in Knowledge-Based Fields.

    ERIC Educational Resources Information Center

    Myers, Charles A.

    Last in a series of research projects on the implications of technological change and automation, this study is concerned with the use of computers in formal education and educational administration; in library systems and subsystems; in legal, legislative, and related services; in medical and hospital services; and in national and centralized…

  11. Laptop Computers in the Elementary Classroom: Authentic Instruction with At-Risk Students

    ERIC Educational Resources Information Center

    Kemker, Kate; Barron, Ann E.; Harmes, J. Christine

    2007-01-01

    This case study investigated the integration of laptop computers into an elementary classroom in a low socioeconomic status (SES) school. Specifically, the research examined classroom management techniques and aspects of authentic learning relative to the student projects and activities. A mixed methods approach included classroom observations,…

  12. CYCLOPS-3 System Research.

    ERIC Educational Resources Information Center

    Marill, Thomas; And Others

    The aim of the CYCLOPS Project research is the development of techniques for allowing computers to perform visual scene analysis, pre-processing of visual imagery, and perceptual learning. Work on scene analysis and learning has previously been described. The present report deals with research on pre-processing and with further work on scene…

  13. A Framework for Teaching Software Development Methods

    ERIC Educational Resources Information Center

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  14. Research Symposium I

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The proceedings of this symposium consist of abstracts of talks presented by interns at NASA Glenn Research Center (GRC). The interns assisted researchers at GRC in projects which primarily address the following topics: aircraft engines and propulsion, spacecraft propulsion, fuel cells, thin film photovoltaic cells, aerospace materials, computational fluid dynamics, aircraft icing, management, and computerized simulation.

  15. Why Things Are So Bad for the Computer-Naive User

    DTIC Science & Technology

    1975-03-01

    knowledge of human communication that we need. Many of the things that people do in communication, including the entire list indicated above, are not...making direct use of computers feasible and comfortable for broad classes of people, research in modeling human communication processes deserves a far...higher national priority than it currently has. There are a few active research projects that are building the right kinds of models of human

  16. Aircraft Engine Technology for Green Aviation to Reduce Fuel Burn

    NASA Technical Reports Server (NTRS)

    Hughes, Christopher E.; VanZante, Dale E.; Heidmann, James D.

    2013-01-01

    The NASA Fundamental Aeronautics Program Subsonic Fixed Wing Project and Integrated Systems Research Program Environmentally Responsible Aviation Project in the Aeronautics Research Mission Directorate are conducting research on advanced aircraft technology to address the environmental goals of reducing fuel burn, noise and NOx emissions for aircraft in 2020 and beyond. Both Projects, in collaborative partnerships with U.S. Industry, Academia, and other Government Agencies, have made significant progress toward reaching the N+2 (2020) and N+3 (beyond 2025) installed fuel burn goals by fundamental aircraft engine technology development, subscale component experimental investigations, full scale integrated systems validation testing, and development validation of state of the art computation design and analysis codes. Specific areas of propulsion technology research are discussed and progress to date.

  17. Embracing Statistical Challenges in the Information Technology Age

    DTIC Science & Technology

    2006-01-01

    computation and feature selection. Moreover, two research projects on network tomography and arctic cloud detection are used throughout the paper to bring...prominent Network Tomography problem, origin-destination (OD) traffic estimation. It demonstrates well how the two modes of data collection interact...software debugging (Liblit et al, 2005 [2]), and network tomography for computer network management. Computer system problems exist long before the IT

  18. Diamond High Assurance Security Program: Trusted Computing Exemplar

    DTIC Science & Technology

    2002-09-01

    computing component, the Embedded MicroKernel Prototype. A third-party evaluation of the component will be initiated during development (e.g., once...target technologies and larger projects is a topic for future research. Trusted Computing Reference Component – The Embedded MicroKernel Prototype We...Kernel The primary security function of the Embedded MicroKernel will be to enforce process and data-domain separation, while providing primitive

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hsien-Hsin S

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under the circumstances of improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) System-wide, unified introspection techniques for autonomic systems, (2) Secure information-flow microarchitecture, (3) Memory-centric security architecture, (4) Authentication control and its implications for security, (5) Digital rights management, (6) Microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  20. A decision-theoretic approach to the display of information for time-critical decisions: The Vista project

    NASA Technical Reports Server (NTRS)

    Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew

    1993-01-01

    We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.

  1. The EPOS ICT Architecture

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Bailo, Daniele

    2016-04-01

    The EPOS-PP Project 2010-2014 proposed an architecture and demonstrated feasibility with a prototype. Requirements based on use cases were collected and an inventory of assets (e.g. datasets, software, users, computing resources, equipment/detectors, laboratory services) (RIDE) was developed. The architecture evolved through three stages of refinement with much consultation, both with the EPOS community representing EPOS users and participants in geoscience, and with the overall ICT community, especially those working on research such as the RDA (Research Data Alliance) community. The architecture consists of a central ICS (Integrated Core Services) comprising a portal and catalog, the latter providing to end-users a 'map' of all EPOS resources (datasets, software, users, computing, equipment/detectors etc.). ICS is extended to ICS-d (distributed ICS) for certain services (such as visualisation software services or Cloud computing resources) and CES (Computational Earth Science) for specific simulation or analytical processing. ICS also communicates with TCS (Thematic Core Services), which represent European-wide portals to national and local assets, resources and services in the various specific domains (e.g. seismology, volcanology, geodesy) of EPOS. The EPOS-IP project 2015-2019 started in October 2015. Two work-packages cover the ICT aspects: WP6 involves interaction with the TCS, while WP7 concentrates on ICS, including interoperation with ICS-d and CES offerings; in short, the ICT architecture. Based on the experience and results of EPOS-PP, the ICT team held a pre-meeting in July 2015 and set out a project plan. The first major activity involved requirements (re-)collection with use cases, and also updating the inventory of assets held by the various TCS in EPOS. The RIDE database of assets is currently being converted to CERIF (Common European Research Information Format - an EU Recommendation to Member States) to provide the basis for the EPOS-IP ICS Catalog.
In parallel, the ICT team is tracking developments in ICT for relevance to EPOS-IP. In particular, the potential utilisation of e-Is (e-Infrastructures) such as GEANT (network), AARC (security), EGI (GRID computing), EUDAT (data curation), PRACE (High Performance Computing), and HELIX-Nebula / Open Science Cloud (Cloud computing) is being assessed. Similarly, relationships to other e-RIs (e-Research Infrastructures) such as ENVRI+, EXCELERATE and other ESFRI (European Strategic Forum for Research Infrastructures) projects are being developed to share experience and technology and to promote interoperability. EPOS ICT team members are also involved in VRE4EIC, a project developing a reference architecture and component software services for a Virtual Research Environment to be superimposed on EPOS-ICS. The challenge now being tackled is therefore to keep consistency and interoperability among the different modules, initiatives and actors which participate in the process of running the EPOS platform. This implies both a continuous update on the IT aspects of the mentioned initiatives and a refinement of the e-architecture designed so far. One major aspect of EPOS-IP is the ICT support for legalistic, financial and governance aspects of the EPOS ERIC to be initiated during EPOS-IP. This implies a sophisticated AAAI (Authentication, Authorization, Accounting Infrastructure) with consistency throughout the software, communications and data stack.

  2. Intelligent supercomputers: the Japanese computer sputnik

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, G.

    1983-11-01

    Japan's government-supported fifth-generation computer project has had a pronounced effect on the American computer and information systems industry. The US firms are intensifying their research on and production of intelligent supercomputers, a combination of computer architecture and artificial intelligence software programs. While the present generation of computers is built for the processing of numbers, the new supercomputers will be designed specifically for the solution of symbolic problems and the use of artificial intelligence software. This article discusses new and exciting developments that will increase computer capabilities in the 1990s. 4 references.

  3. Louisiana: a model for advancing regional e-Research through cyberinfrastructure

    PubMed Central

    Katz, Daniel S.; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D.; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott

    2009-01-01

    Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date. PMID:19451102

  4. The aeroacoustics of supersonic jets

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; McLaughlin, Dennis K.

    1995-01-01

    This research project was a joint experimental/computational study of noise in supersonic jets. The experiments were performed in a low to moderate Reynolds number anechoic supersonic jet facility. Computations have focused on the modeling of the effect of an external shroud on the generation and radiation of jet noise. This report summarizes the results of the research program in the form of the Masters and Doctoral theses of those students who obtained their degrees with the assistance of this research grant. In addition, the presentations and publications made by the principal investigators and the research students are appended.

  5. Embracing Diversity: The Exploration of User Motivations in Citizen Science Astronomy Projects

    NASA Astrophysics Data System (ADS)

    Lee, Lo

    2018-06-01

    Online citizen science projects ask members of the public to donate spare time on their personal computers to process large datasets. A critical challenge for these projects is volunteer recruitment and retention. Many of these projects use Berkeley Open Infrastructure for Network Computing (BOINC), a piece of middleware, to support their operations. This poster analyzes volunteer motivations in two large, BOINC-based astronomy projects, Einstein@Home and Milkyway@Home. Volunteer opinions are addressed to assess whether and how competitive elements, such as credit and ranking systems, motivate volunteers. Findings from a study of project volunteers, comprising surveys (n=2,031) and follow-up interviews (n=21), show that altruism is the main incentive for participation because volunteers consider scientific research to be critical for humans. Multiple interviewees also expressed extrinsic motivations, i.e. those that involve recognition from other people, such as opportunities to become co-authors of publications or to earn financial benefits. Credit and ranking systems motivate nearly half of interviewees. By analyzing user motivations in astronomical BOINC projects, this research provides scientists with a deeper understanding of volunteer communities and the various types of volunteers. Building on these findings, scientists can develop different strategies, for example, awarding volunteers badges, to recruit and retain diverse volunteers, and thus enhance long-term user participation in astronomical BOINC projects.

  6. Visualizing ultrasound through computational modeling

    NASA Technical Reports Server (NTRS)

    Guo, Theresa W.

    2004-01-01

    The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to measure hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on earth with trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results, thus helping the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project's needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and specify transmitting and receiving signals; 2) the program must be able to simulate ultrasound signals through different attenuating mediums; 3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow; 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.

  7. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC]

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  8. New project to support scientific collaboration electronically

    NASA Astrophysics Data System (ADS)

    Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.

    A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.

  9. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  10. Introductory Geophysics at Colorado College: A Research-Driven Course

    NASA Astrophysics Data System (ADS)

    Bank, C.

    2003-12-01

    Doing research during an undergraduate course provides stimulus for students and instructor. Students learn to appreciate the scientific method and get hands-on experience, while the instructor remains thrilled about teaching her/his discipline. The introductory geophysics course taught at Colorado College is made up of four units (gravity, seismic, resistivity, and magnetic) using available geophysical equipment. Within each unit students learn the physical background of the method, and then tackle a small research project selected by the instructor. Students pose a research question (or formulate a hypothesis), collect near-surface data in the field, process it using personal computers, and analyse it by creating computer models and running simple inversions. Computer work is done using the programming language Matlab, with several pre-coded scripts to make the programming experience more comfortable. Students then interpret the data and answer the question posed at the beginning. The unit ends with students writing a summary report, creating a poster, or presenting their findings orally. First evaluations of the course show that students appreciate the emphasis on field work and applications to real problems, as well as developing and testing their own hypotheses. The main challenge for the instructor is to find feasible projects, given the time constraints of a course and availability of field sites with new questions to answer. My presentation will feature a few projects done by students during the course and will discuss the experience students and I have had with this approach.

  11. Research in Progress--Update April 1990. Occasional Paper InTER/14/90.

    ERIC Educational Resources Information Center

    Boots, Maureen, Comp.

    This document contains abstracts of 29 research projects in progress in Great Britain divided into six sections: (1) the current phase of Information Technology in Education Research (InTER) programs on groupwork with computers, tools for exploratory learning, conceptual change in science, and bubble dialogue as an ethnographic research tool; (2)…

  12. Patrick Davenport | NREL

    Science.gov Websites

    systems. In graduate school, Patrick completed his thesis research project and subsequently held a research position in the Computational Biomechanics Laboratory (FEM) at the University of Denver. Education: M.S. Materials & Manufacturing Engineering, Technical University of Denmark (DTU); M.B.A

  13. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  14. Computing, Information and Communications Technology (CICT) Website

    NASA Technical Reports Server (NTRS)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  15. The CP-PACS project

    NASA Astrophysics Data System (ADS)

    Iwasaki, Y.; CP-PACS Collaboration

    1998-01-01

    The CP-PACS project is a five-year plan, which formally started in April 1992 and was completed in March 1997, to develop a massively parallel computer for carrying out research in computational physics with primary emphasis on lattice QCD. The initial version of the CP-PACS computer, with a theoretical peak speed of 307 GFLOPS with 1024 processors, was completed in March 1996. The final version, with a peak speed of 614 GFLOPS with 2048 processors, was completed in September 1996, and has been in full operation since October 1996. We describe the architecture, the final specification, the hardware implementation, and the software of the CP-PACS computer. The CP-PACS has been used for hadron spectroscopy production runs since July 1996. The performance for lattice QCD applications and the LINPACK benchmark is given.

  16. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Xipeng

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards the goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers and graduating 6 Ph.D.s. This report summarizes the research results of this project through that period.

  17. The gputools package enables GPU computing in R.

    PubMed

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu

  18. Cartwheel projections of segmented pulmonary vasculature for the detection of pulmonary embolism

    NASA Astrophysics Data System (ADS)

    Kiraly, Atilla P.; Naidich, David P.; Novak, Carol L.

    2005-04-01

    Pulmonary embolism (PE) detection via contrast-enhanced computed tomography (CT) images is an increasingly important topic of research. Accurate identification of PE is of critical importance in determining the need for further treatment. However, current multi-slice CT scanners provide datasets typically containing 600 or more images per patient, making it desirable to have a visualization method to help radiologists focus directly on potential candidates that might otherwise have been overlooked. This is especially important when assessing the ability of CT to identify smaller, sub-segmental emboli. We propose a cartwheel projection approach to PE visualization that computes slab projections of the original data aided by vessel segmentation. Previous research on slab visualization for PE has utilized the entire volumetric dataset, requiring thin slabs and necessitating the use of maximum intensity projection (MIP). Our use of segmentation within the projection computation allows the use of thicker slabs than previous methods, as well as the ability to employ visualization variations that are only possible with segmentation. Following automatic segmentation of the pulmonary vessels, slabs may be rotated around the X-, Y- or Z-axis. These slabs are rendered by preferentially using voxels within the lung vessels. This effectively eliminates distracting information not relevant to diagnosis, lessening the chance of overlooking a subtle embolus and minimizing the time spent evaluating false positives. The ability to employ thicker slabs means fewer images need to be evaluated, yielding a more efficient workflow.
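
    The segmentation-aided slab rendering described in the abstract can be sketched as follows (a minimal illustration, not the authors' implementation; the function name, axis layout, and toy data are hypothetical): within a slab of chosen thickness, only voxels inside the vessel mask contribute to the maximum intensity projection, so background anatomy is suppressed.

    ```python
    # Hypothetical sketch of a segmentation-masked slab MIP.
    import numpy as np

    def masked_slab_mip(volume, vessel_mask, z0, thickness):
        """Project the slab [z0, z0+thickness) along axis 0, keeping only
        voxels inside the segmented vessels; empty columns render as 0."""
        slab = volume[z0:z0 + thickness].astype(float)
        mask = vessel_mask[z0:z0 + thickness]
        slab = np.where(mask, slab, -np.inf)      # ignore non-vessel voxels
        mip = slab.max(axis=0)                    # maximum along slab depth
        return np.where(np.isinf(mip), 0.0, mip)  # no-vessel columns -> 0

    # Toy example: a 16^3 volume with one bright, contrast-enhanced "vessel"
    vol = np.random.uniform(0, 100, (16, 16, 16))
    mask = np.zeros_like(vol, dtype=bool)
    mask[:, 8, :] = True                          # pretend row 8 is a vessel
    vol[:, 8, :] += 1000                          # vessels enhance with contrast
    image = masked_slab_mip(vol, mask, z0=4, thickness=8)
    ```

    Because the mask already discards non-vessel voxels, the slab can be made much thicker than an unmasked MIP would tolerate, which is the property the abstract credits for reducing the number of images to review.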

  19. Computerized Multi-Media Instructional Television. COMIT. Proceedings of a Symposium.

    ERIC Educational Resources Information Center

    Andrews, Gordon C., Ed.; Knapper, Christopher K., Ed.

    A joint research project in educational techniques, which was conducted by the University of Waterloo and the IBM Corporation, explored the use of color television with random-access videotape under computer control. At the end of the three-year project, papers were solicited from all COMIT (Computerized Multi-Media Instructional Television)…

  20. Promoting Pre-Service Teachers' Reflections through a Cross-Cultural Keypal Project

    ERIC Educational Resources Information Center

    Wach, Aleksandra

    2015-01-01

    This paper reports the results of an action research-based study that investigated participants' reflections on EFL learning and teaching in a computer-mediated communication (CMC)-based project. Forty pre-service teachers from two universities, in Poland and in Romania, exchanged emails on class-related topics; the email exchange was followed by…

  1. Funder Report on Decision Support Systems Project Dissemination Activities, Fiscal Year 1985.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    Dissemination activities for the Decision Support Systems (DSS) for fiscal year (FY) 1985 are reported by the National Center for Higher Education Management Systems (NCHEMS). The main means for disseminating results of the DSS research and development project has been through computer-generated video presentations at meetings of higher education…

  2. Use of a database for managing qualitative research data.

    PubMed

    Ross, B A

    1994-01-01

    In this article, a process for handling text data in qualitative research projects by using existing word-processing and database programs is described. When qualitative data are managed using this method, the information is more readily available and the coding and organization of the data are enhanced. Furthermore, the narrative always remains intact regardless of how it is arranged or re-arranged, and there is a concomitant time savings and increased accuracy. The author hopes that this article will inspire some readers to explore additional methods and processes for computer-aided, nonstatistical data management. The study referred to in this article (Ross, 1991) was a qualitative research project which sought to find out how teaching faculty in nursing and education used computers in their professional work. Ajzen and Fishbein's (1980) Theory of Reasoned Action formed the theoretical basis for this work. This theory proposes that behavior, in this study the use of computers, is the result of intentions and that intentions are the result of attitudes and social norms. The study found that although computer use was sometimes the result of attitudes, more often it seemed to be the result of subjective (perceived) norms or intervening variables. Teaching faculty apparently did not initially make reasoned judgments about the computers or the programs they used, but chose to use whatever was required or available.

  3. Interactive and Multimedia Contents Associated with a System for Computer-Aided Assessment

    ERIC Educational Resources Information Center

    Paiva, Rui C.; Ferreira, Milton S.; Mendes, Ana G.; Eusébio, Augusto M. J.

    2015-01-01

    This article presents a research study addressing the development, implementation, evaluation, and use of Interactive Modules for Online Training (MITO) of mathematics in higher education. This work was carried out in the context of the MITO project, which combined several features of the learning and management system Moodle, the computer-aided…

  4. The Ideology of Computer Literacy in Schools.

    ERIC Educational Resources Information Center

    Mangan, J. Marshall

    This research project brings a critical perspective to the examination of computer literacy as an ideological form through a study of the reactions of high school teachers and students. On-site interviews with teachers and students found both acceptance of and resistance to the message of adjustment to an inevitable future of vocational and…

  5. Computer-Assisted Scheduling of Army Unit Training: An Application of Simulated Annealing.

    ERIC Educational Resources Information Center

    Hart, Roland J.; Goehring, Dwight J.

    This report of an ongoing research project intended to provide computer assistance to Army units for the scheduling of training focuses on the feasibility of simulated annealing, a heuristic approach for solving scheduling problems. Following an executive summary and brief introduction, the document is divided into three sections. First, the Army…

  6. Finding the Hook: Computer Science Education in Elementary Contexts

    ERIC Educational Resources Information Center

    Ozturk, Zehra; Dooley, Caitlin McMunn; Welch, Meghan

    2018-01-01

    The purpose of this study was to investigate how elementary teachers with little knowledge of computer science (CS) and project-based learning (PBL) experienced integrating CS through PBL as a part of a standards-based elementary curriculum in Grades 3-5. The researchers used qualitative constant comparison methods on field notes and reflections…

  7. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  8. Apple Classrooms of Tomorrow: Philosophy and Structure [and] What's Happening Where.

    ERIC Educational Resources Information Center

    Apple Computer, Inc., Cupertino, CA.

    Apple Classrooms of Tomorrow (ACOT) is a long-term research project sponsored by Apple Computer, Inc., to explore how learning and teaching change when teachers and students have access to interactive computer technologies. ACOT adheres to a philosophy that instruction should be learner controlled; i.e., students take responsibility for their own…

  9. Study of one- and two-dimensional filtering and deconvolution algorithms for a streaming array computer

    NASA Technical Reports Server (NTRS)

    Ioup, G. E.

    1985-01-01

    Appendix 5 of the Study of One- and Two-Dimensional Filtering and Deconvolution Algorithms for a Streaming Array Computer includes a resume of the professional background of the Principal Investigator on the project, lists of his publications and research papers, graduate theses supervised, and grants received.

  10. Critical Thinking about Literature through Computer Networking.

    ERIC Educational Resources Information Center

    Long, Thomas L.; Pedersen, Christine

    A computer-oriented, classroom-based research project was conducted at Thomas Nelson Community College in Hampton, Virginia, to explore the ways in which students in a composition and literature class might use a local area network (LAN) as a catalyst to critical thinking, to construct a decentralized classroom, and to use various forms of…

  11. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    ERIC Educational Resources Information Center

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  12. Displaying Special Characters and Symbols in Computer-Controlled Reaction Time Experiments.

    ERIC Educational Resources Information Center

    Friel, Brian M.; Kennison, Shelia M.

    A procedure for using MEL2 (Version 2.0 of Microcomputer Experimental Laboratory) and FontWINDOW to present special characters and symbols in computer-controlled reaction time experiments is described. The procedure permits more convenience and flexibility than tachistoscopic and projection techniques. FontWINDOW allows researchers to design…

  13. Digital Literacy Development of Students Involved in an ICT Educational Project

    NASA Astrophysics Data System (ADS)

    Quintana, Maria Graciela Badilla; Pujol, Meritxell Cortada

    The impact of information and communication technologies (ICT) has become the core of a change that involves most fields of society; consequently, technological and informational literacy are essential requirements in education. The research is a quasi-experimental, ex-post-facto study in schools in Spain. The aim was to describe and analyze the involvement shown by 219 students who participated in an ICT development project named Ponte dos Brozos. The research objective was to determine whether students who regularly worked with ICT had better knowledge and command of computing tools, and whether they were better prepared to search for and select information. Results showed that students with greater exposure to ICT knew the technology and how to use it, had better knowledge and control of the computer and operating systems, and showed a high level of information management through the Internet, although their information literacy remained lacking.

  14. Acoustic impulse response method as a source of undergraduate research projects and advanced laboratory experiments.

    PubMed

    Robertson, W M; Parker, J M

    2012-03-01

    A straightforward and inexpensive implementation of acoustic impulse response measurement is described utilizing the signal processing technique of coherent averaging. The technique is capable of high signal-to-noise measurements with personal computer data acquisition equipment, an amplifier/speaker, and a high quality microphone. When coupled with simple waveguide test systems fabricated from commercial PVC plumbing pipe, impulse response measurement has proven to be ideal for undergraduate research projects, often of publishable quality, or for advanced laboratory experiments. The technique provides important learning objectives for science or engineering students in areas such as interfacing and computer control of experiments; analog-to-digital conversion and sampling; time and frequency analysis using Fourier transforms; signal processing; and insight into a variety of current research areas such as acoustic bandgap materials, acoustic metamaterials, and fast and slow wave manipulation. © 2012 Acoustical Society of America
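
    The coherent-averaging technique can be illustrated as follows: the repeatable impulse response adds linearly across trials while uncorrelated noise averages toward zero, improving SNR by roughly the square root of the number of trials. A minimal sketch on synthetic data (all names and signal parameters are illustrative, not from the paper):

```python
import numpy as np

def coherent_average(trials):
    """Average time-aligned repeated measurements.

    trials: 2-D array, one row per repetition of the same impulse response.
    The deterministic response adds coherently; zero-mean noise averages
    toward zero, improving SNR by about sqrt(number of trials).
    """
    return np.mean(np.asarray(trials, dtype=float), axis=0)

# Synthetic demonstration: a decaying "impulse response" buried in noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
response = np.exp(-5.0 * t) * np.sin(2.0 * np.pi * 40.0 * t)
trials = response + rng.normal(0.0, 0.5, size=(256, t.size))
averaged = coherent_average(trials)

# Residual noise shrinks roughly as 1/sqrt(256), i.e. to about 1/16
# of the noise level in any single trial.
single_err = np.std(trials[0] - response)
avg_err = np.std(averaged - response)
```

    The same averaging applies whether the excitation is a spark, a clap, or a speaker-driven pulse, provided the trigger keeps the repetitions time-aligned.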

  15. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. To this end, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  16. RAPPORT: running scientific high-performance computing applications on the cloud.

    PubMed

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  17. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods, and a novel platform for protein function analysis, FuncNet.

  18. Management and Analysis of Biological and Clinical Data: How Computer Science May Support Biomedical and Clinical Research

    NASA Astrophysics Data System (ADS)

    Veltri, Pierangelo

    The use of computer-based solutions for data management in biology and clinical science has contributed to improving quality of life and to obtaining research results in shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for curing chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from the patient). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications draw on experience with real cases at the University Medical School of Catanzaro and on the national Staywell SH 2.0 project, which involves many research centers and companies aiming to study and improve citizen wellness.

  19. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  20. "On the Job" Learning: A Bioinformatics Course Incorporating Undergraduates in Actual Research Projects and Manuscript Submissions

    ERIC Educational Resources Information Center

    Smith, Jason T.; Harris, Justine C.; Lopez, Oscar J.; Valverde, Laura; Borchert, Glen M.

    2015-01-01

    The sequencing of whole genomes and the analysis of genetic information continues to fundamentally change biological and medical research. Unfortunately, the people best suited to interpret this data (biologically trained researchers) are commonly discouraged by their own perceived computational limitations. To address this, we developed a course…

  1. Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes

    DTIC Science & Technology

    2015-09-30

    goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of... Project-scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation... by the basic research efforts of this project. An additional objective is to develop improved computational tools for acoustics and for the

  2. Magnesium Front End Research and Development: A Canada-China-USA Collaboration

    NASA Astrophysics Data System (ADS)

    Luo, Alan A.; Nyberg, Eric A.; Sadayappan, Kumar; Shi, Wenfang

    The Magnesium Front End Research & Development (MFERD) project is an effort jointly sponsored by the United States Department of Energy, the United States Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology and Natural Resources Canada (NRCan) to demonstrate the technical and economic feasibility of a magnesium-intensive automotive front end body structure which offers improved fuel economy and performance benefits in a multi-material automotive structure. The project examines novel magnesium automotive body applications and processes, beyond conventional die castings, including wrought components (sheet or extrusions) and high-integrity body castings. This paper outlines the scope of work and organization for the collaborative (tri-country) task teams. The project has the goals of developing key enabling technologies and knowledge base for increased magnesium automotive body applications. The MFERD project began in early 2007 by initiating R&D in the following areas: crashworthiness, NVH, fatigue and durability, corrosion and surface finishing, extrusion and forming, sheet and forming, high-integrity body casting, as well as joining and assembly. Additionally, the MFERD project is also linked to the Integrated Computational Materials Engineering (ICME) project that will investigate the processing/structure/properties relations for various magnesium alloys and manufacturing processes utilizing advanced computer-aided engineering and modeling tools.

  3. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  4. ICASE Workshop on Programming Computational Grids

    DTIC Science & Technology

    2001-09-01

    ICASE Workshop on Programming Computational Grids, Thomas M. Eidson and Merrell L. Patrick, ICASE, NASA Langley Research Center, Hampton, Virginia. ...clear that neither group fully understood the ideas and problems of the other. It was also clear that neither group is given the time and support to

  5. The Effect of a Computer Program Designed with Constructivist Principles for College Non-Science Majors on Understanding of Photosynthesis and Cellular Respiration

    ERIC Educational Resources Information Center

    Wielard, Valerie Michelle

    2013-01-01

    The primary objective of this project was to learn what effect a computer program would have on academic achievement and attitude toward science of college students enrolled in a biology class for non-science majors. It became apparent that the instructor also had an effect on attitudes toward science. The researcher designed a computer program,…

  6. THREED: A computer program for three dimensional transformation of coordinates. [in lunar photo triangulation mapping

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    Program THREED was developed for the purpose of a research study on the treatment of control data in lunar phototriangulation. THREED is the code name of a computer program for performing absolute orientation by the method of three-dimensional projective transformation. It has the capability of performing complete error analysis on the computed transformation parameters as well as the transformed coordinates.
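
    A three-dimensional projective transformation of the kind THREED performs can be expressed with 4x4 homogeneous coordinates. The sketch below is a generic illustration of that mathematical form, not THREED's actual code, and the example matrix is hypothetical.

```python
import numpy as np

def projective_transform(points, H):
    """Apply a 4x4 projective transformation to 3-D points.

    points: (N, 3) array of coordinates.
    H: 4x4 transformation matrix; rotation, scale, translation, and
       perspective terms all fit this single form.
    Returns the transformed (N, 3) coordinates after dividing out the
    homogeneous w component.
    """
    pts = np.asarray(points, dtype=float)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])
    out = homogeneous @ H.T
    return out[:, :3] / out[:, 3:4]

# A similarity transform (rotate 90 degrees about z, scale by 2, then
# translate) is the special case with no perspective terms.
H = np.array([
    [0.0, -2.0, 0.0, 1.0],
    [2.0,  0.0, 0.0, 2.0],
    [0.0,  0.0, 2.0, 3.0],
    [0.0,  0.0, 0.0, 1.0],
])
```

    Absolute orientation fits the parameters of such a matrix to control points; the error analysis the abstract mentions would then propagate uncertainty from the fitted parameters to the transformed coordinates.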

  7. Parallel Algorithms for the Exascale Era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, Robert W.

    New parallel algorithms are needed to reach the Exascale level of parallelism with millions of cores. We look at some of the research developed by students in projects at LANL. The research blends ideas from the early days of computing while weaving in the fresh approach brought by students new to the field of high performance computing. We look at reproducibility of global sums and why it is important to parallel computing. Next we look at how the concept of hashing has led to the development of more scalable algorithms suitable for next-generation parallel computers. Nearly all of this work has been done by undergraduates and published in leading scientific journals.

  8. The European computer model for optronic system performance prediction (ECOMOS)

    NASA Astrophysics Data System (ADS)

    Repasi, Endre; Bijl, Piet; Labarre, Luc; Wittenstein, Wolfgang; Bürsing, Helge

    2017-05-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is exposed. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short outlook on validation tests and the future potential of simulation for sensor assessment.

  9. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    NASA Astrophysics Data System (ADS)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This study concerns several features of diagonal Latin squares (DLSs) of small order. The authors suggest an algorithm for computing the minimal and maximal numbers of transversals of DLSs. According to this algorithm, all DLSs of a particular order are generated, and for each square all of its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer. The experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The resulting problem turned out to be very hard, so it was decomposed into a family of subproblems. To solve the problem, the volunteer computing project SAT@home was used. As a result, several dozen pairs of the described kind were found.
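
    The transversal computation can be sketched by brute force: a transversal of an order-n square selects one cell per row and per column so that all n symbols are distinct. The sketch below is practical only for small orders and is not the project's optimized algorithm; the example square is illustrative.

```python
from itertools import permutations

def count_transversals(square):
    """Count transversals of a Latin square by brute force.

    A transversal selects one cell in each row and each column such that
    all selected symbols are distinct. Column choices per row are encoded
    as permutations, so this is feasible only for small orders.
    """
    n = len(square)
    count = 0
    for cols in permutations(range(n)):
        symbols = {square[r][cols[r]] for r in range(n)}
        if len(symbols) == n:
            count += 1
    return count

# An order-4 diagonal Latin square: every row, column, and both main
# diagonals contain all four symbols.
dls4 = [
    [0, 1, 2, 3],
    [2, 3, 0, 1],
    [3, 2, 1, 0],
    [1, 0, 3, 2],
]
```

    Generating all DLSs of an order and running such a count per square yields the minimal and maximal transversal numbers the abstract describes; for orders beyond 7 the search space forces distribution across volunteer hosts.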

  10. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  11. The Revolutionary Vertical Lift Technology (RVLT) Project

    NASA Technical Reports Server (NTRS)

    Yamauchi, Gloria K.

    2018-01-01

    The Revolutionary Vertical Lift Technology (RVLT) Project is one of six projects in the Advanced Air Vehicles Program (AAVP) of the NASA Aeronautics Research Mission Directorate. The overarching goal of the RVLT Project is to develop and validate tools, technologies, and concepts to overcome key barriers for vertical lift vehicles. The project vision is to enable the next generation of vertical lift vehicles with aggressive goals for efficiency, noise, and emissions, to expand current capabilities and develop new commercial markets. The RVLT Project invests in technologies that support conventional, non-conventional, and emerging vertical-lift aircraft in the very light to heavy vehicle classes. Research areas include acoustic, aeromechanics, drive systems, engines, icing, hybrid-electric systems, impact dynamics, experimental techniques, computational methods, and conceptual design. The project research is executed at NASA Ames, Glenn, and Langley Research Centers; the research extensively leverages partnerships with the US Army, the Federal Aviation Administration, industry, and academia. The primary facilities used by the project for testing of vertical-lift technologies include the 14- by 22-Ft Wind Tunnel, Icing Research Tunnel, National Full-Scale Aerodynamics Complex, 7- by 10-Ft Wind Tunnel, Rotor Test Cell, Landing and Impact Research facility, Compressor Test Facility, Drive System Test Facilities, Transonic Turbine Blade Cascade Facility, Vertical Motion Simulator, Mobile Acoustic Facility, Exterior Effects Synthesis and Simulation Lab, and the NASA Advanced Supercomputing Complex. To learn more about the RVLT Project, please stop by booth #1004 or visit their website at https://www.nasa.gov/aeroresearch/programs/aavp/rvlt.

  12. Research 1970/1971: Annual Progress Report.

    ERIC Educational Resources Information Center

    Georgia Inst. of Tech., Atlanta. Science Information Research Center.

    The report presents a summary of science information research activities of the School of Information and Computer Science, Georgia Institute of Technology. Included are project reports on interrelated studies in science information, information processing and systems design, automata and systems theories, and semiotics and linguistics. Also…

  13. NASA/ASEE Summer Faculty Fellowship Program: 1988 research reports

    NASA Technical Reports Server (NTRS)

    Anderson, Loren A. (Editor); Armstrong, Dennis W. (Editor)

    1988-01-01

    This contractor's report contains all sixteen final reports prepared by the participants in the 1988 Summer Faculty Fellowship Program. Reports describe research projects on a number of topics including controlled environments, robotics, cryogenic propellant storage, polymers, hydroponic culture, adaptive servocontrol, and computer-aided design.

  14. Solving optimization problems on computational grids.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, S. J.; Mathematics and Computer Science

    2001-05-01

    Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms havemore » become popular in many settings. A somewhat different type of parallel computing platform know as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, were among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups. 
Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure needed to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to the development of the runtime support library MW for implementing algorithms with master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.
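The master-worker control structure that a library like MW supports can be illustrated in miniature. The sketch below is a hypothetical illustration using Python's standard library, not MW's actual API; the task format and toy cost function are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_subproblem(task):
    """Worker side: evaluate one independent subproblem.
    A toy cost function stands in for an expensive optimization step."""
    node_id, cost = task
    return node_id, cost * 2

def master(tasks, n_workers=4):
    """Master side: farm subproblems out to a worker pool and keep the best
    (lowest-cost) result, as a branch-and-bound master would."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(evaluate_subproblem, tasks))
    return min(results, key=lambda r: r[1])
```

In MW itself, the pool would be a dynamically sized set of Condor-managed workers on idle machines rather than local threads, but the control flow (master distributes independent tasks, collects and reduces results) is the same.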

  15. Final Report for Project FG02-05ER25685

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiaosong Ma

    2009-05-07

    In this report, the PI summarizes the results and achievements of the sponsored project. Overall, the project was very successful, producing both research results in massive data-intensive computing and data management for today's large-scale supercomputers, and open-source software products. During the project period, 14 conference/journal publications were produced and two PhD students were trained with exclusive or shared support from this award. In addition, the PI was recently granted tenure at NC State University.

  16. Capitalizing on Community: the Small College Environment and the Development of Researchers

    NASA Astrophysics Data System (ADS)

    Stoneking, M. R.

    2014-03-01

    Liberal arts colleges constitute an important source of and training ground for future scientists. At Lawrence University, we take advantage of our small college environment to prepare physics students for research careers by complementing content acquisition with skill development and project experience distributed throughout the curriculum and with co-curricular elements that are tied to our close-knit supportive physics community. Small classes and frequent contact between physics majors and faculty members offer opportunities for regular and detailed feedback on the development of research relevant skills such as laboratory record-keeping, data analysis, electronic circuit design, computational programming, experimental design and modification, and scientific communication. Part of our approach is to balance collaborative group work on small projects (such as Arduino-based electronics projects and optical design challenges) with independent work (on, for example, advanced laboratory experimental extensions and senior capstone projects). Communal spaces and specialized facilities (experimental and computational) and active on-campus research programs attract eager students to the program, establish a community-based atmosphere, provide unique opportunities for the development of research aptitude, and offer opportunities for genuine contribution to a research program. Recently, we have also been encouraging innovative tendencies in physics majors through intentional efforts to develop personal characteristics, encouraging students to become more tolerant of ambiguity, more willing to take risks and initiative, and more articulate. Indicators of the success of our approach include the roughly ten physics majors who graduate each year and our program's high ranking among institutions whose graduates go on to receive the Ph.D. in physics. Work supported in part by the National Science Foundation.

  17. Computing through Scientific Abstractions in SysBioPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.

    2004-10-13

    Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.

  18. Building a virtual network in a community health research training program.

    PubMed

    Lau, F; Hayward, R

    2000-01-01

    To describe the experiences, lessons, and implications of building a virtual network as part of a two-year community health research training program in a Canadian province. An action research field study in which 25 health professionals from 17 health regions participated in a seven-week training course on health policy, management, economics, research methods, data analysis, and computer technology. The participants then returned to their regions to apply the knowledge in different community health research projects. Ongoing faculty consultations and support were provided as needed. Each participant was given a notebook computer with the necessary software, Internet access, and technical support for two years, to access information resources, engage in group problem solving, share ideas and knowledge, and collaborate on projects. Data collected over two years consisted of program documents, records of interviews with participants and staff, meeting notes, computer usage statistics, automated online surveys, computer conference postings, program Web site, and course feedback. The analysis consisted of detailed review and comparison of the data from different sources. NUD*IST was then used to validate earlier study findings. The ten key lessons are that role clarity, technology vision, implementation staging, protected time, just-in-time training, ongoing facilitation, work integration, participatory design, relationship building, and the demonstration of results are essential ingredients for building a successful network. This study provides a descriptive model of the processes involved in developing, in the community health setting, virtual networks that can be used as the basis for future research and as a practical guide for managers.

  19. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in the past two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  20. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    NASA Astrophysics Data System (ADS)

    Stevens, Jonathan

    2017-07-01

    Continuing advances in computational chemistry have permitted quantum mechanical calculation to assist research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, as well as contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  1. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis methods of management science, and computer simulation to select this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
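The selection problem described above, choosing a partial set of tasks whose total cost fits the budget while maximizing expected value under uncertainty, can be sketched as a Monte Carlo evaluation followed by an exhaustive search over feasible subsets. This is an illustrative toy, not SIMRAND itself; the task dictionary format, the uniform payoff model, and all function names are assumptions for the example.

```python
import random
from itertools import combinations

def simulate_value(task, n_trials=1000):
    """Estimate a task's expected payoff by Monte Carlo over its uncertain range."""
    rng = random.Random(task["id"])  # seeded per task for reproducibility
    lo, hi = task["payoff_range"]
    return sum(rng.uniform(lo, hi) for _ in range(n_trials)) / n_trials

def select_tasks(tasks, budget):
    """Exhaustively search feasible subsets; return the ids of the subset with
    the highest total simulated value whose total cost fits the budget."""
    best_ids, best_value = [], 0.0
    for r in range(1, len(tasks) + 1):
        for subset in combinations(tasks, r):
            if sum(t["cost"] for t in subset) <= budget:
                value = sum(simulate_value(t) for t in subset)
                if value > best_value:
                    best_ids, best_value = [t["id"] for t in subset], value
    return best_ids
```

Exhaustive enumeration is exponential in the number of candidate tasks; for realistic project portfolios one would substitute a heuristic or integer-programming search, but the simulate-then-rank structure mirrors the methodology's intent.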

  2. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves capturing data digitally, directly from instruments, processing it by computer, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software, and compute resources to be linked by seamless workflows, is opening the processing of high volumes of data to an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These new technology developments require that scientists become more skilled in data management and/or attain a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational-X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals and also knowledge of some ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. 
Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those tackling current grand challenges at an international level on issues such as climate change, hazard prediction, and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their careers advance. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  3. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. RIACS is chartered to carry out research and development in computer science, devoted mainly to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  4. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design

    PubMed Central

    Alford, Rebecca F.; Dolan, Erin L.

    2017-01-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology. PMID:29216185

  5. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    PubMed

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  6. Syringomyelia

    MedlinePlus

    ... is the most reliable way to diagnose syringomyelia. Computer-generated radio waves and a powerful magnetic field ... a searchable database of current and past research projects supported by NIH and other federal agencies. RePORTER ...

  7. NASA Earth Exchange (NEX) Supporting Analyses for National Climate Assessments

    NASA Astrophysics Data System (ADS)

    Nemani, R. R.; Thrasher, B. L.; Wang, W.; Lee, T. J.; Melton, F. S.; Dungan, J. L.; Michaelis, A.

    2015-12-01

    The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX supports several research projects that are closely related with the National Climate Assessment including the generation of high-resolution climate projections, identification of trends and extremes in climate variables and the evaluation of their impacts on regional carbon/water cycles and biodiversity, the development of land-use management and adaptation strategies for climate-change scenarios, and even the exploration of climate mitigation through geo-engineering. Scientists also use the large collection of satellite data on NEX to conduct research on quantifying spatial and temporal changes in land surface processes in response to climate and land-cover-land-use changes. Researchers, leveraging NEX's massive compute/storage resources, have used statistical techniques to downscale the coarse-resolution CMIP5 projections to fulfill the demands of the community for a wide range of climate change impact analyses. The DCP-30 (Downscaled Climate Projections at 30 arcsecond) for the conterminous US at monthly, ~1 km resolution and the GDDP (Global Daily Downscaled Projections) for the entire world at daily, 25 km resolution are now widely used in climate research and applications, as well as for communicating climate change. In order to serve a broader community, the NEX team, in collaboration with Amazon, Inc., created the OpenNEX platform. OpenNEX provides ready access to NEX data holdings, including the NEX-DCP30 and GDDP datasets along with a number of pertinent analysis tools and workflows on the AWS infrastructure in the form of publicly available, self-contained, fully functional Amazon Machine Images (AMIs) for anyone interested in global climate change.
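Statistical downscaling of the kind used to produce such projections pairs a coarse-grid climate change signal with a fine-resolution reference climatology. The sketch below shows a simplified delta method on a regular grid; the actual NEX products use a more elaborate bias-correction spatial-disaggregation procedure, and the array shapes, names, and nearest-neighbor broadcast here are assumptions for illustration only.

```python
import numpy as np

def delta_downscale(coarse_future, coarse_hist, fine_hist):
    """Delta-method sketch: add the coarse-grid change signal (future minus
    historical climatology) onto a fine-resolution climatology."""
    delta = coarse_future - coarse_hist
    # Nearest-neighbor broadcast of the coarse delta onto the fine grid,
    # assuming the fine grid is an integer refinement of the coarse one.
    factor = fine_hist.shape[0] // coarse_future.shape[0]
    delta_fine = np.repeat(np.repeat(delta, factor, axis=0), factor, axis=1)
    return fine_hist + delta_fine
```

The design choice is to trust the coarse model for the *change* and the fine observations for the *pattern*; production pipelines additionally bias-correct the model against observations before disaggregating.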

  8. GROTTO visualization for decision support

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Kuo, Eddy; Uhlmann, Jeffrey K.

    1998-08-01

    In this paper we describe the GROTTO visualization projects being carried out at the Naval Research Laboratory. GROTTO is a CAVE-like system, that is, a surround-screen, surround-sound, immersive virtual reality device. We have explored GROTTO visualization in a variety of scientific areas including oceanography, meteorology, chemistry, biochemistry, computational fluid dynamics, and space sciences. Research has emphasized applications of GROTTO visualization for military, land- and sea-based command and control. Examples include the visualization of ocean current models for the simulation and study of mine drift and, within our computational steering project, the effects of electromagnetic radiation on missile defense satellites. We discuss plans to apply this technology to decision support applications involving the deployment of autonomous vehicles into contaminated battlefield environments, firefighter control, and hostage rescue operations.

  9. Final Report: Correctness Tools for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.
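One classic approach to the data-race detection this project targets is Eraser-style lockset analysis: track, for each shared variable, the intersection of the locks held at each access, and flag the variable when that intersection becomes empty. The toy sketch below (not the project's actual algorithm) applies the idea to a synthetic access trace; the trace format is invented for the example.

```python
def lockset_races(trace):
    """Eraser-style lockset check over a trace of (variable, locks_held) accesses.
    A variable's candidate lockset is the intersection of the lock sets held at
    each of its accesses; if it ever becomes empty, flag a possible data race."""
    candidate = {}
    races = set()
    for var, locks_held in trace:
        held = set(locks_held)
        candidate[var] = candidate[var] & held if var in candidate else held
        if not candidate[var]:
            races.add(var)
    return races
```

Real tools refine this with happens-before ordering and per-thread state machines to cut false positives, but the core invariant, "every shared variable is consistently protected by at least one common lock", is the same.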

  10. Hypersonic research engine/aerothermodynamic integration model, experimental results. Volume 1: Mach 6 component integration

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

    The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. A large component (inlet, combustor, and nozzle) and structures development program was encompassed by the project. Tests were conducted of a full-scale (18 in. diameter cowl, 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), at Mach numbers of 5, 6, and 7. Computer program results for Mach 6 component integration tests are presented.

  11. Strain engineering of electronic and magnetic properties of Ga2S2 nanoribbons

    NASA Astrophysics Data System (ADS)

    Wang, Bao-Ji; Li, Xiao-Hua; Zhang, Li-Wei; Wang, Guo-Dong; Ke, San-Huang

    2017-05-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11174220 and 11374226), the Key Scientific Research Project of the Henan Institutions of Higher Learning, China (Grant No. 16A140009), the Program for Innovative Research Team of Henan Polytechnic University, China (Grant Nos. T2015-3 and T2016-2), the Doctoral Foundation of Henan Polytechnic University, China (Grant No. B2015-46), and the High-performance Grid Computing Platform of Henan Polytechnic University, China.

  12. 3D Acoustic Full Waveform Inversion for Engineering Purpose

    NASA Astrophysics Data System (ADS)

    Lim, Y.; Shin, S.; Kim, D.; Kim, S.; Chung, W.

    2017-12-01

    Seismic waveform inversion is one of the most actively researched seismic data processing techniques. In recent years, with an increase in marine development projects, seismic surveys are commonly conducted for engineering purposes; however, research on applying waveform inversion in this setting remains limited. Waveform inversion updates the subsurface physical property model by minimizing the difference between modeled and observed data, and it can be used to generate an accurate subsurface image; however, the technique consumes substantial computational resources. Its most compute-intensive step is the calculation of the gradient and Hessian values, an aspect that gains higher significance in 3D than in 2D. This paper introduces a new method for calculating gradient and Hessian values in an effort to reduce the computational burden. In conventional waveform inversion, the calculation area covers all sources and receivers. In seismic surveys for engineering purposes, the number of receivers is limited; therefore, it is inefficient to construct the Hessian and gradient for the entire region (Figure 1). To tackle this problem, we calculate the gradient and the Hessian for a single shot only within the range of the relevant source and receivers, and then sum these local contributions over all shots (Figure 2). In this paper, we demonstrate that reducing the area over which the Hessian and gradient are calculated for one shot reduces the overall amount of computation and therefore the computation time. Furthermore, it is shown that waveform inversion can be suitably applied for engineering purposes. In future research, we propose to ascertain an effective calculation range. This research was supported by the Basic Research Project (17-3314) of the Korea Institute of Geoscience and Mineral Resources (KIGAM), funded by the Ministry of Science, ICT and Future Planning of Korea.
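The windowed accumulation idea can be sketched in one dimension: compute each shot's gradient only on the index range spanning its source and receivers (plus padding), then scatter-add it into the global model. The 1-D indexing, the padding scheme, and the user-supplied per-shot gradient routine are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def shot_window(src, receivers, pad, n_model):
    """Index range spanning one shot's source and receivers, plus padding,
    clipped to the model bounds."""
    positions = [src] + list(receivers)
    lo = max(0, min(positions) - pad)
    hi = min(n_model, max(positions) + pad + 1)
    return lo, hi

def accumulate_gradient(n_model, shots, local_gradient, pad=2):
    """Sum per-shot gradients, each computed only on its local window,
    into the global gradient (the Hessian is accumulated analogously)."""
    grad = np.zeros(n_model)
    for src, receivers in shots:
        lo, hi = shot_window(src, receivers, pad, n_model)
        grad[lo:hi] += local_gradient(src, receivers, lo, hi)
    return grad
```

Because each `local_gradient` call touches only `hi - lo` cells instead of the full model, the cost per shot scales with the source-receiver aperture rather than the survey size, which is where the savings described above come from.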

  13. Barriers and facilitators to home computer and internet use among urban novice computer users of low socioeconomic position.

    PubMed

    Kontos, Emily Z; Bennett, Gary G; Viswanath, K

    2007-10-22

    Despite the increasing penetration of the Internet and amount of online health information, there are significant barriers that limit its widespread adoption as a source of health information. One is the "digital divide," with people of higher socioeconomic position (SEP) demonstrating greater access and usage compared to those from lower SEP groups. However, as the access gap narrows over time and more people use the Internet, a shift in research needs to occur to explore how one might improve Internet use as well as website design for a range of audiences. This is particularly important in the case of novice users who may not have the technical skills, experience, or social connections that could help them search for health information using the Internet. The focus of our research is to investigate the challenges in the implementation of a project to improve health information seeking among low SEP groups. The goal of the project is not to promote health information seeking as much as to understand the barriers and facilitators to computer and Internet use, beyond access, among members of lower SEP groups in an urban setting. The purpose was to qualitatively describe participants' self-identified barriers and facilitators to computer and Internet use during a 1-year pilot study as well as the challenges encountered by the research team in the delivery of the intervention. Between August and November 2005, 12 low-SEP urban individuals with no or limited computer and Internet experience were recruited through snowball sampling. Each participant received a free computer system, broadband Internet access, monthly computer training courses, and technical support for 1 year as the intervention condition. Upon completion of the study, participants were offered the opportunity to complete an in-depth semistructured interview. Interviews were approximately 1 hour in length and were conducted by the project director. 
The interviews were held in the participants' homes and were tape recorded for accuracy. Nine of the 12 study participants completed the semistructured interviews. Members of the research team conducted a qualitative analysis based on the transcripts from the nine interviews using the crystallization/immersion method. Nine of the 12 participants completed the in-depth interview (75% overall response rate), with three men and six women agreeing to be interviewed. Major barriers to Internet use that were mentioned included time constraints and family conflict over computer usage. The monthly training classes and technical assistance components of the intervention surfaced as the most important facilitators to computer and Internet use. The concept of received social support from other study members, such as assistance with computer-related questions, also emerged as an important facilitator to overall computer usage. This pilot study offers important insights into the self-identified barriers and facilitators in computer and Internet use among urban low-SEP novice users as well as the challenges faced by the research team in implementing the intervention.

  14. FBIS report. Science and technology: Europe/International, March 29, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-03-29

    Partial Contents: Advanced Materials (EU Project to Improve Production in Metal Matrix Compounds Noted; Germany: Extremely Hard Carbon Coating Development; Italy: Director of CNR Metallic Materials Institute Interviewed); Aerospace (ESA Considers Delays, Reductions as Result of Budget Cuts; Italy: Space Agency's Director on Restructuring, Future Plans); Automotive, Transportation (EU: Clean Diesel Engine Technology Research Reviewed); Biotechnology (Germany's Problems, Successes in Biotechnology Discussed); Computers (EU Europort Parallel Computing Project Concluded; Italy: PQE 2000 Project on Massively Parallel Systems Viewed); Defense R&D (France: Future Tasks of 'Brevel' Military Intelligence Drone Noted); Energy, Environment (German Scientist Tests Elimination of Phosphates); Advanced Manufacturing (France: Advanced Rapid Prototyping System Presented); Lasers, Sensors, Optics (France: Strategy of Cilas Laser Company Detailed); Microelectronics (France: Simulation Company to Develop Microelectronic Manufacturing Application); Nuclear R&D (France: Megajoule Laser Plan, Cooperation with Livermore Lab Noted); S&T Policy (EU Efforts to Aid Small Companies' Research Viewed); Telecommunications (France Telecom's Way to Internet).

  15. Exploratory Research and Development Fund, FY 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-05-01

    The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: Accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.

  16. Application of supercomputers to computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.

    1984-01-01

    Computers are playing an increasingly important role in the field of aerodynamics such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. Example results obtained from the successively refined forms of the governing equations are discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to problems of practical importance. Finally, the Numerical Aerodynamic Simulation (NAS) Program - with its 1988 target of achieving a sustained computational rate of 1 billion floating point operations per second and operating with a memory of 240 million words - is discussed in terms of its goals and its projected effect on the future of computational aerodynamics.

  17. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkin, Adam; Bader, David C.; Coffey, Richard

    Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  18. Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.

    1979-01-01

    Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.

  19. CGAT: a model for immersive personalized training in computational genomics

    PubMed Central

    Sims, David; Ponting, Chris P.

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. PMID:25981124

  20. NASA Space Engineering Research Center for VLSI systems design

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This annual review reports the center's activities and findings on very large scale integration (VLSI) systems design for 1990, including project status, financial support, publications, the NASA Space Engineering Research Center (SERC) Symposium on VLSI Design, research results, and outreach programs. Processor chips completed or under development are listed. Research results summarized include a design technique to harden complementary metal oxide semiconductors (CMOS) memory circuits against single event upset (SEU); improved circuit design procedures; and advances in computer aided design (CAD), communications, computer architectures, and reliability design. Also described is a high school teacher program that exposes teachers to the fundamentals of digital logic design.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solar-Lezama, Armando

    The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: a) Automated equivalence checking, b) Sketch-based program synthesis, and c) Autotuning. The report provides an executive summary of the research accomplished through this project. At the end of the report is appended a paper that describes in more detail the key technical accomplishments from this project, and which was published in SC 2014.

  2. Loads calibrations of strain gage bridges on the DAST project Aeroelastic Research Wing (ARW-1)

    NASA Technical Reports Server (NTRS)

    Eckstrom, C. V.

    1980-01-01

    The details of and results from the procedure used to calibrate strain gage bridges for measurement of wing structural loads for the DAST project ARW-1 wing are presented. Results are in the form of loads equations and comparison of computed loads vs. actual loads for two simulated flight loading conditions.

  3. Teaching Science with Web-Based Inquiry Projects: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Webb, Aubree M.; Knight, Stephanie L.; Wu, X. Ben; Schielack, Jane F.

    2014-01-01

    The purpose of this research is to explore a new computer-based interactive learning approach to assess the impact on student learning and attitudes toward science in a large university ecology classroom. A comparison was done with an established program to measure the relative impact of the new approach. The first inquiry project, BearCam, gives…

  4. Solutions Unlimited: Evaluation of Prototype Units 1 and 3. Research Report 89.

    ERIC Educational Resources Information Center

    Agency for Instructional Television, Bloomington, IN.

    Solutions Unlimited is a computer/video project developed by the Agency for Instructional Television (AIT) in collaboration with a consortium of state, provincial, and local education agencies. The goal of the project is to improve the problem-solving skills of students in grades 6 through 8 through the use of brief instructional television…

  5. Use of Tablet Computers to Improve Access to Education in a Remote Location

    ERIC Educational Resources Information Center

    Ally, Mohamed; Balaji, V.; Abdelbaki, Anwar; Cheng, Ricky

    2017-01-01

    A research project was carried out using mobile learning to increase access to education. This project is contributing to the achievement of Goal 4 of the Sustainable Development Goals (SDGs), which is to "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". The mobile learning project…

  6. Exploiting the Potential of CD-ROM Databases: Staff Induction at the University of East Anglia.

    ERIC Educational Resources Information Center

    Guillot, Marie-Noelle; Kenning, Marie-Madeleine

    1995-01-01

    Overviews a project exploring the possibility of using CD-ROM applications and the design of exploratory didactic materials to introduce academic staff to the field of computer-assisted instruction. The project heightened the staff's awareness of electronic resources and their potential as research, teaching, and learning aids, with particular…

  7. In Search of Social Movement Learning: The Growing Jobs for Living Project. NALL Working Paper.

    ERIC Educational Resources Information Center

    Clover, Darlene E.; Hall, Budd L.

    The New Approaches to Lifelong Learning (NALL) project is a Canada-wide 5-year research initiative during which more than 70 academic and community members are working collaboratively within a framework of informal learning to address the following issues: informal computer-based learning, recognition of prior learning, informal learning in a…

  8. Mind Games, Reasoning Skills, and the Primary School Curriculum

    ERIC Educational Resources Information Center

    Bottino, Rosa Maria; Ott, Michela

    2006-01-01

    This paper reports on a pilot research project aimed at helping to develop some strategic and reasoning abilities in primary school pupils by engaging them in educational itineraries based on the use of a number of computer mind games. The paper briefly describes the project's aims and organization, the kind of games used and the working…

  9. Top 10 Uses for ClarisWorks in the One-Computer Classroom.

    ERIC Educational Resources Information Center

    Robinette, Michelle

    1996-01-01

    Suggests ways to use ClarisWorks to motivate students when only one computer is accessible: (1) class database; (2) grade book; (3) classroom journal; (4) ongoing story center; (5) skill-and-draw review station; (6) monthly class magazine/newspaper; (7) research base/project planner; (8) lecture and presentation enhancement; (9) database of ideas…

  10. Evaluating the Acceptance of Cloud-Based Productivity Computer Solutions in Small and Medium Enterprises

    ERIC Educational Resources Information Center

    Dominguez, Alfredo

    2013-01-01

    Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…

  11. Computer Based Training - A Report of a NATO Study Visit to America. A.P. Report 91.

    ERIC Educational Resources Information Center

    Patrick, J.

    This report describes some of the research projects encountered on a 1979 study visit which investigated the nature and availability of computer-based training (CBT) systems in the United States and Canada, particularly within industrial, occupational and military contexts. An overview of the trip itinerary includes the names of the organizations…

  12. Human Factors in the Design of a Computer-Assisted Instruction System. Technical Progress Report.

    ERIC Educational Resources Information Center

    Mudge, J. C.

    A research project built an author-controlled computer-assisted instruction (CAI) system to study ease-of-use factors in student-system, author-system, and programer-system interfaces. Interfaces were designed and observed in use and systematically revised. Development of course material by authors, use by students, and administrative tasks were…

  13. Near-term hybrid vehicle program, phase 1. Appendix B: Design trade-off studies report. Volume 3: Computer program listings

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A description and listing is presented of two computer programs: Hybrid Vehicle Design Program (HYVELD) and Hybrid Vehicle Simulation Program (HYVEC). Both of the programs are modifications and extensions of similar programs developed as part of the Electric and Hybrid Vehicle System Research and Development Project.

  14. Mixed-Initiative Information System for Computer-Aided Training and Decision Making. Final Report.

    ERIC Educational Resources Information Center

    Grignetti, Mario C.; Warnock, Eleanor H.

    A description of the NET-SCHOLAR system, an on-line aid for naive users of the Advanced Research Projects Agency (ARPA) Computer Network, is provided. The discussion focuses upon the system's representation and handling of functional and procedural information and its ability to deal with action verbs, all within the context of the ARPA…

  15. Computer Mediated Communication in the Universal Design for Learning Framework for Preparation of Special Education Teachers

    ERIC Educational Resources Information Center

    Basham, James D.; Lowrey, K. Alisa; deNoyelles, Aimee

    2010-01-01

    This study investigated the Universal Design for Learning (UDL) framework as a basis for a bi-university computer mediated communication (CMC) collaborative project. Participants in the research included 78 students from two special education programs enrolled in teacher education courses. The focus of the investigation was on exploring the…

  16. Physical Computing for STEAM Education: Maker-Educators' Experiences in an Online Graduate Course

    ERIC Educational Resources Information Center

    Hsu, Yu-Chang; Ching, Yu-Hui; Baldwin, Sally

    2018-01-01

    This research explored how K-16 educators learned physical computing, and developed as maker-educators in an online graduate course. With peer support and instructor guidance, these educators designed maker projects using Scratch and Makey Makey, and developed educational maker proposals with plans of teaching the topics of their choice in STEAM…

  17. Comparing Students' Scratch Skills with Their Computational Thinking Skills in Terms of Different Variables

    ERIC Educational Resources Information Center

    Oluk, Ali; Korkmaz, Özgen

    2016-01-01

    This study aimed to compare 5th graders' scores obtained from Scratch projects developed in the framework of Information Technologies and Software classes via Dr Scratch web tool with the scores obtained from Computational Thinking Levels Scale and to examine this comparison in terms of different variables. Correlational research model was…

  18. The Coded Schoolhouse: One-to-One Tablet Computer Programs and Urban Education

    ERIC Educational Resources Information Center

    Crooks, Roderic N.

    2016-01-01

    Using a South Los Angeles charter school of approximately 650 students operated by a non-profit charter management organization (CMO) as the primary field site, this two-year, ethnographic research project examines the implementation of a one-to-one tablet computer program in a public high school. This dissertation examines the variety of ways…

  19. Accident Prevention through Driving Skills Assessment and Interventions for Older Drivers: A Programmatic Research Project.

    ERIC Educational Resources Information Center

    Yee, Darlene; Melichar, Joseph F.

    A study examined the effectiveness of a new computer-based multiphasic approach to identifying "at-risk" older drivers and remediating deficits in their driving and traffic safety attitudes, knowledge, and skills. Data collected from 250 older drivers in 3 states were used to develop specifications for computer-based driver improvement modules.…

  20. Computer Simulation Lends New Insights Into Cyanide-Caused Cardiac Toxicity

    DTIC Science & Technology

    2004-12-01

    COMPUTER SIMULATION LENDS NEW INSIGHTS INTO CYANIDE-CAUSED CARDIAC TOXICITY. C.K. Zoltani*, U.S. Army Research Laboratory, Computational and... The disequilibrium in the membrane currents caused by the cyanide has grave implications for the cell’s electrophysiology. CN-caused cardiac toxicity shares…

  1. Zooniverse - Real science online with more than a million people. (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lynn, S.; Lintott, C.; Whyte, L.; Borden, K. A.

    2013-12-01

    The Zooniverse (zooniverse.org) began in 2007 with the launch of Galaxy Zoo, a project in which more than 175,000 people provided shape analyses of more than 1 million galaxy images sourced from the Sloan Digital Sky Survey. These galaxy 'classifications', some 60 million in total, have since been used to produce more than 50 peer-reviewed publications based not only on the original research goals of the project but also because of serendipitous discoveries made by the volunteer community. Based upon the success of Galaxy Zoo the team have gone on to develop more than 25 web-based citizen science projects, all with a strong research focus in a range of subjects from astronomy to zoology where human-based analysis still exceeds that of machine intelligence. Over the past 6 years Zooniverse projects have collected more than 300 million data analyses from over 1 million volunteers providing fantastically rich datasets for not only the individuals working to produce research from their project but also the machine learning and computer vision research communities. This talk will focus on the core 'method' by which Zooniverse projects are developed and lessons learned by the Zooniverse team developing citizen science projects across a range of disciplines.

  2. JPL basic research review. [research and advanced development

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Current status, projected goals, and results of 49 research and advanced development programs at the Jet Propulsion Laboratory are reported in abstract form. Areas of investigation include: aerodynamics and fluid mechanics, applied mathematics and computer sciences, environment protection, materials science, propulsion, electric and solar power, guidance and navigation, communication and information sciences, general physics, and chemistry.

  3. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

    The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  4. An Object-Oriented Software Reuse Tool

    DTIC Science & Technology

    1989-04-01

    Cambridge, MA 02139. Controlling office: Advanced Research Projects Agency, 1400 Wilson Blvd., April 1989; Office of Naval Research, Information Systems, Arlington, VA 22217. Unclassified. Distribution: Defense Technical Information Center; Computer Sciences Division, ONR, Code 1133; Navy Center for Applied Research in Artificial…

  5. 1985 Annual Technical Report: A Research Program in Computer Technology. July 1984--June 1985.

    ERIC Educational Resources Information Center

    University of Southern California, Marina del Rey. Information Sciences Inst.

    Summaries of research performed by the Information Sciences Institute at the University of Southern California for the U.S. Department of Defense Advanced Research Projects Agency in 17 areas are provided in this report: (1) Common LISP framework, an exportable version of the Formalized Software Development (FSD) testbed; (2) Explainable Expert…

  6. GREMEX- GODDARD RESEARCH AND ENGINEERING MANAGEMENT EXERCISE SIMULATION SYSTEM

    NASA Technical Reports Server (NTRS)

    Vaccaro, M. J.

    1994-01-01

    GREMEX is a man-machine management simulation game of a research and development project. It can be used to depict a project from just after the development of the project plan through the final construction phase. The GREMEX computer programs are basically a program evaluation and review technique (PERT) reporting system. In the usual PERT program, the operator inputs each month the amount of work performed on each activity and the computer does the bookkeeping to determine the expected completion date of the project. GREMEX automatically assumes that all activities due to be worked in the current month will be worked. GREMEX predicts new durations (and costs) each month based on management actions taken by the players and the contractor's abilities. Each activity is assigned the usual cost and duration estimates but must also be assigned three parameters that relate to the probability that the time estimate is correct, the probability that the cost estimate is correct, and the probability of technical success. Management actions usually can be expected to change these probabilities. For example, use of overtime or double shifts in research and development work will decrease duration and increase cost by known proportions and will also decrease the probability of technical success due to an increase in the likelihood of accidents or mistakes. This re-estimation of future events and assignment of probability factors gives life to the model. GREMEX is not a production tool for project management. GREMEX is a game that can be used to train management personnel in the administration of research and development type projects. GREMEX poses no 'best way' to manage a project. The emphasis of GREMEX is to expose participants to many of the factors involved in decision making when managing a project in a government research and development environment.
A management team can win the game by surpassing cost, schedule, and technical performance goals established when the simulation began. The serious management experimenter can use GREMEX to explore the results of management methods they could not risk in real life. GREMEX can operate with any research and development type project with up to 15 subcontractors and produces reports simulating monthly or quarterly updates of the project PERT network. Included with the program is a data deck for simulation of a fictitious spacecraft project. Instructions for substituting other projects are also included. GREMEX is written in FORTRAN IV for execution in the batch mode and has been implemented on an IBM 360 with a central memory requirement of approximately 350K (decimal) of 8 bit bytes. The GREMEX system was developed in 1973.
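    The monthly re-estimation step described above can be sketched in a few lines. This is a hypothetical illustration in Python, not the original FORTRAN IV code; the adjustment factors (0.8, 1.25, 0.95) are invented for the example and stand in for the "known proportions" GREMEX applies.

```python
# Hypothetical sketch of GREMEX-style re-estimation: a management action
# (here, overtime) shortens an activity's duration, raises its cost, and
# lowers its probability of technical success by fixed proportions.
# The specific factors below are illustrative, not taken from GREMEX.

def reestimate(duration, cost, p_success, overtime=False):
    """Return the new (duration, cost, p_success) for one activity."""
    if overtime:
        duration *= 0.8    # double shifts finish the work sooner
        cost *= 1.25       # overtime pay increases cost
        p_success *= 0.95  # rushed work raises the chance of mistakes
    return duration, cost, p_success

# One activity estimated at 10 months, $50,000, 90% chance of success:
d, c, p = reestimate(10.0, 50_000.0, 0.90, overtime=True)
```

    A PERT bookkeeper would then recompute the project's expected completion date each simulated month from the updated activity durations.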

  7. Evaluative studies in nuclear medicine research: emission computed tomography assessment. Final report, January 1-December 31, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potchen, E.J.; Harris, G.I.; Gift, D.A.

    The report provides information on an assessment of the potential short and long term benefits of emission computed tomography (ECT) in biomedical research and patient care. Work during the past year has been augmented by the development and use of an opinion survey instrument to reach a wider representation of knowledgeable investigators and users of this technology. This survey instrument is reproduced in an appendix. Information derived from analysis of the opinion survey, and used in conjunction with results of independent staff studies of available sources, provides the basis for the discussions given in following sections of PET applications in the brain, of technical factors, and of economic implications. Projections of capital and operating costs on a per study basis were obtained from a computerized, pro forma accounting model and are compared with the survey cost estimates for both research and clinical modes of application. The results of a cash-flow model analysis of the relationship between projected economic benefit of PET research to disease management and the costs associated with such research are presented and discussed.

  8. M/A-COM linkabit eastern operations

    NASA Astrophysics Data System (ADS)

    Mills, D. L.; Avramovic, Z.

    1983-03-01

    This first Quarterly Project Report on LINKABIT's contribution to the Defense Advanced Research Projects Agency (DARPA) Internet Program covers the period from 22 December 1982 through 21 March 1983. LINKABIT's support of the Internet Program is concentrated in the areas of protocol design, implementation, testing, and evaluation. In addition, LINKABIT staff are providing integration and support services for certain computer systems to be installed at DARPA sites in Washington, D.C., and Stuttgart, West Germany. During the period covered by this report, LINKABIT organized the project activities and established staff responsibilities. Several computers and peripheral devices were made available from Government sources for use in protocol development and network testing. Considerable time was devoted to installing this equipment, integrating the software, and testing it with the Internet system.

  9. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1975-01-01

    Principal water resources users were surveyed to determine the impact of remote data streams on hydrologic computer models. Analysis of responses demonstrated that: most water resources effort suitable to remote sensing inputs is conducted through federal agencies or through federally stimulated research; and, most hydrologic models suitable to remote sensing data are federally developed. Computer usage by major water resources users was analyzed to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era.

  10. Generic Divide and Conquer Internet-Based Computing

    NASA Technical Reports Server (NTRS)

    Radenski, Atanas; Follen, Gregory J. (Technical Monitor)

    2001-01-01

    The rapid growth of internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of new, internet-oriented software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this research project is to contribute to better understanding of the transition to internet-based high-performance computing and to develop solutions for some of the difficulties of this transition. More specifically, our goal is to design an architecture for generic divide and conquer internet-based computing, to develop a portable implementation of this architecture, to create an example library of high-performance divide-and-conquer computing agents that run on top of this architecture, and to evaluate the performance of these agents. We have been designing an architecture that incorporates a master task-pool server and utilizes satellite computational servers that operate on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. Our designed architecture is intended to be complementary to and accessible from computational grids such as Globus, Legion, and Condor. Grids provide remote access to existing high-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end internet nodes. Our project is focused on a generic divide-and-conquer paradigm and its applications that operate on a loose and ever changing pool of lower-end internet nodes.
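    The master task-pool pattern described above can be sketched in a few lines. This is a minimal single-process illustration in Python and assumes nothing about the project's actual architecture: the loop body stands in for satellite workers that repeatedly pull subtasks from the master's pool, split them, or solve them directly.

```python
# Minimal, illustrative divide-and-conquer task pool: a task is a half-open
# range (lo, hi) whose integers are to be summed. Large tasks are divided
# and pushed back into the pool; small ones are conquered directly.

from queue import Queue

def divide(task):
    """Split a range in half, or return None if it is small enough to solve."""
    lo, hi = task
    if hi - lo <= 4:
        return None
    mid = (lo + hi) // 2
    return (lo, mid), (mid, hi)

def conquer(task):
    lo, hi = task
    return sum(range(lo, hi))

pool = Queue()                 # stands in for the master task-pool server
pool.put((0, 100))
total = 0
while not pool.empty():        # each iteration stands in for one worker step
    task = pool.get()
    split = divide(task)
    if split is None:
        total += conquer(task)
    else:
        for sub in split:
            pool.put(sub)

# total now equals sum(range(0, 100)) == 4950
```

    In the distributed setting sketched by the abstract, the pool would live on the master server and each `get`/`put` would be a network exchange with a volunteer node.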

  11. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications will illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  12. Meaning of Interior Tomography

    PubMed Central

    Wang, Ge; Yu, Hengyong

    2013-01-01

    The classic imaging geometry for computed tomography is for collection of un-truncated projections and reconstruction of a global image, with the Fourier transform as the theoretical foundation that is intrinsically non-local. Recently, interior tomography research has led to theoretically exact relationships between localities in the projection and image spaces and practically promising reconstruction algorithms. Initially, interior tomography was developed for x-ray computed tomography. Then, it has been elevated as a general imaging principle. Finally, a novel framework known as “omni-tomography” is being developed for grand fusion of multiple imaging modalities, allowing tomographic synchrony of diversified features. PMID:23912256

  13. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources.
Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as, distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons-learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet wide-range needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
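
    The abstract's notion of "dynamically chainable" geospatial Web services can be illustrated with a minimal sketch: each processing step is a small function that consumes and returns a data dictionary, and a chain is an ordered composition of steps. All names and data shapes below are illustrative assumptions, not part of the actual GeoBrain API.

```python
# Sketch of dynamically chainable geoprocessing steps (illustrative only).

def subset_region(data, bbox):
    """Keep only points inside the bounding box (min_x, min_y, max_x, max_y)."""
    min_x, min_y, max_x, max_y = bbox
    points = [p for p in data["points"]
              if min_x <= p[0] <= max_x and min_y <= p[1] <= max_y]
    return {**data, "points": points}

def compute_mean(data, field_index):
    """Reduce the point set to the mean of one attribute column."""
    values = [p[field_index] for p in data["points"]]
    return {**data, "mean": sum(values) / len(values) if values else None}

def chain(*steps):
    """Compose processing steps into a single callable pipeline."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run

pipeline = chain(
    lambda d: subset_region(d, bbox=(0, 0, 10, 10)),
    lambda d: compute_mean(d, field_index=2),
)

result = pipeline({"points": [(1, 2, 5.0), (3, 4, 7.0), (50, 60, 9.0)]})
print(result["mean"])  # mean of the two in-region attribute values -> 6.0
```

    In a real deployment each step would be a remote Web service call rather than a local function, but the composition pattern is the same.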

  14. The research and application of green computer room environmental monitoring system based on internet of things technology

    NASA Astrophysics Data System (ADS)

    Wei, Wang; Chongchao, Pan; Yikai, Liang; Gang, Li

    2017-11-01

    With the rapid development of information technology, the scale of data centers is increasing quickly, and the energy consumption of computer rooms is rising with it; air-conditioning cooling accounts for a large proportion of that consumption. How to apply new technology to reduce the energy consumption of the computer room has become an important topic in current energy-saving research. This paper studies Internet of Things technology and designs a green computer room environmental monitoring system. The system collects real-time environmental data through wireless sensor network technology and presents it in a three-dimensional visualization, including a computer room assets view, temperature cloud view, humidity cloud view, and microenvironment view. Based on the condition of the microenvironment, the air volume, temperature, and humidity parameters of the air conditioning can be adjusted for individual equipment cabinets to achieve precise air-conditioning refrigeration. This reduces the energy consumption of the air conditioning and, as a result, greatly reduces the overall energy consumption of the green computer room. We deployed this system in the computer center of Weihai, and after a year of testing and operation it achieved a strong energy-saving effect, verifying the effectiveness of the project for computer room energy conservation.
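
    The per-cabinet adjustment described above can be sketched as a simple control loop: each cabinet's microenvironment reading drives its own cooling setpoint instead of cooling the whole room to the worst case. The thresholds, step size, and field names below are assumptions for illustration, not values from the paper.

```python
# Hypothetical per-cabinet cooling control sketch (values are assumptions).

TARGET_TEMP_C = 24.0   # desired cabinet inlet temperature
DEADBAND_C = 1.0       # no adjustment within this band

def adjust_setpoint(current_setpoint_c, measured_temp_c):
    """Nudge one cabinet's cooling setpoint toward the target temperature."""
    error = measured_temp_c - TARGET_TEMP_C
    if abs(error) <= DEADBAND_C:
        return current_setpoint_c          # inside deadband: leave alone
    step = 0.5 if error > 0 else -0.5      # cool more when too hot
    return current_setpoint_c - step       # lower setpoint => more cooling

readings = {"cabinet-01": 27.2, "cabinet-02": 23.8, "cabinet-03": 21.0}
setpoints = {name: 22.0 for name in readings}
new_setpoints = {name: adjust_setpoint(setpoints[name], t)
                 for name, t in readings.items()}
print(new_setpoints)
```

    A hot cabinet gets a lower setpoint (more cooling), an over-cooled one gets a higher setpoint, and cabinets inside the deadband are left untouched, which is where the energy saving comes from.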

  15. Answering the big questions in neuroscience: DoD's experimental research wing takes on massive, high-risk projects.

    PubMed

    Mertz, Leslie

    2012-01-01

    When the Defense Advanced Research Projects Agency (DARPA) asks research questions, it goes big. This is, after all, the same agency that put together teams of scientists and engineers to find a way to connect the world's computers and, in doing so, developed the precursor to the Internet. DARPA, the experimental research wing of the U.S. Department of Defense, funds the types of research queries that scientists and engineers dream of tackling. Unlike a traditional granting agency that conservatively metes out its funding, and only to projects with a good chance of success, DARPA puts its money on massive, multi-institutional projects that have no guarantees but enormous potential. In the 1990s, DARPA began its biological and medical science research to improve the safety, health, and well-being of military personnel, according to DARPA program manager and Army Colonel Geoffrey Ling, Ph.D., M.D. More recently, DARPA has entered the realm of neuroscience and neurotechnology. Its focus with these projects is on its prime customer, the U.S. Department of Defense, but Ling acknowledged that technologies developed in its programs "certainly have potential to cascade into civilian uses."

  16. 77 FR 74543 - Federal Advisory Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    .../411593 . Dial-in: After you've connected your computer, audio connection instructions will be presented... the status of current research projects. FOR FURTHER INFORMATION CONTACT: The meeting is open to the...

  17. Computational Design of Functional Ca-S-H and Oxide-Doped Alloy Systems

    NASA Astrophysics Data System (ADS)

    Yang, Shizhong; Chilla, Lokeshwar; Yang, Yan; Li, Kuo; Wicker, Scott; Zhao, Guang-Lin; Khosravi, Ebrahim; Bai, Shuju; Zhang, Boliang; Guo, Shengmin

    Computer-aided functional materials design accelerates the discovery of novel materials. This presentation covers our recent research advances in predicting the properties of the Ca-S-H system and in simulating and experimentally validating the properties of oxide-doped high-entropy alloys. Several recently developed computational materials design methods were applied to predict the physical and chemical properties of the two systems. A comparison of the simulation results with the corresponding experimental data will be presented. This research is partially supported by the NSF CIMM project (OIA-15410795 and the Louisiana BoR), NSF HBCU Supplement climate change and ecosystem sustainability subproject 3, and LONI high-performance computing time allocation loni mat bio7.

  18. PubMed on Tap: discovering design principles for online information delivery to handheld computers.

    PubMed

    Hauser, Susan E; Demner-Fushman, Dina; Ford, Glenn; Thoma, George R

    2004-01-01

    Online access to biomedical information from handheld computers will be a valuable adjunct to other popular medical applications if information delivery systems are designed with handheld computers in mind. The goal of this project is to discover design principles that facilitate practitioners' access to online medical information at the point of care. A prototype system was developed to serve as a testbed for this research. Using the testbed, an initial evaluation has yielded several user interface design principles. Continued research is expected to discover additional user interface design principles as well as guidelines for results organization and system performance.

  19. Machine vision for real time orbital operations

    NASA Technical Reports Server (NTRS)

    Vinz, Frank L.

    1988-01-01

    Machine vision for automation and robotic operation of Space Station era systems has the potential to increase the efficiency of orbital servicing, repair, assembly, and docking tasks. A machine vision research project is described in which a TV camera is used for inputting visual data to a computer so that image processing may be achieved for real-time control of these orbital operations. This research has produced a technique that reduces computer memory requirements and greatly increases typical computational speed, such that it has the potential for development into a real-time orbital machine vision system. The technique is called AI BOSS (Analysis of Images by Box Scan and Syntax).

  20. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    NASA Astrophysics Data System (ADS)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.
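
    The multi-tiered triage idea can be sketched as a classifier that inspects an experiment's log text and attributes the failure to one tier (scheduler, permissions, search-algorithm code, or model code). The tier names follow the abstract; the log patterns are invented here purely for illustration.

```python
# Illustrative multi-tiered failure triage for batches of HPC experiments.
# Pattern strings are assumptions, not taken from the described framework.

TIER_PATTERNS = [
    ("scheduler",   ["job killed", "walltime exceeded", "node failure"]),
    ("permissions", ["permission denied", "access control"]),
    ("search_code", ["optimizer exception", "nan objective"]),
    ("model_code",  ["simulation diverged", "segmentation fault"]),
]

def classify_failure(log_text):
    """Return the first tier whose pattern appears in the log, else 'unknown'."""
    lowered = log_text.lower()
    for tier, patterns in TIER_PATTERNS:
        if any(p in lowered for p in patterns):
            return tier
    return "unknown"

print(classify_failure("ERROR: walltime exceeded on node c042"))   # scheduler
print(classify_failure("open(): Permission denied for results/"))  # permissions
```

    Running such a check over all experiment logs turns ad-hoc inspection of hundreds of thousands of runs into a sorted summary of failures per tier, which is the practical benefit the abstract claims.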

  1. Connecting Biology and Organic Chemistry Introductory Laboratory Courses through a Collaborative Research Project

    ERIC Educational Resources Information Center

    Boltax, Ariana L.; Armanious, Stephanie; Kosinski-Collins, Melissa S.; Pontrello, Jason K.

    2015-01-01

    Modern research often requires collaboration of experts in fields such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an…

  2. Critical Field Experiments on Uses of Scientific and Technical Information.

    ERIC Educational Resources Information Center

    Rubenstein, Albert H.; And Others

    Research in the field of "information-seeking behavior of scientists and engineers" has been done on the behavior and preferences of researchers with respect to technical literature, computer-based information systems, and other scientific and technical information (STI) systems and services. The objectives of this project are: (1) to…

  3. Postdoctoral Fellowship Program in Educational Research. Final Technical Report.

    ERIC Educational Resources Information Center

    Morgan, William P.

    During his postdoctoral fellowship year, Dr. Morgan took formal course work in computer programming, advanced research design, projective techniques, the physiology of aging, and hypnosis. He also attended weekly seminars in the Institute of Environmental Stress and conducted an investigation entitled "The Alteration of Perceptual and Metabolic…

  4. Minesweeper and Hypothetical Thinking Action Research & Pilot Study

    ERIC Educational Resources Information Center

    Walker, Jacob J.

    2010-01-01

    This Action Research project and Pilot Study was designed and implemented to improve students' hypothetical thinking abilities by exploring the possibility that learning and playing the computer game Minesweeper may inherently help improve hypothetical thinking. One objective was to use educational tools to make it easier for students to learn the…

  5. The Evolution of Untethered Communications.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    In response to a request from the Defense Advanced Research Projects Agency (DARPA), the Computer Science and Telecommunications Board (CSTB) of the National Research Council initiated a one-year study on untethered communications in July 1996. To carry out the study, the CSTB appointed a committee of 15 wireless-technology experts, including…

  6. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  7. Research on Mobile Learning Activities Applying Tablets

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Juskeviciene, Anita; Bireniene, Virginija

    2015-01-01

    The paper aims to present current research on mobile learning activities in Lithuania while implementing flagship EU-funded CCL project on application of tablet computers in education. In the paper, the quality of modern mobile learning activities based on learning personalisation, problem solving, collaboration, and flipped class methods is…

  8. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  9. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows and a boundary layer, initial use of Euler coordinates using supercomputers to automatically generate a grid, research and development on Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvements in the range of CFD usage are dependent on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Re-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind tunnel trials. Account can be taken of the wall effects and the Re in any flight regime during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  10. Computational methods and software systems for dynamics and control of large space structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.

    1990-01-01

    This final report on computational methods and software systems for dynamics and control of large space structures covers progress to date, projected developments in the final months of the grant, and conclusions. Pertinent reports and papers that have not appeared in scientific journals (or have not yet appeared in final form) are enclosed. The grant has supported research in two key areas of crucial importance to the computer-based simulation of large space structure. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area, as reported here, involves massively parallel computers.

  11. GSDC: A Unique Data Center in Korea for HEP research

    NASA Astrophysics Data System (ADS)

    Ahn, Sang-Un

    2017-04-01

    Global Science experimental Data hub Center (GSDC) at Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea, established to promote fundamental research fields by supporting them with expertise in Information and Communication Technology (ICT) and infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC), and networking. GSDC has supported various research fields in South Korea dealing with large-scale data, e.g. the RENO experiment for neutrino research, the LIGO experiment for gravitational wave detection, a genome sequencing project for biomedicine, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for the ALICE experiment using the LHC at CERN since 2013. In this talk, we present an overview of the computing infrastructure that GSDC runs for these research fields and discuss the data center infrastructure management system deployed at GSDC.

  12. Building place-based collaborations to develop high school students' groundwater systems knowledge and decision-making capacity

    NASA Astrophysics Data System (ADS)

    Podrasky, A.; Covitt, B. A.; Woessner, W.

    2017-12-01

    The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+ Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data-based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data-based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among University scientists, researchers and educators, high school teachers and agency and industry scientists and engineers.
    In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data-based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real world issues such as groundwater contamination. These characteristics include that science education experiences are real, responsive/accessible and rigorous.

  13. Motivation, description, and summary status of geomechanical and geochemical modeling studies in Task D of the International DECOVALEX-THMC Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Barr, D.; Rutqvist, J.

    2005-11-15

    The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary, interactive, and cooperative research on modelling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.

  14. NASA/DoD Aerospace Knowledge Diffusion Research Project. XXXIII - Technical communications practices and the use of information technologies as reported by Dutch and U.S. aerospace engineers

    NASA Technical Reports Server (NTRS)

    Barclay, Rebecca O.; Pinelli, Thomas E.; Tan, Axel S. T.; Kennedy, John M.

    1993-01-01

    As part of Phase 4 of the NASA/DOD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communications practices of Dutch and U.S. aerospace engineers and scientists. A self-administered questionnaire was distributed to aerospace engineers and scientists at the National Aerospace Laboratory (The Netherlands), NASA Ames Research Center (U.S.), and NASA Langley Research Center (U.S.). This paper presents responses of the Dutch and U.S. participants to selected questions about four of the seven project objectives: determining the importance of technical communications to aerospace engineering professionals, investigating the production of technical communications, examining the use and importance of computer and information technology, and exploring the use of electronic networks.

  15. Molecular robots with sensors and intelligence.

    PubMed

    Hagiya, Masami; Konagaya, Akihiko; Kobayashi, Satoshi; Saito, Hirohide; Murata, Satoshi

    2014-06-17

    CONSPECTUS: What we can call a molecular robot is a set of molecular devices such as sensors, logic gates, and actuators integrated into a consistent system. The molecular robot is supposed to react autonomously to its environment by receiving molecular signals and making decisions by molecular computation. Building such a system has long been a dream of scientists; however, despite extensive efforts, systems having all three functions (sensing, computation, and actuation) have not been realized yet. This Account introduces an ongoing research project that focuses on the development of molecular robotics funded by MEXT (Ministry of Education, Culture, Sports, Science and Technology, Japan). This 5-year project started in July 2012 and is titled "Development of Molecular Robots Equipped with Sensors and Intelligence". The major issues in the field of molecular robotics all correspond to a feedback (i.e., plan-do-see) cycle of a robotic system. More specifically, these issues are (1) developing molecular sensors capable of handling a wide array of signals, (2) developing amplification methods of signals to drive molecular computing devices, (3) accelerating molecular computing, (4) developing actuators that are controllable by molecular computers, and (5) providing bodies of molecular robots encapsulating the above molecular devices, which implement the conformational changes and locomotion of the robots. In this Account, the latest contributions to the project are reported. There are four research teams in the project that specialize in sensing, intelligence, amoeba-like actuation, and slime-like actuation, respectively. The molecular sensor team is focusing on the development of molecular sensors that can handle a variety of signals. This team is also investigating methods to amplify signals from the molecular sensors.
The molecular intelligence team is developing molecular computers and is currently focusing on a new photochemical technology for accelerating DNA-based computations. They also introduce novel computational models behind various kinds of molecular computers necessary for designing such computers. The amoeba robot team aims at constructing amoeba-like robots. The team is trying to incorporate motor proteins, including kinesin and microtubules (MTs), for use as actuators implemented in a liposomal compartment as a robot body. They are also developing a methodology to link DNA-based computation and molecular motor control. The slime robot team focuses on the development of slime-like robots. The team is evaluating various gels, including DNA gel and BZ gel, for use as actuators, as well as the body material to disperse various molecular devices in it. They also try to control the gel actuators by DNA signals coming from molecular computers.

  16. Real-time dynamics and control strategies for space operations of flexible structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alvin, K. F.; Alexander, S.

    1993-01-01

    This project (NAG9-574) was intended to be a three-year research project. However, due to NASA's reorganizations during 1992, the project was funded for only one year. Accordingly, every effort was made to prepare the present final report as if the project had been planned for a one-year duration. Originally, during the first year we planned to accomplish the following: start with a three-dimensional flexible manipulator beam with articulated joints and a linear control-based controller applied at the joints; then, using this simple example, design the software systems requirements for real-time processing, introduce the streamlining of various computational algorithms, perform the necessary reorganization of the partitioned simulation procedures, and assess the potential speed-up of the solution process by parallel computations. The three reports included as part of the final report address: the streamlining of various computational algorithms; the necessary reorganization of the partitioned simulation procedures, in particular the observer models; and an initial attempt at reconfiguring the flexible space structures.

  17. A Computational Study of the Flow Physics of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    2006-01-01

    The present investigation is a continuation of a previous joint project between Florida State University and the NASA Langley Research Center Liner Physics Team. In the previous project, a two-dimensional study of acoustic liners inside a normal-incidence impedance tube was carried out. The study consisted of two parts. The NASA team was responsible for the experimental part of the project, which involved performing measurements in an impedance tube with a large-aspect-ratio slit resonator. The FSU team was responsible for the computational part, which involved performing direct numerical simulation (DNS) of the NASA experiment in two dimensions using CAA methodology. It was agreed that, upon completion of the numerical simulation, the computed values of the liner impedance would be sent to NASA for validation against experimental results. Following this procedure, good agreement was found between numerical results and experimental measurements over a wide range of frequencies and sound pressure levels. Broadband incident sound waves were also simulated numerically and measured experimentally, with good overall agreement.

  18. Researching the Internet in a Writing Class: A Writing Teacher's Role and a Computer Specialist's Role.

    ERIC Educational Resources Information Center

    Anstendig, Linda; Meyer, Jeanine

    An Internet research project was undertaken by a class of college honors students to see how effectively the Internet could be used for genuine research purposes. The class consisted of 16 students, a mix of freshmen, sophomores, and juniors, enrolled in an advanced writing course whose focus was different forms of research: I-Search, ethnography,…

  19. From research to self-reflection: learning about ourselves as academics through a support group's resistance to our intervention.

    PubMed

    Scherr, Courtney Lynam; Mattson, Marifran

    2012-01-01

    Purdue University's Center for Healthcare Engineering developed a computer-assisted technology hub (CATHUB) designed to aid individuals with disabilities. Upon realizing the lack of input from the very individuals they were trying to help, Marifran approached the developers of CATHUB and offered to engage a group of amputees to aid in the design and implementation of the hub. In this essay, Courtney and Marifran recount, each from their own perspective, their experiences working with Amputees in Action as participants in their research project. Ultimately the researchers discovered their research agenda was not compatible with the amputees' needs, resulting in enlightened self-reflection by the researchers and abandonment of the research project.

  20. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.
