Sample records for computer science volume

  1. A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…

  2. A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…

  3. Computers in Life Science Education. Volume 5, 1988.

    ERIC Educational Resources Information Center

    Computers in Life Science Education, 1988

    1988-01-01

    Designed to serve as a means of communication among life science educators who anticipate or are currently using microcomputers as an educational tool, this volume of newsletters provides background information and practical suggestions on computer use. Over 80 articles are included. Topic areas include: (1) using a personal computer in a plant…

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…

  5. NASA Tech Briefs, August 1993. Volume 17, No. 8

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Computer Graphics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  6. NASA Tech Briefs, March 1993. Volume 17, No. 3

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;

  7. NASA Tech Briefs, August 1994. Volume 18, No. 8

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered include: Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  8. NASA Tech Briefs, December 1993. Volume 17, No. 12

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics covered include: High-Performance Computing; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  9. NASA Tech Briefs, March 1994. Volume 18, No. 3

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  10. NASA Tech Briefs, March 2000. Volume 24, No. 3

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  11. NASA Tech Briefs, March 1997. Volume 21, No. 3

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  12. Computers in Life Science Education. Volumes 1 through 4, 1984-1987.

    ERIC Educational Resources Information Center

    Modell, Harold, Ed.

    1987-01-01

    Designed to serve as a means of communication among life science educators who anticipate or are currently using microcomputers as an educational tool, these four volumes of newsletters provide background information and practical suggestions on computer use in over 80 articles. Topic areas include: (1) teaching physiology and other life sciences…

  13. NASA Tech Briefs, February 2000. Volume 24, No. 2

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics covered include: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Mathematics and Information Sciences; Computers and Peripherals.

  14. Studies in Mathematics, Volume 22. Studies in Computer Science.

    ERIC Educational Resources Information Center

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  15. NASA Tech Briefs, July 1994. Volume 18, No. 7

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  16. NASA Tech Briefs, October 1994. Volume 18, No. 10

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics: Data Acquisition and Analysis; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports

  17. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This report covers the Project's third year of implementation in 2007-2008, the final year, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., in an effort to promote research and research training programs in computational science and scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science and scientific visualization, and the magnitude of resources available, are enormous, permitting a variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these resources.

  18. NASA Tech Briefs, July 2000. Volume 24, No. 7

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics covered include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  19. NASA Tech Briefs, June 1996. Volume 20, No. 6

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics: New Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  20. NASA Tech Briefs, September 1999. Volume 23, No. 9

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Topics discussed include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences;

  21. NASA Tech Briefs, June 1997. Volume 21, No. 6

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics include: Computer Hardware and Peripherals; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  22. NASA Tech Briefs, November 1999. Volume 23, No. 11

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Materials; Computer Programs; Mechanics; Machinery/Automation; Physical Sciences; Mathematics and Information Sciences; Books and Reports.

  23. NASA Tech Briefs, January 2000. Volume 24, No. 1

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Bio-Medical; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Information Sciences; Books and Reports.

  24. Proceedings: Computer Science and Data Systems Technical Symposium, volume 1

    NASA Technical Reports Server (NTRS)

    Larsen, Ronald L.; Wallgren, Kenneth

    1985-01-01

    Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form are included for topics in three categories: computer science, data systems and space station applications.

  25. NASA Tech Briefs, November 2000. Volume 24, No. 11

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Data Acquisition.

  26. NASA Tech Briefs, July 1995. Volume 19, No. 7

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Topics include: mechanical components, electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, book and reports, and a special section of Federal laboratory computing Tech Briefs.

  27. NASA Tech Briefs, August 2000. Volume 24, No. 8

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design.

  28. NASA Tech Briefs, August 1992. Volume 16, No. 8

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  29. NASA Tech Briefs, September 1992. Volume 16, No. 9

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  30. NASA Tech Briefs, January 1993. Volume 17, No. 1

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;

  31. NASA Tech Briefs, November 1992. Volume 16, No. 11

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;

  32. NASA Tech Briefs, December 1992. Volume 16, No. 12

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;

  33. Advanced Placement Computer Science (with Pascal). Teacher's Guide. Volume 1. Second Edition.

    ERIC Educational Resources Information Center

    Farkouh, Alice; And Others

    The purpose of this guide is to give teachers and supervisors a working knowledge of various approaches to enhancing pupil learning about computer science, particularly through the use of Pascal. It contains instructional units dealing with: (1) computer components; (2) computer languages; (3) compilers; (4) essential features of a Pascal program;…

  34. NASA Tech Briefs, October 1989. Volume 13, No. 10

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  35. NASA Tech Briefs, February 1990. Volume 14, No. 2

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  36. NASA Tech Briefs, January 1990. Volume 14, No. 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  37. NASA Tech Briefs, November 1989. Volume 13, No. 11

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  38. NASA Tech Briefs, September 1989. Volume 13, No. 9

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  39. NASA Tech Briefs, October 1992. Volume 16, No. 10

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics covered include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  40. NASA Tech Briefs, December 1989. Volume 13, No. 12

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  41. NASA Tech Briefs, April 1993. Volume 17, No. 4

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Optoelectronics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;

  42. NASA Tech Briefs, March 1990. Volume 14, No. 3

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  43. Science News of the Year.

    ERIC Educational Resources Information Center

    Science News, 1985

    1985-01-01

    Highlights important 1985 science stories appearing in "Science News" under these headings: anthropology and paleontology, astronomy, behavior, biology, biomedicine, chemistry, computers and mathematics, earth sciences, environment, physics, science and society, space sciences, and technology. Each entry includes the volume and page…

  44. Proceedings: Computer Science and Data Systems Technical Symposium, volume 2

    NASA Technical Reports Server (NTRS)

    Larsen, Ronald L.; Wallgren, Kenneth

    1985-01-01

    Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form, along with abstracts, are included for topics in three categories: computer science, data systems, and space station applications.

  45. NASA Tech Briefs, January 1989. Volume 13, No. 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  46. NASA Tech Briefs, June 1993. Volume 17, No. 6

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Imaging Technology; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  47. NASA Tech Briefs, November 1993. Volume 17, No. 11

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics covered: Advanced Manufacturing; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  48. NASA Tech Briefs, February 1993. Volume 17, No. 2

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Communication Technology; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  49. NASA Tech Briefs, January 1992. Volume 16, No. 1

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Fabrication; Mathematics and Information Sciences; Life Sciences;

  50. NASA Tech Briefs, May 1992. Volume 16, No. 5

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  51. NASA Tech Briefs, July 1992. Volume 16, No. 7

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  52. NASA Tech Briefs, March 1992. Volume 16, No. 3

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  53. NASA Tech Briefs, September 1994. Volume 18, No. 9

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics: Sensors; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  54. NASA Tech Briefs, June 2000. Volume 24, No. 6

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Computers and Peripherals;

  55. LASER Tech Briefs, Winter 1994. Volume 2, No. 1

    NASA Technical Reports Server (NTRS)

    Schnirring, Bill (Editor)

    1994-01-01

    Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, Life Sciences, and Books and Reports.

  56. NASA Tech Briefs, May 1993. Volume 17, No. 5

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Advanced Composites and Plastics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  57. NASA Tech Briefs, February 1992. Volume 16, No. 2

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics covered include: New Product Development; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  58. NASA Tech Briefs, July 1993. Volume 17, No. 7

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Data Acquisition and Analysis; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  59. NASA Tech Briefs, June 1992. Volume 16, No. 6

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics covered include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  60. NASA Tech Briefs, December 1994. Volume 18, No. 12

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  61. NASA Tech Briefs, January 1995. Volume 19, No. 1

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Topics include: Sensors; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports

  62. NASA Tech Briefs, May 1991. Volume 15, No. 5

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  63. NASA Tech Briefs, January 1991. Volume 15, No. 1

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  64. NASA Tech Briefs, September 1991. Volume 15, No. 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  65. NASA Tech Briefs, June 1990. Volume 14, No. 6

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  66. NASA Tech Briefs, August 1991. Volume 15, No. 8

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  67. NASA Tech Briefs, February 1991. Volume 15, No. 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  68. NASA Tech Briefs, March 1991. Volume 15, No. 3

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  69. NASA Tech Briefs, December 1990. Volume 14, No. 12

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  70. NASA Tech Briefs, June 1991. Volume 15, No. 6

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  71. NASA Tech Briefs, September 1993. Volume 17, No. 9

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Microelectronics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  72. NASA Tech Briefs, May 1990. Volume 14, No. 5

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  73. NASA Tech Briefs, January 1994. Volume 18, No. 1

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics include: Communications Technology; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  74. NASA Tech Briefs, November 1994. Volume 18, No. 11

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics: Advanced Manufacturing; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  75. NASA Tech Briefs, April 1991. Volume 15, No. 4

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  76. NASA Tech Briefs, October 1990. Volume 14, No. 10

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  77. NASA Tech Briefs, October 1991. Volume 15, No. 10

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  78. NASA Tech Briefs, September 1988. Volume 12, No. 8

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  79. NASA Tech Briefs, July/August 1988. Volume 12, No. 7

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  80. LASER Tech Briefs, Fall 1994. Volume 2, No. 4

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics in this issue of LASER Tech Briefs include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  81. NASA Tech Briefs, October 1988. Volume 12, No. 9

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  2. NASA Tech Briefs, July 1991. Volume 15, No. 7

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  3. NASA Tech Briefs, March 1987. Volume 11, No. 3

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  4. NASA Tech Briefs, May 1987. Volume 11, No. 5

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  5. NASA Tech Briefs, October 1987. Volume 11, No. 9

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  6. NASA Tech Briefs, June 1989. Volume 13, No. 6

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  7. NASA Tech Briefs, February 1987. Volume 11, No. 2

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  8. NASA Tech Briefs, January 1987. Volume 11, No. 2

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  9. NASA Tech Briefs, July 1990. Volume 14, No. 7

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  10. NASA Tech Briefs, August 1990. Volume 14, No. 8

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics covered: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  11. NASA Tech Briefs, April 1987. Volume 11, No. 4

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  12. NASA Tech Briefs, September 1987. Volume 11, No. 8

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  13. NASA Tech Briefs, June 1994. Volume 18, No. 6

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered include: Microelectronics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  14. NASA Tech Briefs, October 1996. Volume 20, No. 10

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered include: Sensors; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  15. NASA Tech Briefs, June 1987. Volume 11, No. 6

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  16. NASA Tech Briefs, August 1989. Volume 13, No. 8

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics covered: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  17. Computers in Life Science Education, Volume 7, Numbers 1-12.

    ERIC Educational Resources Information Center

    Computers in Life Science Education, 1990

    1990-01-01

    The 12 digests of Computers in Life Science Education from 1990 are presented. The articles found in chronological sequence are as follows: "The Computer as a Teaching Tool--How Far Have We Come? Where Are We Going?" (Modell); "Where's the Software--Part 1"; "Keeping Abreast of the Literature" (which appears quarterly); "Where's the Software--Part…

  18. NASA Tech Briefs, February 1997. Volume 2, No. 2

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics include: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  19. NASA Tech Briefs, November 1988. Volume 12, No. 10

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  20. NASA Tech Briefs, September/October 1986. Volume 10, No. 5

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  1. NASA Tech Briefs, November 1996. Volume 20, No. 11

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered: Video and Imaging; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  2. NASA Tech Briefs, December 1996. Volume 20, No. 12

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics: Design and Analysis Software; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  3. NASA Tech Briefs, May 1996. Volume 20, No. 5

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics include: Video and Imaging; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  4. NASA Tech Briefs, November/December 1986. Volume 10, No. 6

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  5. NASA Tech Briefs, October 1993. Volume 17, No. 10

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Topics include: Sensors; Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  6. NASA Tech Briefs, May 1994. Volume 18, No. 5

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered include: Robotics/Automation; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  7. NASA Tech Briefs, May/June 1986. Volume 10, No. 3

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics discussed include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  8. NASA Tech Briefs, September 1990. Volume 14, No. 9

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics covered include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  9. NASA Tech Briefs, November/December 1987. Volume 11, No. 10

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  10. NASA Tech Briefs, February 1994. Volume 18, No. 2

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered include: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports

  11. NASA Tech Briefs, March 1988. Volume 12, No. 3

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; and Life Sciences.

  12. NASA Tech Briefs, July 1996. Volume 20, No. 7

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered include: Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  13. NASA Tech Briefs, July/August 1987. Volume 11, No. 7

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Fabrication Technology; Machinery; Mathematics and Information Sciences; Life Sciences.

  14. NASA Tech Briefs, July 1997. Volume 21, No. 7

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics: Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Life Sciences.

  15. NASA Tech Briefs, December 1991. Volume 15, No. 12

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; and Mathematics and Information Sciences.

  16. NASA Tech Briefs, April 1997. Volume 21, No. 4

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics covered include: Video and Imaging; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  17. NASA Tech Briefs, March/April 1986. Volume 10, No. 2

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics covered include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  18. NASA Tech Briefs, October 1997. Volume 21, No. 10

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics covered include: Sensors/Imaging; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  19. NASA Tech Briefs, January 1988. Volume 12, No. 1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; and Life Sciences.

  20. NASA Tech Briefs, April 1994. Volume 18, No. 4

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Topics covered: Advanced Composites and Plastics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  1. NASA Tech Briefs, August 1996. Volume 20, No. 8

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered include: Graphics and Simulation; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  2. NASA Tech Briefs, November 1991. Volume 15, No. 11

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; and Mathematics and Information Sciences.

  3. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    NASA Technical Reports Server (NTRS)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  4. European Scientific Notes. Volume 35, Number 5,

    DTIC Science & Technology

    1981-05-31

    Mr. Y.S. Wu Information Systems ESN 35-5 (1981) COMPUTER Levrat himself is a fascinating Dan SCIENCE who took his doctorate at the University of...fascinating Computer Science Department reports for project on computer graphics. Text purposes of teaching and research di- processing by computer has...water batteries, of offshore winds and lighter support alkaline batteries, lead-acid systems, structures, will be carried out before metal/air batteries

  5. NASA Tech Briefs, July/August 1986. Volume 10, No. 4

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics include: NASA TU Services; New Product Ideas; Electronic Components and Circuits; Electronic Systems; Materials; Computer Programs; Mechanics; Physical Sciences; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences.

  6. NASA Tech Briefs, May 1997. Volume 21, No. 5

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics covered include: Advanced Composites, Plastics and Metals; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.

  7. NASA Tech Briefs, January 1998. Volume 22, No. 1

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics: Sensors/Data Acquisition; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Life Sciences; Books and Reports.

  8. NASA Tech Briefs, January 1997. Volume 21, No. 1

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics: Sensors; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  9. NASA Tech Briefs, April 1992. Volume 16, No. 4

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Topics covered include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  10. NASA Tech Briefs, June 1995. Volume 19, No. 6

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Topics include: communications technology, electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, life sciences, books and reports, and a special section of Laser Tech Briefs.

  11. NASA Tech Briefs, December 1997. Volume 21, No. 12

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics: Design and Analysis Software; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  12. NASA Tech Briefs, May 1988. Volume 12, No. 5

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  13. NASA Tech Briefs, November 1990. Volume 14, No. 11

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  14. NASA Tech Briefs, April 1990. Volume 14, No. 4

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Topics: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  15. NASA Tech Briefs, September 1997. Volume 21, No. 9

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics include: Data Acquisition and Analysis; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences.

  16. Science News of the Year.

    ERIC Educational Resources Information Center

    Science News, 1987

    1987-01-01

    Provides a review of science news stories reported in "Science News" during 1987. References each item to the volume and page number in which the subject was addressed. Contains references on astronomy, behavior, biology, biomedicine, chemistry, earth sciences, environment, mathematics and computers, paleontology and anthropology, physics, science…

  17. Floaters and Sinkers: Solutions for Math and Science. Densities and Volumes. Book 5.

    ERIC Educational Resources Information Center

    Wiebe, Arthur, Ed.; And Others

    Developed to serve as a way to integrate mathematics skills and science processes, this booklet provides activities which demonstrate the concept of density for students of grades five through nine. Investigations are offered on the densities of water, salt, salt water, and woods. Opportunities are also provided in computing volumes of cylinders…

  18. ESnet: Large-Scale Science and Data Management ( (LBNL Summer Lecture Series)

    ScienceCinema

    Johnston, Bill

    2017-12-09

    Summer Lecture Series 2004: Bill Johnston of Berkeley Lab's Computing Sciences is a distinguished networking and computing researcher. He managed the Energy Sciences Network (ESnet), a leading-edge, high-bandwidth network funded by DOE's Office of Science. Used for everything from videoconferencing to climate modeling, and flexible enough to accommodate a wide variety of data-intensive applications and services, ESNet's traffic volume is doubling every year and currently surpasses 200 terabytes per month.

  19. NASA Tech Briefs, February 1989. Volume 13, No. 2

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This issue contains a special feature on shaping the future with ceramics. Other topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; and Life Sciences.

  20. NASA Tech Briefs, June 1988. Volume 12, No. 6

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  1. Advances in engineering science, volume 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Papers are presented dealing with structural dynamics; structural synthesis; and the nonlinear analysis of structures, structural members, and composite structures and materials. Applications of mathematics and computer science are included.

  2. NASA Tech Briefs, April 1988. Volume 12, No. 4

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  3. NASA Tech Briefs, July 1989. Volume 13, No. 7

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Topics include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences.

  4. Global Journal of Computer Science and Technology. Volume 1.2

    ERIC Educational Resources Information Center

    Dixit, R. K.

    2009-01-01

    Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…

  5. Global Journal of Computer Science and Technology. Volume 9, Issue 5 (Ver. 2.0)

    ERIC Educational Resources Information Center

    Dixit, R. K.

    2010-01-01

    This is a special issue published in version 1.0 of "Global Journal of Computer Science and Technology." Articles in this issue include: (1) [Theta] Scheme (Orthogonal Milstein Scheme), a Better Numerical Approximation for Multi-dimensional SDEs (Klaus Schmitz Abe); (2) Input Data Processing Techniques in Intrusion Detection…

  6. NASA Tech Briefs, September 1996. Volume 20, No. 9

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics: Data Acquisition and Analysis; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  7. United States Air Force Summer Faculty Research Program (1987). Program Technical Report. Volume 1.

    DTIC Science & Technology

    1987-12-01

    Mechanical Engineering Specialty: Engineering Science Rose-Hulman Institute Assigned: APL 5500 Wabash Avenue - Terre Haute, IN 47803 (812) 877-1511 Dr...Professor/Director 1973 Dept. of Humanities Specialty: Literature/Language Rose-Hulman Inst. of Technology Assigned: HRL/LR 5500 Wabash Avenue - Terre...1976 Assistant Professor Specialty: Computer Science Dept. of Computer Science Assigned: AL Rose-Hulman Inst. of Technology 5500 Wabash Ave. Terre Haute

  8. LASER Tech Briefs, September 1993. Volume 1, No. 1

    NASA Technical Reports Server (NTRS)

    Schnirring, Bill (Editor)

    1993-01-01

    This edition of LASER Tech Briefs contains a feature on photonics. The other topics include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; and Books and Reports.

  9. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. The Telescience Testbed Pilot Program (TTPP) was aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, is the executive summary.

  10. NASA Tech Briefs, August 1997. Volume 21, No. 8

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics: Graphics and Simulation; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  11. NASA Tech Briefs, August 2002. Volume 26, No. 8

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Topics include: a technology focus on computers, electronic components and systems, software, materials, mechanics, machinery/automation, manufacturing, physical sciences, information sciences, books and reports, and Motion Control Tech Briefs.

  12. NASA Tech Briefs, March 1996. Volume 20, No. 3

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  13. Scientific Research in British Universities and Colleges 1969-70, Volume I, Physical Sciences.

    ERIC Educational Resources Information Center

    Department of Education and Science, London (England).

    This annual publication (1969-1970) contains brief statements about current research in the physical sciences being conducted at British universities and colleges. Areas included are chemistry, physics, engineering, biochemistry, biometry, biophysics, physical geography, mathematics, computing science, and history and philosophy of science. (CP)

  14. Journal of Undergraduate Research, Volume VIII, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stiner, K. S.; Graham, S.; Khan, M.

    The Journal of Undergraduate Research (JUR) provides undergraduate interns the opportunity to publish their scientific innovation and to share their passion for education and research with fellow students and scientists. Fields in which these students worked include: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Sciences; Materials Sciences; Medical and Health Sciences; Nuclear Sciences; Physics; Science Policy; and Waste Management.

  15. NASA Tech Briefs, August 1995. Volume 19, No. 8

    NASA Technical Reports Server (NTRS)

    1995-01-01

    There is a special focus on computer graphics and simulation in this issue. Topics covered include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; and Mathematics and Information Sciences. There is also a section on Laser Technology, which includes a feature on moving closer to the sun's power.

  16. NASA Tech Briefs, April 1989. Volume 13, No. 4

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A special feature of this issue is an article about the evolution of high technology in Texas. Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  17. NASA Tech Briefs, February 1988. Volume 12, No. 2

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered include: New Product Ideas; NASA TU Services; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Systems; and Life Sciences.

  18. NASA Tech Briefs, April 2000. Volume 24, No. 4

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics covered include: Imaging/Video/Display Technology; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Test and Measurement; Mathematics and Information Sciences; Books and Reports.

  19. NASA Tech Briefs, March 1989. Volume 13, No. 3

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This issue's special features cover the NASA inventor of the year and the other nominees for the year. Other topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

  20. NASA Tech Briefs, January 1996. Volume 20, No. 1

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This issue has a special focus on sensors and includes articles on Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery/Automation, Manufacturing/Fabrication, and Mathematics and Information Sciences.

  1. Journal of Undergraduate Research, Volume VI, 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faletra, P.; Schuetz, A.; Cherkerzian, D.

    Students who conducted research at DOE National Laboratories during 2005 were invited to include their research abstracts, and for a select few, their completed research papers in this Journal. This Journal is direct evidence of students collaborating with their mentors. Fields in which these students worked include: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Sciences; Materials Sciences; Medical and Health Sciences; Nuclear Sciences; Physics; and Science Policy.

  2. Research and Development in the Computer and Information Sciences. Volume 1, Information Acquisition, Sensing, and Input: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    The series, of which this is the initial report, is intended to give a selective overview of research and development efforts and requirements in the computer and information sciences. The operations of information acquisition, sensing, and input to information processing systems are considered in generalized terms. Specific topics include but are…

  3. Software Assurance Curriculum Project Volume 2: Undergraduate Course Outlines

    DTIC Science & Technology

    2010-08-01

    Contents: Acknowledgments; Abstract; 1. An Undergraduate Curriculum Focus on Software Assurance; 2. Computer Science I; 3. Computer Science II ... confidence that can be integrated into traditional software development and acquisition process models. Thus, in addition to a technology focus ... testing throughout the software development life cycle (SDLC); security and complexity: system development challenges and security failures

  4. NASA Tech Briefs, March 1998. Volume 22, No. 3

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics include: special coverage of computer-aided design and engineering; electronic components and circuits; electronic systems; physical sciences; materials; computer software; special coverage on mechanical technology; machinery/automation; manufacturing/fabrication; mathematics and information sciences; books and reports; and a special section of Electronics Tech Briefs. Profiles of the exhibitors at the National Design Engineering show are also included in this issue.

  5. NASA Tech Briefs, April 1996. Volume 20, No. 4

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered include: Advanced Composites and Plastics; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information; Books and Reports.

  6. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1986

    1986-01-01

    Describes 26 different activities, experiments, demonstrations, and computer simulations in various topics in science. Includes instructional activities dealing with mural ecology, surface area/volume ratios, energy transfer in ecosystems, electrochemical simulations, alternating and direct current, terminal velocity, measuring the size of the…

  7. BASIC Simulation Programs; Volumes I and II. Biology, Earth Science, Chemistry.

    ERIC Educational Resources Information Center

    Digital Equipment Corp., Maynard, MA.

    Computer programs which teach concepts and processes related to biology, earth science, and chemistry are presented. The seven biology problems deal with aspects of genetics, evolution and natural selection, gametogenesis, enzymes, photosynthesis, and the transport of material across a membrane. Four earth science problems concern climates, the…

  8. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  9. Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency

    DTIC Science & Technology

    2013-03-01

    assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998). Controllers ... Behavioral Sciences: Volume 1: Methodological Issues; Volume 2: Statistical Issues, 1, 257. Laudeman, I. V., Shelden, S. G., Branstrom, R., & Brasil

  10. Selected papers in the applied computer sciences 1992

    USGS Publications Warehouse

    Wiltshire, Denise A.

    1992-01-01

    This compilation of short papers reports on technical advances in the applied computer sciences. The papers describe computer applications in support of earth science investigations and research. This is the third volume in the series "Selected Papers in the Applied Computer Sciences." The topics addressed in the compilation are: integration of geographic information systems and expert systems for resource management; visualization of topography using digital image processing; development of a ground-water data base for the southeastern United States using a geographic information system; integration and aggregation of stream-drainage data using a geographic information system; procedures used in production of digital geologic coverage using compact disc read-only memory (CD-ROM) technology; and automated methods for producing a technical publication on estimated water use in the United States.

  11. NASA Tech Briefs, December 1995. Volume 19, No. 12

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Topics include: a special focus section on bio/medical technology, electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, books and reports, and a special section on Laser Tech Briefs.

  12. PREFACE: International Conference on Applied Sciences 2015 (ICAS2015)

    NASA Astrophysics Data System (ADS)

    Lemle, Ludovic Dan; Jiang, Yiwen

    2016-02-01

    The International Conference on Applied Sciences ICAS2015 took place in Wuhan, China on June 3-5, 2015 at the Military Economics Academy of Wuhan. The conference is organized regularly, alternately in Romania and in P.R. China, by Politehnica University of Timişoara, Romania, and the Military Economics Academy of Wuhan, P.R. China, with the joint aims of serving as a platform for the exchange of information between various areas of the applied sciences and of promoting communication between scientists of different nations, countries, and continents. The topics of the conference cover a comprehensive spectrum of issues. Economic Sciences and Defense: Management Sciences, Business Management, Financial Management, Logistics, Human Resources, Crisis Management, Risk Management, Quality Control, Analysis and Prediction, Government Expenditure, Computational Methods in Economics, Military Sciences, National Security, and others. Fundamental Sciences and Engineering: Interdisciplinary Applications of Physics, Numerical Approximation and Analysis, Computational Methods in Engineering, Metallic Materials, Composite Materials, Metal Alloys, Metallurgy, Heat Transfer, Mechanical Engineering, Mechatronics, Reliability, Electrical Engineering, Circuits and Systems, Signal Processing, Software Engineering, Data Bases, Modeling and Simulation, and others. The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge with applicability potential in Engineering, Economics, Defense, etc. There were 120 participants from 11 countries (China, Romania, Taiwan, Korea, Denmark, France, Italy, Spain, USA, Jamaica, and Bosnia and Herzegovina). During the three days of the conference, four invited and 67 oral talks were delivered. Based on the work presented at the conference, 38 selected papers have been included in this volume of IOP Conference Series: Materials Science and Engineering.
These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computer Engineering, and Electrical Engineering. It is our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields.

  13. The ADL Registry and CORDRA. Volume 1: General Overview

    DTIC Science & Technology

    2008-08-01

    and problems encountered by others in related fields, such as library science, computer and network systems design, and publishing. As ADL ... in and exist in isolated islands, limiting their visibility, access, and reuse. ... Compared to publishing and library science, the learning

  14. NASA Tech Briefs, May 1989. Volume 13, No. 5

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This issue contains a special feature on the flight station of the future, discussing future enhancements to aircraft cockpits. Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, and Mathematics and Information Sciences.

  15. The 159th national meeting of the American Association for the advancement of science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This volume contains the program/abstracts for the 1993 national meeting of the American Association for the Advancement of Science, held in Boston from 11-16 February 1993. Symposia dealt with works on the following topics: perspectives on human genetics; confronting AIDS; biology, cells, and bugs; medical research and society; social psychology and neuroscience; future chemistry, from carbon to silicon; measuring the matter and energy of the universe; earth's ever-changing atmosphere; causing and coping with environmental change; agricultural biotechnology, plant protection, and production; science and corporate enterprise; examining and reforming the economic system; science, ethics, and the law; communicating science to the public; information technology and the changing face of science; mathematics, concepts, and computations; international cooperation and human survival; science for everyone; science and religion, examining both; anthropology, dynamics of human history; international science issues; improving formal science education; and science education reform in America. Separate abstracts have been prepared for articles from this volume.

  16. Sixth New Zealand Computer Conference (Auckland 78). Volume I, Papers.

    ERIC Educational Resources Information Center

    New Zealand Computer Society, Auckland.

    This collection of conference presentations includes 23 papers on a variety of topics pertaining to the use of computers in New Zealand. Among the topics discussed are computer science techniques in a commercial data processing situation, data processing personnel and their careers, the communication aspects of an airline system, implementation of…

  17. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.

  18. NASA Tech Briefs, May 1995. Volume 19, No. 5

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This issue features a resource report on the Jet Propulsion Laboratory and a special focus on advanced composites and plastics. It also contains articles on electronic components and circuits, electronic systems, physical sciences, computer programs, mechanics, machinery, manufacturing and fabrication, mathematics and information sciences, and life sciences. This issue also contains a supplement on federal laboratory test and measurement.

  19. Communications among data and science centers

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The ability to electronically access and query the contents of remote computer archives is of singular importance in the space and earth sciences; this evaluation of the development status of such on-line information networks foresees swift expansion of their data capabilities and complexity, in view of the volumes of data that will continue to be generated by NASA missions. The U.S. National Space Science Data Center (NSSDC) manages NASA's largest science computer network, the Space Physics Analysis Network; a comprehensive account is given of the structure of NSSDC international access through BITNET and of connections to the NSSDC available in the Americas via the international X.25 network.

  20. Essential Autonomous Science Inference on Rovers (EASIR)

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; Shipman, Mark; Morris, Robert; Gazis, Paul; Pedersen, Liam

    2003-01-01

    Existing constraints on the time, computational, and communication resources associated with Mars rover missions suggest that on-board science evaluation of sensor data can help decrease human-directed operational planning, optimize returned science data volumes, and recognize unique or novel data, all of which act to increase the scientific return from a mission. Many different levels of science autonomy exist, and each affects the data collected and returned by, and the activities of, rovers. Several computational algorithms, designed to recognize objects of interest to geologists and biologists, are discussed. The algorithms implement various functions that produce scientific opinions, and several scenarios illustrate how these opinions can be used.
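    The kind of on-board triage this abstract describes can be illustrated with a small sketch: score each observation for novelty and greedily fill a fixed downlink budget with the most novel items. The running-mean novelty measure, the function names, and the data layout below are illustrative assumptions, not the actual EASIR algorithms.

```python
import math

def prioritize(observations, bandwidth_budget):
    """Rank candidate observations by a simple novelty score (distance
    from the running mean of feature vectors already seen) and keep as
    many as fit the downlink budget. Hypothetical sketch of on-board
    science data triage. Each observation is (size, feature_vector)."""
    seen_mean = None
    scored = []
    for i, (size, features) in enumerate(observations):
        if seen_mean is None:
            score = float("inf")  # the first observation is maximally novel
            seen_mean = list(features)
        else:
            score = math.dist(features, seen_mean)
            # fold this observation into the running mean
            seen_mean = [(m * i + f) / (i + 1) for m, f in zip(seen_mean, features)]
        scored.append((score, size, i))
    # greedily downlink the most novel observations that fit the budget
    selected, used = [], 0
    for score, size, idx in sorted(scored, reverse=True):
        if used + size <= bandwidth_budget:
            selected.append(idx)
            used += size
    return sorted(selected)
```

    With a budget of 20 units and three 10-unit observations, the two most mutually dissimilar ones are kept and the near-duplicate is dropped.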

  1. NASA Tech Briefs, June 1998. Volume 22, No. 6

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics include: special coverage on computer hardware and peripherals; electronic components and circuits; electronic systems; software; materials; mechanics; machinery/automation; manufacturing; physical sciences; information sciences; books and reports; a special section of Photonics Tech Briefs; and a second special section of Motion Control Tech Briefs.

  2. Biomedical Science, Unit IV: The Nervous System in Health and Medicine. The Nervous System; Disorders of the Brain and Nervous System; Application of Computer Science to Diagnosis; Drugs and Pharmacology; The Human Senses; Electricity. Instructor's Manual. Revised Version, 1976.

    ERIC Educational Resources Information Center

    Biomedical Interdisciplinary Curriculum Project, Berkeley, CA.

    This volume contains the lesson plans and appropriate teacher background material for a 37-lesson sequence on the nervous system in health and medicine. Additional material is provided for supplementary lessons on concepts of electricity. Associated material, contained in separate volumes, include a student text and a student laboratory manual.…

  3. NASA Tech Briefs, November 1997. Volume 21, No. 11

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics covered include: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Books and Reports.

  4. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential to significantly enhance scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base needed to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, the third of a three-volume set containing the results of the TTPP, presents summaries of the experiments. One experiment evaluated the current Internet for file and image transfer between SIRTF instrument teams; the main issue addressed was network response times.

  5. NASA Tech Briefs, July 1999. Volume 23, No. 7

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Topics: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Bio-Medical; Books and Reports; Semiconductors/ICs.

  6. Computer Sciences and Data Systems, volume 2

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: data storage; information network architecture; VHSIC technology; fiber optics; laser applications; distributed processing; spaceborne optical disk controller; massively parallel processors; and advanced digital SAR processors.

  7. NASA Tech Briefs, September 2000. Volume 24, No. 9

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Sensors; Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Bio-Medical; semiconductors/ICs; Books and Reports.

  8. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the common efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a formerly independent university which has recently become part of the National Research University Higher School of Economics. The Conference on Computer Simulations in Physics and beyond (CSP) is planned to be organized biannually. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel D-Wave and D-Wave2 quantum computers. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications, and results. For all the abstracts presented at the conference please follow the link http://csp2015.ac.ru/files/book5x.pdf

  9. The Synthetic Aperture Radar Science Data Processing Foundry Concept for Earth Science

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Hua, H.; Norton, C. D.; Little, M. M.

    2015-12-01

    Since 2008, NASA's Earth Science Technology Office and the Advanced Information Systems Technology Program have invested in two technology evolutions to meet the needs of the community of scientists exploiting the rapidly growing database of international synthetic aperture radar (SAR) data. JPL, working with the science community, has developed the InSAR Scientific Computing Environment (ISCE), a next-generation interferometric SAR processing system that is designed to be flexible and extensible. ISCE currently supports many international spaceborne data sets but has been primarily focused on geodetic science and applications. A second evolutionary path, the Advanced Rapid Imaging and Analysis (ARIA) science data system, uses ISCE as its core science data processing engine and produces automated science and response products, quality assessments, and metadata. The success of this two-front effort has been demonstrated in NASA's ability to respond to recent events with useful disaster support. JPL has enabled high-volume, low-latency data production by reusing the hybrid cloud computing science data system (HySDS) that runs ARIA, leveraging on-premise cloud computing assets that can burst onto Amazon Web Services (AWS) as needed. Beyond geodetic applications, needs have emerged to process large volumes of time-series SAR data collected for the estimation of biomass and its change, in such campaigns as the upcoming AfriSAR field campaign. ESTO is funding JPL to extend the ISCE-ARIA model to a "SAR Science Data Processing Foundry" to on-ramp new data sources and produce new science data products that meet the needs of science teams and, more generally, members of the science community. An extension of the ISCE-ARIA model to support on-demand processing will permit PIs to leverage this Foundry to produce data products from accepted data sources when they need them.
This paper describes each of the elements of the SAR SDP Foundry and their integration into a new conceptual approach that enables more effective use of SAR instruments.

  10. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  11. Optical Computing. Organization of the 1993 Photonics Science Topical Meetings Held in Palm Springs, California on March 16 - 19, 1993. Technical Digest Series, Volume 7

    DTIC Science & Technology

    1993-03-19

    network implementation using asymmetric Fabry-Perot modulators, Andrew Jennings, Brian ... OWA3 Multiwavelength optical half adder, Pochi Yeh ... multiwavelength optical half adder (p. 68) ... (p. 96) ... OWA4 Wavelength multiplexed computer-generated volume ... OWC3 Content addressable ... ATMOS and OSCAR are RACE projects, mentioned in the text ... shape this into new systems architectures ("optical ether"). Broadly speaking, this has led to

  12. NASA Tech Briefs, April 1998. Volume 22, No. 4

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics include: special coverage on video and imaging, electronic components and circuits, electronic systems, physical sciences, materials, computer software, mechanics, machinery/automation, and a special section of Photonics Tech Briefs.

  13. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, whose data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full physics data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled high-tolerance computing in order to achieve large-scale processing as well as operational cost savings.
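    Fault tolerance on the spot market, as described in this abstract, usually amounts to making each work unit idempotent and simply re-queuing it when its instance is reclaimed. The sketch below illustrates that pattern; `SpotInterruption`, `run_level2_job`, and the retry policy are hypothetical stand-ins, not the actual HySDS API.

```python
import random

class SpotInterruption(Exception):
    """Raised when the spot instance running a job is reclaimed."""

def run_level2_job(granule_id, fail_rate=0.0):
    # Stand-in for the real Level-2 full-physics processor; fail_rate
    # simulates the probability of losing the instance mid-job.
    if random.random() < fail_rate:
        raise SpotInterruption(granule_id)
    return f"L2-product-{granule_id}"

def process_granule(granule_id, max_retries=5, fail_rate=0.0):
    """Re-run the job whenever the spot market reclaims its instance.
    Safe because each job is idempotent: reprocessing a granule yields
    the same product, so an interrupted attempt can just be discarded."""
    for _ in range(max_retries):
        try:
            return run_level2_job(granule_id, fail_rate)
        except SpotInterruption:
            continue  # lose the instance, keep the work unit
    raise RuntimeError(f"granule {granule_id} exceeded {max_retries} attempts")
```

    The design choice worth noting is that the retry loop lives outside the job: the processor itself never needs to know it is running on preemptible hardware.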

  14. Innovations in Science Teaching. The Forum for Liberal Education, Volume II, Number 4, February, 1980.

    ERIC Educational Resources Information Center

    Mohrman, Kathryn, Ed.

    Curricular development in undergraduate programs in the biological, physical, and mathematical sciences at a number of colleges and universities are described. One common theme is the continuing interest in computers in higher education. As the student bodies of many campuses become more heterogeneous with increasing enrollments of minorities and…

  15. Technology 2001: The Second National Technology Transfer Conference and Exposition, volume 1

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Papers from the technical sessions of the Technology 2001 Conference and Exposition are presented. The technical sessions featured discussions of advanced manufacturing, artificial intelligence, biotechnology, computer graphics and simulation, communications, data and information management, electronics, electro-optics, environmental technology, life sciences, materials science, medical advances, robotics, software engineering, and test and measurement.

  16. Earth Science Informatics Comes of Age

    NASA Technical Reports Server (NTRS)

    Jodha, Siri; Khalsa, S.; Ramachandran, Rahul

    2014-01-01

    The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decision-making and other applications for societal benefits.

  17. Proceedings from an International Conference on Computers and Philosophy, i-C&P 2006 held 3-5 May 2006 in Laval, France

    DTIC Science & Technology

    2008-10-20

    embedded intelligence and cultural adaptations to the onslaught of robots in society. This volume constitutes a key contribution to the body of ... Robotics, CNRS/Toulouse University, France; Nathalie COLINEAU, Language & Multi-modality, CSIRO, Australia; Roberto CORDESCHI, Computation & Communication ... Intelligence, SONY CSL, Paris; Nik KASABOV, Computer and Information Sciences, Auckland University, New Zealand; Oussama KHATIB, Robotics & Artificial

  18. List of Publications of the U.S. Army Engineer Waterways Experiment Station. Volume 2

    DTIC Science & Technology

    1993-09-01

    Station List of Publications of the U.S. Army Engineer Waterways Experiment Station, Volume II, compiled by the Research Library, Information Management Division ... Waterways Experiment Station for Other Agencies; Air Base Survivability, Systems Management Office Headquarters; Airport ... manages, conducts, and coordinates research and development in the Information Management (IM) technology areas that include computer science

  19. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential to significantly enhance scientific research. A Telescience Testbed Pilot Program (TTPP) was undertaken, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents the integrated results. Background on the program is provided, along with highlights of its results. The various testbed experiments and the programmatic approach are summarized. The results are first summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline, and are then integrated across disciplines, summarizing the lessons learned overall.

  20. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Hua, H.

    2016-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments; at large cloud scales, scaling and cost issues must be addressed. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth science data products. We explore approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were used to process data on Amazon's spot market, which can potentially offer 75%-90% cost savings, but within an unpredictable computing environment driven by market forces.
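    The 75%-90% spot-market savings quoted above can be sanity-checked with a back-of-the-envelope model. This is a hedged sketch with hypothetical prices, interruption rates, and rework fractions, not the abstract's actual cost model:

```python
def batch_costs(hours, on_demand_rate, spot_discount=0.80,
                interrupt_prob=0.05, redo_frac=0.5):
    """Expected cost of a batch workload on on-demand vs. spot capacity.

    interrupt_prob: assumed chance per instance-hour of losing a spot node;
    redo_frac: assumed fraction of an hour of work lost per interruption.
    All rates here are hypothetical illustrations, not real AWS prices.
    """
    spot_rate = on_demand_rate * (1 - spot_discount)
    redo_hours = hours * interrupt_prob * redo_frac  # expected rework
    return {
        "on_demand": hours * on_demand_rate,
        "spot": (hours + redo_hours) * spot_rate,
    }

# 1000 instance-hours at a nominal $1/hr on-demand rate.
costs = batch_costs(hours=1000, on_demand_rate=1.0)
savings = 1 - costs["spot"] / costs["on_demand"]
print(f"spot saves {savings:.1%} despite rework")
```

    Under these assumed numbers the savings come out to roughly 80%: the expected rework from interruptions barely dents the discount, which is why a 75%-90% range is plausible even on an unreliable market.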

  1. Illustrative Computer Programming for Libraries; Selected Examples for Information Specialists. Contributions in Librarianship and Information Science, No. 12.

    ERIC Educational Resources Information Center

    Davis, Charles H.

    Intended for teaching applications programming for libraries and information centers, this volume is a graded workbook or text supplement containing typical practice problems, suggested solutions, and brief analyses which emphasize programming efficiency. The computer language used is Programming Language/One (PL/I) because it adapts readily to…

  2. The application of cloud computing to scientific workflows: a study of cost and performance.

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  3. Middle School Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1984

    1984-01-01

    Presents (1) suggestions on teaching volume and density in the elementary school; (2) ideas for teaching about floating and sinking; (3) a simple computer program on color addition; and (4) an illustration of Newton's second law of motion. (JN)

  4. NASA Tech Briefs, August 2001. Volume 25, No. 8

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Topics include: a special coverage section on computers and peripherals, and sections on electronic components and systems, software, materials, mechanics, manufacturing/fabrication, physical sciences, books and reports, and a special section of Motion Control Tech Briefs.

  5. NASA Tech Briefs, March 2002. Volume 26, No. 3

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Topics include: a special section on data acquisition, software, electronic components and systems, materials, computer programs, mechanics, machinery/automation, manufacturing, biomedical, physical sciences, books and reports, and a special section of Photonics Tech Briefs.

  6. Integrating Intelligent Systems Domain Knowledge Into the Earth Science Curricula

    NASA Astrophysics Data System (ADS)

    Güereque, M.; Pennington, D. D.; Pierce, S. A.

    2017-12-01

    High-volume heterogeneous datasets have become ubiquitous over the last ten years, transcending the boundaries of computationally intensive disciplines to become a fundamental part of every science discipline. Although large datasets are now pervasive across industries and academic disciplines, the corresponding skills are generally absent from earth science programs. This has left the bulk of the student population without access to curricula that systematically teach appropriate intelligent-systems skills, creating a void for skill sets that should be universal given their need and marketability. While some guidance regarding appropriate computational thinking and pedagogy is appearing, there are few examples that have been specifically designed and tested within the earth science domain. Furthermore, best practices from learning science have not yet been widely tested for developing intelligent systems-thinking skills. This research developed and tested evidence-based computational skill modules that target this deficit, with the intention of informing the earth science community as it continues to incorporate intelligent systems techniques and reasoning into its research and classrooms.

  7. Manifesto of computational social science

    NASA Astrophysics Data System (ADS)

    Conte, R.; Gilbert, N.; Bonelli, G.; Cioffi-Revilla, C.; Deffuant, G.; Kertesz, J.; Loreto, V.; Moat, S.; Nadal, J.-P.; Sanchez, A.; Nowak, A.; Flache, A.; San Miguel, M.; Helbing, D.

    2012-11-01

    The increasing integration of technology into our lives has created unprecedented volumes of data on society's everyday behaviour. Such data opens up exciting new opportunities to work towards a quantitative understanding of our complex social systems, within the realms of a new discipline known as Computational Social Science. Against a background of financial crises, riots and international epidemics, the urgent need for a greater comprehension of the complexity of our interconnected global society, and for an ability to apply such insights in policy decisions, is clear. This manifesto outlines the objectives of this new scientific direction, considering the challenges involved and the extensive impact on science, technology and society that the success of this endeavour is likely to bring about.

  8. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia, from September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle-tracking codes. Thirty of the papers are abstracted for the Energy Science and Technology database. (AIP)

  9. NASA Tech Briefs, May 1998. Volume 22, No. 5

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics include: special coverage on advanced composites, plastics and metals, electronic components and circuits, electronic systems, physical sciences, computer software, mechanics, machinery/automation, manufacturing/fabrication, books and reports, and a special section of Electronics Tech Briefs.

  10. NASA Tech Briefs, February 2002. Volume 26, No. 2

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Topics include: a technology focus on computers, electronic components and systems, software, materials, mechanics, physical sciences, machinery, manufacturing/fabrication, mathematics, books and reports, Motion Control Tech Briefs, and a special section on Photonics Tech Briefs.

  11. Technology 2004, Vol. 2

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Proceedings from symposia of the Technology 2004 Conference, November 8-10, 1994, Washington, DC. Volume 2 features papers on computers and software, virtual reality simulation, environmental technology, video and imaging, medical technology and life sciences, robotics and artificial intelligence, and electronics.

  12. Proceedings of the Conference of the International Group for the Psychology of Mathematics Education (21st, Lahti, Finland, July 14-19, 1997). Volume 2.

    ERIC Educational Resources Information Center

    Pehkonen, Erkki, Ed.

    The second volume of the proceedings of the 21st annual meeting of the International Group for the Psychology of Mathematics Education contains the following papers: (1) "The Dilemma of Transparency: Seeing and Seeing through Talk in the Mathematics Classroom" (J. Adler); (2) "Abstraction is Hard in Computer-Science Too" (D.…

  13. Information Sciences Assessment for Asia and Australasia

    DTIC Science & Technology

    2009-10-16

    entertainment and home services - Machine Translation for international cooperation - NLU + Affective Computing for education - Intelligent Optimization for...into an emotion. ETTS, embedded Mandarin, music retrieval. Also, research in areas of computer graphics, digital media processing  Intelligent...many from outside China, 40% in phase 2 Sales volume in 2007 130 * 100 million RMB SAP (1st), CITI, AIG, EDS, Capgemini, ILOG, Infosys, HCL, Sony

  14. Interacting with Petabytes of Earth Science Data using Jupyter Notebooks, IPython Widgets and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.

    2017-12-01

    The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods are needed for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way. Here we describe a system for developing, interacting with, and sharing well-documented Earth science algorithms that combines existing software components. Jupyter Notebook: an open-source, web-based environment that supports documents combining code and computational results with text narrative, mathematics, images, and other media; these notebooks provide an environment for interactive exploration of data and development of well-documented algorithms. Jupyter Widgets / ipyleaflet: an architecture for creating interactive user-interface controls (such as sliders, text boxes, and drop-downs) in Jupyter Notebooks that communicate with Python code; the architecture includes a default set of UI controls as well as APIs for building custom controls, and the ipyleaflet project is one example, offering a custom interactive map control for displaying and manipulating geographic data within the Jupyter Notebook. Google Earth Engine: a cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API. The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing. Using these environments can make Earth science analyses easier to understand and reproduce, which may increase the rate of scientific discoveries and the transition of discoveries into real-world impacts.

  15. Prediction of resource volumes at untested locations using simple local prediction models

    USGS Publications Warehouse

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

    This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites and at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
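    The combination of jackknife replicates with bootstrap resampling described above can be sketched in a few lines. This is a simplified illustration on synthetic cell volumes, not the paper's actual model (which first fits local spatial predictors); the data and replicate counts are assumptions:

```python
import random
import statistics

random.seed(42)

# Synthetic per-cell recoverable-volume predictions (arbitrary units);
# in the paper these would come from a local spatial prediction model.
cells = [random.lognormvariate(0.0, 0.5) for _ in range(50)]
n = len(cells)
total = sum(cells)

# Jackknife: leave-one-out totals, rescaled back up to n cells.
jack = [(total - c) * n / (n - 1) for c in cells]

# Bootstrap the jackknife replicates to get a percentile confidence
# interval for the regional total.
boot_totals = sorted(
    statistics.fmean(random.choices(jack, k=n)) for _ in range(2000)
)
lo, hi = boot_totals[50], boot_totals[1950]  # 2.5% and 97.5% quantiles
print(f"total={total:.1f}, 95% CI=({lo:.1f}, {hi:.1f})")
```

    The mean of the jackknife replicates equals the point estimate by construction, so the percentile interval brackets the estimated total; the spread of the interval is what the paper uses to bound regional volumes.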

  16. European Science Notes, Volume 41, Number 1.

    DTIC Science & Technology

    1987-01-01

    extract which also stains dorsal root ganglion (DRG) cells and is selective for neural...antibody, HNK-1...exhibited a trophic effect that could be replaced by...effect on central as well as peripheral neurons...to migrate just after the neural tube closes and that these cells migrate...Neuronal Development...viscous effects which are excluded from the computation...used pseudounsteady, cell-centered finite volume methods. Quite different

  17. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, automated computation in quantum field theory, and computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration stimulated further thought on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  18. Software Assurance Curriculum Project Volume 4: Community College Education

    DTIC Science & Technology

    2011-09-01

    no previous programming or computer science experience expected) • Precalculus-ready (that is, proficiency sufficient to enter college-level...precalculus course) • English Composition I-ready (that is, proficiency sufficient to enter college-level English I course) Co-Requisite Discrete

  19. European Science Notes. Volume 41, Number 10,

    DTIC Science & Technology

    1987-10-01

    the following topics: laminar/turbulent transition in boundary layers; coherent structures in the modeling of turbulent boundary layers, wakes, and jets...the labeling of a model protein, human immunoglobulin (hIgG), with acridinium ester...indicator. The amount of oxygen produced can easily be...cations, and Computer Science. Research in the controls area is conducted in the...has concerned model reduction of large-scale systems and state and

  20. PREFACE: International Conference on Applied Sciences (ICAS2014)

    NASA Astrophysics Data System (ADS)

    Lemle, Ludovic Dan; Jiang, Yiwen

    2015-06-01

    The International Conference on Applied Sciences (ICAS2014) took place in Hunedoara, Romania, from 2-4 October 2014 at the Engineering Faculty of Hunedoara. The conference takes place alternately in Romania and in P.R. China and is organized by "Politehnica" University of Timisoara, Romania, and the Military Economics Academy of Wuhan, P.R. China, with the aim of serving as a platform for the exchange of information between various areas of applied sciences and of promoting communication between scientists of different nations, countries and continents. The topics of the conference covered a comprehensive spectrum of issues: (1) Economical Sciences; (2) Engineering Sciences; (3) Fundamental Sciences; and (4) Medical Sciences. The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge with potential applications in economics, defense, medicine, etc. There were nearly 100 registered participants from six countries, and four invited and 56 oral talks were delivered during the two days of the conference. Based on the work presented at the conference, selected papers are included in this volume of IOP Conference Series: Materials Science and Engineering. These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computer Engineering, and Mathematical Engineering. It is our great pleasure to present this volume to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in their respective fields.

  1. Solar System Number-Crunching.

    ERIC Educational Resources Information Center

    Albrecht, Bob; Firedrake, George

    1997-01-01

    Defines terrestrial and Jovian planets and provides directions to obtain planetary data from the National Space Science Data Center Web sites. Provides "number-crunching" activities for the terrestrial planets using Texas Instruments TI-83 graphing calculators: computing volumetric mean radius and volume, density, ellipticity, speed,…
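    The "number-crunching" the record describes translates directly from TI-83 keystrokes to a few lines of Python. A sketch using commonly quoted Earth values (the figures below are assumptions for illustration, not taken from the record itself):

```python
import math

# Commonly quoted Earth values (assumed here for illustration).
mass_kg = 5.972e24
eq_radius_km = 6378.137
polar_radius_km = 6356.752

# Volumetric mean radius: radius of the sphere with the same volume as
# the ellipsoid, (a * a * c) ** (1/3) for equatorial a and polar c.
r_mean_km = (eq_radius_km ** 2 * polar_radius_km) ** (1 / 3)

volume_m3 = 4 / 3 * math.pi * (r_mean_km * 1e3) ** 3
density = mass_kg / volume_m3                      # kg per cubic meter
ellipticity = (eq_radius_km - polar_radius_km) / eq_radius_km

print(f"mean radius {r_mean_km:.0f} km, density {density:.0f} kg/m^3, "
      f"ellipticity {ellipticity:.5f}")
```

    With these inputs the mean radius comes out near 6371 km and the bulk density near 5500 kg/m^3, the sort of values students would verify against the fact sheets.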

  2. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions, so a significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New considerations such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing, and what is its effect on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of the mission.
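    The "colder data" trade-off mentioned above reduces to a per-tier cost comparison. The prices below are placeholders for illustration, not actual cloud or facility rates:

```python
# Hypothetical USD per TB-month rates; real pricing varies by provider.
CLOUD_HOT = 23.0       # standard object storage, fast access
CLOUD_ARCHIVE = 1.0    # archive tier, slow and priced retrieval
ON_PREM = 5.0          # amortized on-premise disk plus operations

def monthly_storage_cost(tb_hot, tb_cold, cold_rate):
    """Monthly cost with hot data in the cloud and cold data on a chosen tier."""
    return tb_hot * CLOUD_HOT + tb_cold * cold_rate

# 100 TB hot, 900 TB cold: archive tier vs. moving cold data on-premise.
in_cloud = monthly_storage_cost(100, 900, CLOUD_ARCHIVE)
on_prem = monthly_storage_cost(100, 900, ON_PREM)
print(in_cloud, on_prem)  # → 3200.0 6800.0
```

    Under these assumed rates keeping cold data on a cloud archive tier wins, but retrieval and egress fees (not modeled here) are exactly the factors that can flip the comparison, which is the architectural trade the abstract raises.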

  3. ONRASIA Scientific Information Bulletin. Volume 8, Number 3, July- September 1993

    DTIC Science & Technology

    1993-09-01

    the Ninth Symposium on Preconditioned Conjugate Gradient Methods, which he organized...Dr. Steven F. Ashby, Computing Sciences Department...Preconditioned Conjugate Gradient Methods, held at Keio University (Yokohama). During this meeting, I discussed iterative methods for linear systems with...and is currently a topic of considerable interest in the United States. In Japan, on the other hand, this technique does not appear to be too well

  4. A Computation Infrastructure for Knowledge-Based Development of Reliable Software Systems

    DTIC Science & Technology

    2006-11-10

    Grant number: F045-023-0029 * Principal Investigator: David Guaspari, ATC-NY * Duration: May 2007 (assuming a successful review in 2005) * Source of...David Guaspari, Verifying Chain Replication in Event Logic, Cornell University Technical Report, to be published 2006 * Eli Barzilay, Implementing...and Reasoning, volume 2452 of Lecture Notes in Computer Science, pages 449-465, 2005. * Mark Bickford and David Guaspari, A Programming Logic for

  5. PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)

    NASA Astrophysics Data System (ADS)

    Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu

    2013-12-01

    The close relationship between probability-based inference and the statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and for the construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses (ICSG2013)' was held in Sapporo, 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25, in order to offer a forum where interested researchers could assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and the information sciences. Financial support from the Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan, 'Exploring the Limits of Computation (ELC)', is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima, Jun-ichi Inoue (Local Chair of ICSG2013), Yoshiyuki Kabashima (Editor-in-Chief), Toshiyuki Tanaka, Osamu Watanabe (General Chair of ICSG2013)

  6. Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.

    2014-12-01

    In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes that will increase the size of current data archives by a factor of 10-100. For example, the next Climate Model Intercomparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 petabytes, while the upcoming generation of NASA decadal Earth Observing instruments is expected to collect tens of gigabytes per day. In radio astronomy, the Square Kilometre Array (SKA) will collect data in the exabytes-per-day range, of which (after reduction and processing) around 1.5 exabytes per year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that allows system architects to model their expected data processing workflow and determine the network, computational, and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrarily complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for four use cases from distinct science disciplines: climate science, astronomy, hydrology, and a generic cloud computing use case. This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data-intensive science.
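    The kind of tradeoff DAWN evaluates can be illustrated with a toy elapsed-time estimator for two workflow orderings: move raw data then process, versus reduce at the source then move. The volumes, rates, and link speed below are assumed values, and the sketch ignores the uncertainty estimators the real tool includes:

```python
def transfer_hours(volume_tb, link_gbps):
    """Hours to move volume_tb over a link_gbps network (1 TB = 8000 Gb)."""
    return volume_tb * 8000 / link_gbps / 3600

def move_then_process(volume_tb, proc_tb_per_hr, link_gbps):
    # Ship the raw archive, then process it at the destination.
    return transfer_hours(volume_tb, link_gbps) + volume_tb / proc_tb_per_hr

def reduce_then_move(volume_tb, proc_tb_per_hr, link_gbps, reduction=10):
    # Process at the source, then ship only the reduced product.
    return (volume_tb / proc_tb_per_hr
            + transfer_hours(volume_tb / reduction, link_gbps))

# 100 TB archive, 5 TB/hr processing, 10 Gbps link, 10x data reduction.
a = move_then_process(100, 5, 10)
b = reduce_then_move(100, 5, 10)
print(f"move-first: {a:.1f} h, reduce-first: {b:.1f} h")
```

    With these assumed numbers, reducing before moving cuts the elapsed time roughly in half because the transfer term shrinks by the reduction factor; a workflow simulator generalizes this arithmetic to arbitrary task graphs and adds cost and uncertainty estimators.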

  7. On the Computer Generation of Adaptive Numerical Libraries

    DTIC Science & Technology

    2010-05-01

    D.; Borowski, P.; Clark, T.; Clerc, D.; Dachsel, H.; Deegan, M.; Dyall, K.; Elwood, D.; Glendening, E.; Gutowski, M.; Hess, A...Science, pages 72-83. Springer, 2007. Curry, Haskell B.; Feys, Robert; Craig, William. Combinatory Logic, volume 1. North-Holland Publishing

  8. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers to use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It has also been said that more research is needed to determine the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer-simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. This study also assessed the differential effect on students' understanding of the volume displacement concept using sex of the students as another independent variable. In addition, it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer-simulated experiences were as effective as hands-on laboratory experiences, and that males, having had hands-on laboratory experiences, performed better on the posttest than females having had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in the learning of the displacement concept. This study also showed that there were no significant differences in retention levels when the retention scores of the computer simulation groups were compared to those of the groups that had hands-on laboratory experiences. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.
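    The simulated experiment such a study compares against hands-on work amounts to reading a water level before and after immersing an object. A minimal sketch of that computation (the readings are made up for illustration):

```python
def object_volume_cm3(level_before_ml, level_after_ml):
    """Volume of a fully submerged object from the rise in water level.

    1 mL of water occupies 1 cm^3, so the displaced volume in mL equals
    the object's volume in cm^3 (the volume-displacement principle the
    study teaches).
    """
    return level_after_ml - level_before_ml

# Graduated-cylinder reading rises from 50.0 mL to 62.5 mL.
print(object_volume_cm3(50.0, 62.5))  # → 12.5
```

    A classroom simulation wraps this arithmetic in graphics and randomized objects, but the underlying model is just this subtraction.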

  9. Journal of Undergraduate Research, Volume IX, 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stiner, K. S.; Graham, S.; Khan, M.

    Each year more than 600 undergraduate students are awarded paid internships at the Department of Energy's (DOE) National Laboratories. These interns are paired with research scientists who serve as mentors on authentic research projects. All participants write a research abstract and present at a poster session and/or complete a full-length research paper. Abstracts and selected papers from our 2007-2008 interns that represent the breadth and depth of undergraduate research performed each year at our National Laboratories are published here in the Journal of Undergraduate Research. The fields in which these students worked included: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Science; Materials Science; Medical and Health Sciences; Nuclear Science; Physics; Science Policy; and Waste Management.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dang, Liem X.; Vo, Quynh N.; Nilsson, Mikael

    We report one of the first simulations using a classical rate theory approach to predict the mechanism of the exchange process between water and aqueous uranyl ions. Using our water and ion-water polarizable force fields and molecular dynamics techniques, we computed the potentials of mean force for the uranyl ion-water pair as a function of pressure at ambient temperature. Subsequently, these simulated potentials of mean force were used to calculate rate constants using transition state theory; the time-dependent transmission coefficients were also examined using the reactive flux method and Grote-Hynes treatments of the dynamic response of the solvent. The computed activation volumes, using transition state theory and the corrected rate constants, are positive; thus the mechanism of this particular water exchange is dissociative. We discuss our rate theory results and compare them with previous studies in which non-polarizable force fields were used. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
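The pressure dependence described in this abstract can be sketched numerically: given PMF barrier heights at a few pressures, a transition-state-theory rate varies as exp(-ΔW/RT), and the activation volume follows from the slope of ln k versus pressure. The pressures and barrier heights below are illustrative stand-ins, not values from the cited simulations.

```python
import numpy as np

R = 8.314      # gas constant, J/(mol K)
T = 298.15     # ambient temperature, K

# Hypothetical PMF barrier heights at several pressures (illustrative only)
P_MPa = np.array([0.1, 100.0, 200.0, 300.0])   # pressure (MPa)
dW = np.array([35.0, 35.6, 36.2, 36.8])        # PMF barrier (kJ/mol)

# Transition-state-theory estimate: ln k = const - dW / RT
lnk = -dW * 1e3 / (R * T)

# Activation volume from the pressure dependence: dV = -RT * d(ln k)/dP
slope = np.polyfit(P_MPa * 1e6, lnk, 1)[0]     # d(ln k)/dP in 1/Pa
dV_cm3 = -R * T * slope * 1e6                  # m^3/mol -> cm^3/mol
print(f"activation volume ~ {dV_cm3:.2f} cm^3/mol (positive -> dissociative)")
```

A barrier that grows with pressure gives a positive activation volume, the signature of a dissociative exchange mechanism.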

  11. Specification/Verification of Temporal Properties for Distributed Systems: Issues and Approaches. Volume 1

    DTIC Science & Technology

    1990-02-01

    copies P1, ..., Pn of a multiple module ... resolve nondeterminism (local or global) in an identical manner. 5. The copies P1, ..., Pn are physically ... recovery block. A recovery block consists of a conventional block (as in ALGOL or PL/I) which is provided with a means of error detection, called an ... improved failures model for communicating processes. In Proceedings, NSF-SERC Seminar on Concurrency, volume 197 of Lecture Notes in Computer Science

  12. A Visit to the Computer Science Department,

    DTIC Science & Technology

    1983-01-11

    very small, its capabilities are not. Let’s take the F8 micro-computer and compare it to the world’s first computer, "Eniac", for a minute. Eniac was ... than 30 tons, and completely filled a room of 170 square meters. The F8 micro-computer, on the other hand, has a volume 1/30,000th of Eniac’s, weighs ... less than half a kilo, has a power uptake of only 2.5 watts, but is 20 times as fast as Eniac and more than 10,000 times as reliable. "From this we can

  13. Australian Defence Science. Volume 16, Number 1, Autumn

    DTIC Science & Technology

    2008-01-01

    are carried via VOIP technology, and multicast IP traffic for audio-visual communications is also supported. The SSATIN system overall is seen to ... Artificial Intelligence and Soft Computing, Palma de Mallorca, Spain, http://iasted.com/conferences/home-628.html, 1-3 Sep 2008, Visualisation, Imaging and

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; and data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, and "big data", followed by an analysis example.

  15. Gigaflop (billion floating point operations per second) performance for computational electromagnetics

    NASA Technical Reports Server (NTRS)

    Shankar, V.; Rowell, C.; Hall, W. F.; Mohammadian, A. H.; Schuh, M.; Taylor, K.

    1992-01-01

    Accurate and rapid evaluation of radar signature for alternative aircraft/store configurations would be of substantial benefit in the evolution of integrated designs that meet radar cross-section (RCS) requirements across the threat spectrum. Finite-volume time domain methods offer the possibility of modeling the whole aircraft, including penetrable regions and stores, at longer wavelengths on today's gigaflop supercomputers and at typical airborne radar wavelengths on the teraflop computers of tomorrow. A structured-grid finite-volume time domain computational fluid dynamics (CFD)-based RCS code has been developed at the Rockwell Science Center, and this code incorporates modeling techniques for general radar absorbing materials and structures. Using this work as a base, the goal of the CFD-based CEM effort is to define, implement and evaluate various code development issues suitable for rapid prototype signature prediction.

  16. Visualization Techniques in Space and Atmospheric Sciences

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P. (Editor); Bredekamp, Joseph H. (Editor)

    1995-01-01

    Unprecedented volumes of data will be generated by research programs that investigate the Earth as a system and the origin of the universe, which will in turn require analysis and interpretation that will lead to meaningful scientific insight. Providing a widely distributed research community with the ability to access, manipulate, analyze, and visualize these complex, multidimensional data sets depends on a wide range of computer science and technology topics. Data storage and compression, data base management, computational methods and algorithms, artificial intelligence, telecommunications, and high-resolution display are just a few of the topics addressed. A unifying theme throughout the papers with regard to advanced data handling and visualization is the need for interactivity, speed, user-friendliness, and extensibility.

  17. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and transforming risk assessment

    EPA Science Inventory

    Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensor/monitor, and omics technology have enabled new streams of exposure and toxicity data to be generated at higher volumes and speed. These new data offer the opport...

  18. Celebrating 50 years of the laser (Scientific session of the general meeting of the Physical Sciences Division of the Russian Academy of Sciences, 13 December 2010)

    NASA Astrophysics Data System (ADS)

    2011-08-01

    A scientific session of the general meeting of the Physical Sciences Division of the Russian Academy of Sciences (RAS) dedicated to the 50th anniversary of the creation of lasers was held in the Conference Hall of the Lebedev Physical Institute, RAS, on 13 December 2010. The agenda of the session announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Matveev V A, Bagaev S N Opening speech; (2) Bratman V L, Litvak A G, Suvorov E V (Institute of Applied Physics, RAS, Nizhny Novgorod) "Mastering the terahertz domain: sources and applications"; (3) Balykin V I (Institute of Spectroscopy, RAS, Troitsk, Moscow region) "Ultracold atoms and atom optics"; (4) Ledentsov N N (Ioffe Physical Technical Institute, RAS, St. Petersburg) "New-generation surface-emitting lasers as the key element of the computer communication era"; (5) Krasil'nik Z F (Institute for the Physics of Microstructures, RAS, Nizhny Novgorod) "Lasers for silicon optoelectronics"; (6) Shalagin A M (Institute of Automation and Electrometry, Siberian Branch, RAS, Novosibirsk) "High-power diode-pumped alkali metal vapor lasers"; (7) Kul'chin Yu N (Institute for Automation and Control Processes, Far Eastern Branch, RAS, Vladivostok) "Photonics of self-organizing biomineral nanostructures"; (8) Kolachevsky N N (Lebedev Physical Institute, RAS, Moscow) "Laser cooling of rare-earth atoms and precision measurements". The papers written on the basis of reports 2-4, 7, and 8 are published below. Because the paper based on report 6 was received by the Editors late, it will be published in the October issue of Physics-Uspekhi together with the material related to the Scientific Session of the Physical Sciences Division, RAS, of 22 December 2010.
• Mastering the terahertz domain: sources and applications, V L Bratman, A G Litvak, E V Suvorov, Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 837-844
• Ultracold atoms and atomic optics, V I Balykin, Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 844-852
• New-generation vertically emitting lasers as a key factor in the computer communication era, N N Ledentsov, J A Lott, Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 853-858
• The photonics of self-organizing biomineral nanostructures, Yu N Kulchin, Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 858-863
• Laser cooling of rare-earth atoms and precision measurements, N N Kolachevsky, Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 863-870

  19. Jade: using on-demand cloud analysis to give scientists back their flow

    NASA Astrophysics Data System (ADS)

    Robinson, N.; Tomlinson, J.; Hilson, A. J.; Arribas, A.; Powell, T.

    2017-12-01

    The UK's Met Office generates 400 TB of weather and climate data every day by running physical models on its Top-20 supercomputer. As data volumes explode, there is a danger that analysis workflows become dominated by watching progress bars rather than thinking about science. We have been researching how distributed computing can let analysts process these large volumes of high-velocity data in a way that is easy, effective and cheap. Our prototype analysis stack, Jade, tries to encapsulate this. Its functionality includes: an under-the-hood Dask engine which parallelises and distributes computations without the need to retrain analysts; hybrid compute clusters (AWS, Alibaba, and local compute) comprising many thousands of cores; clusters which autoscale up and down in response to calculation load using Kubernetes, balancing the cluster across providers based on the current price of compute; and lazy data access from cloud storage via containerised OpenDAP. This technology stack allows us to perform calculations many orders of magnitude faster than is possible on local workstations. It can also outperform dedicated local compute clusters, since cloud compute can, in principle, scale much further. The use of ephemeral compute resources also makes this implementation cost-efficient.
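The pattern a Dask engine automates can be illustrated with a stdlib-only sketch: split a dataset into chunks, reduce each chunk in parallel, then combine the partial results. Dask builds and schedules exactly this kind of task graph at much larger scale; the dataset and chunking below are hypothetical stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # per-chunk reduction: one node in the task graph
    return (sum(chunk), len(chunk))

# stand-in for a large dataset read lazily in chunks from cloud storage
chunks = [range(i, i + 1000) for i in range(0, 10000, 1000)]

# a Dask-style scheduler would distribute these tasks across a cluster;
# here a local thread pool plays that role
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, chunks))

# combine partial reductions into a global mean
total, count = (sum(t) for t in zip(*partials))
mean = total / count
print(mean)  # 4999.5
```

The analyst writes only the per-chunk function and the combine step; scheduling, distribution, and scaling are the engine's job.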

  20. NASA Tech Briefs, April 1995. Volume 19, No. 4

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This issue of the NASA Tech Briefs has a special focus section on video and imaging, a feature on the NASA invention of the year, and a resource report on the Dryden Flight Research Center. The issue also contains articles on electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, and life sciences. In addition to the standard articles, this issue contains a supplement entitled "Laser Tech Briefs," which features an article on the National Ignition Facility and other articles on the use of lasers.

  1. Technology 2000, volume 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Technology 2000 was the first major industrial conference and exposition spotlighting NASA technology and technology transfer. Its purpose was, and continues to be, to increase awareness of existing NASA-developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. Included are sessions on: computer technology and software engineering; human factors engineering and life sciences; materials science; sensors and measurement technology; artificial intelligence; environmental technology; optics and communications; and superconductivity.

  2. Technology 2000, volume 1

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The purpose of the conference was to increase awareness of existing NASA developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. There were sessions on the following: Computer technology and software engineering; Human factors engineering and life sciences; Information and data management; Material sciences; Manufacturing and fabrication technology; Power, energy, and control systems; Robotics; Sensors and measurement technology; Artificial intelligence; Environmental technology; Optics and communications; and Superconductivity.

  3. The CCTC Quick-Reacting General War Gaming System (QUICK) Program Maintenance Manual. Volume I. Data Management Subsystem. Change 3.

    DTIC Science & Technology

    1980-05-22

    cross-referenced with the number of the data transaction listed in the data module quality control list NVB Integer variable used to ... Organization of the Joint Chiefs of Staff. Technical support was provided by System Sciences, Incorporated, under Contract Number DCA100-75-C-0019. Change set ... Contract Number DCA100-75-C-0019. Change set two was prepared under Contract Number DCA100-78-C-0035. Computer Sciences Corporation prepared change

  4. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    NASA Astrophysics Data System (ADS)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science.
New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.

  5. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinar, Ali; Kolda, Tamara G.; Carlberg, Kevin Thomas

    Through long-term investments in computing, algorithms, facilities, and instrumentation, DOE is an established leader in massive-scale, high-fidelity simulations, as well as science-leading experimentation. In both cases, DOE is generating more data than it can analyze and the problem is intensifying quickly. The need for advanced algorithms that can automatically convert the abundance of data into a wealth of useful information by discovering hidden structures is well recognized. Such efforts, however, are hindered by the massive volume of the data and its high velocity. Here, the challenge is developing unsupervised learning methods to discover hidden structure in high-volume, high-velocity data.

  6. Composable Distributed Access Control and Integrity Policies for Query-Based Wireless Sensor Networks

    DTIC Science & Technology

    2008-03-01

    unaltered during transmission and verified with data authentication. Data Freshness describes the ordering and currency of data. Strong freshness is a total...Advances in Cryptology — Crypto ’97, volume 1294 of Lecture Notes in Computer Science, pages 180–197. Springer-Verlag, Berlin, 1997. GS04. Saurabh

  7. Calibration Experiments for a Computer Vision Oyster Volume Estimation System

    ERIC Educational Resources Information Center

    Chang, G. Andy; Kerns, G. Jay; Lee, D. J.; Stanek, Gary L.

    2009-01-01

    Calibration is a technique commonly used in science and engineering research that requires measurement tools to be calibrated to obtain more accurate measurements. It is an important technique in various industries. In many situations, calibration is an application of linear regression, and is a good topic to be included when explaining and…

  8. A Software Hub for High Assurance Model-Driven Development and Analysis

    DTIC Science & Technology

    2007-01-23

    verification of UML models in TLPVS. In Thomas Baar, Alfred Strohmeier, Ana Moreira, and Stephen J. Mellor, editors, UML 2004 - The Unified Modeling...volume 3785 of Lecture Notes in Computer Science, pages 52–65, Manchester, UK, Nov 2005. Springer. [GH04] Günter Graw and Peter Herrmann. Transformation

  9. European Scientific Notes. Volume 39, Number 1.

    DTIC Science & Technology

    1985-01-01

    Office of Naval Research Branch Office, London. This document is issued primarily for the information of U.S. Government ... German Researcher ... Thomas C. Rozzell. A West German researcher has developed a new interactive model for ... Compatibility ... Thomas C. Rozzell. New Computer Journals ... C.J. Holland. Science Newsbriefs

  10. Computer science, artificial intelligence, and cybernetics: Applied artificial intelligence in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubinger, B.

    1988-01-01

    This sourcebook provides information on the developments in artificial intelligence originating in Japan. Spanning such innovations as software productivity, natural language processing, CAD, and parallel inference machines, this volume lists leading organizations conducting research or implementing AI systems, describes AI applications being pursued, illustrates current results achieved, and highlights sources reporting progress.

  11. Reducing the Volume of NASA Earth-Science Data

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Braverman, Amy J.; Guillaume, Alexandre

    2010-01-01

    A computer program reduces data generated by NASA Earth-science missions into representative clusters characterized by centroids and membership information, thereby reducing the large volume of data to a level more amenable to analysis. The program effects an autonomous data-reduction/clustering process to produce a representative distribution and joint relationships of the data, without assuming a specific type of distribution and relationship and without resorting to domain-specific knowledge about the data. The program implements a combination of a data-reduction algorithm known as the entropy-constrained vector quantization (ECVQ) and an optimization algorithm known as the differential evolution (DE). The combination of algorithms generates the Pareto front of clustering solutions that presents the compromise between the quality of the reduced data and the degree of reduction. Similar prior data-reduction computer programs utilize only a clustering algorithm, the parameters of which are tuned manually by users. In the present program, autonomous optimization of the parameters by means of the DE supplants the manual tuning of the parameters. Thus, the program determines the best set of clustering solutions without human intervention.
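As a rough sketch of the entropy-constrained quantization at the heart of ECVQ, the toy code below assigns points to centroids by squared distance plus a λ-weighted codeword-length penalty, then sweeps λ to trace the distortion-rate trade-off that the Pareto front of clustering solutions captures. The described program searches these parameters automatically with differential evolution; here the data, centroids, and λ values are made-up illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy 1-D data with two natural clusters (stand-in for mission data)
x = np.concatenate([rng.normal(0, 0.3, 200), rng.normal(5, 0.3, 200)])

def ecvq(x, centroids, lam, iters=20):
    c = centroids.astype(float).copy()
    p = np.full(len(c), 1.0 / len(c))          # codeword probabilities
    for _ in range(iters):
        # cost = squared distortion + lam * codeword length (-log2 p)
        cost = (x[:, None] - c[None, :])**2 - lam * np.log2(p + 1e-12)[None, :]
        a = cost.argmin(axis=1)
        for j in range(len(c)):
            m = a == j
            if m.any():
                c[j] = x[m].mean()             # recenter used codewords
            p[j] = m.mean()
    dist = ((x - c[a])**2).mean()              # mean squared error
    q = p[p > 0]
    rate = -(q * np.log2(q)).sum()             # entropy, bits/sample
    return dist, rate

# sweep lambda to trace the distortion-rate trade-off
results = {}
for lam in (0.0, 0.5, 5.0):
    results[lam] = ecvq(x, np.array([-1.0, 0.1, 5.0, 9.0]), lam)
    d, r = results[lam]
    print(f"lambda={lam}: distortion={d:.3f}, rate={r:.2f} bits")
```

Larger λ prices rarely used codewords out of the codebook, lowering the rate at the cost of some distortion; the set of (distortion, rate) points across λ is the trade-off curve the Pareto front summarizes.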

  12. User's Guide for ERB 7 Matrix. Volume 1: Experiment Description and Quality Control Report for Year 1

    NASA Technical Reports Server (NTRS)

    Tighe, R. J.; Shen, M. Y. H.

    1984-01-01

    The Nimbus 7 ERB MATRIX tape is produced by a computer program in which radiances and irradiances are converted into fluxes that are used to compute the basic scientific output parameters: emitted flux, albedo, and net radiation. These are spatially averaged and presented as time averages over one-day, six-day, and monthly periods. MATRIX data for the period November 16, 1978 through October 31, 1979 are presented. Described are the Earth Radiation Budget experiment, the Science Quality Control Report, items checked by the MATRIX Science Quality Control Program, and the Science Quality Control Data Analysis Report. Additional material from the detailed scientific quality control of the tapes that may be very useful to a user of the MATRIX tapes is included. Known errors and data problems, and some suggestions on how to use the data for further climatological and atmospheric physics studies, are also discussed.

  13. Virtual Observatory and Distributed Data Mining

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2012-03-01

    New modes of discovery are enabled by the growth of data and computational resources (i.e., cyberinfrastructure) in the sciences. This cyberinfrastructure includes structured databases, virtual observatories (distributed data, as described in Section 20.2.1 of this chapter), high-performance computing (petascale machines), distributed computing (e.g., the Grid, the Cloud, and peer-to-peer networks), intelligent search and discovery tools, and innovative visualization environments. Data streams from experiments, sensors, and simulations are increasingly complex and growing in volume. This is true in most sciences, including astronomy, climate simulations, Earth observing systems, remote sensing data collections, and sensor networks. At the same time, we see an emerging confluence of new technologies and approaches to science, most clearly visible in the growing synergism of the four modes of scientific discovery: sensors-modeling-computing-data (Eastman et al. 2005). This has been driven by numerous developments, including the information explosion, development of large-array sensors, acceleration in high-performance computing (HPC) power, advances in algorithms, and efficient modeling techniques. Among these, the most extreme is the growth in new data. Specifically, the acquisition of data in all scientific disciplines is rapidly accelerating and causing a data glut (Bell et al. 2007). It has been estimated that data volumes double every year—for example, the NCSA (National Center for Supercomputing Applications) reported that their users cumulatively generated one petabyte of data over the first 19 years of NCSA operation, but they then generated their next one petabyte in the next year alone, and the data production has been growing by almost 100% each year after that (Butler 2008). The NCSA example is just one of many demonstrations of the exponential (annual data-doubling) growth in scientific data collections. 
In general, this putative data-doubling is an inevitable result of several compounding factors: the proliferation of data-generating devices, sensors, projects, and enterprises; the 18-month doubling of the digital capacity of these microprocessor-based sensors and devices (commonly referred to as "Moore’s law"); the move to digital for nearly all forms of information; the increase in human-generated data (both unstructured information on the web and structured data from experiments, models, and simulation); and the ever-expanding capability of higher density media to hold greater volumes of data (i.e., data production expands to fill the available storage space). These factors are consequently producing an exponential data growth rate, which will soon (if not already) become an insurmountable technical challenge even with the great advances in computation and algorithms. This technical challenge is compounded by the ever-increasing geographic dispersion of important data sources—the data collections are not stored uniformly at a single location, or with a single data model, or in uniform formats and modalities (e.g., images, databases, structured and unstructured files, and XML data sets)—the data are in fact large, distributed, heterogeneous, and complex. The greatest scientific research challenge with these massive distributed data collections is consequently extracting all of the rich information and knowledge content contained therein, thus requiring new approaches to scientific research. This emerging data-intensive and data-oriented approach to scientific research is sometimes called discovery informatics or X-informatics (where X can be any science, such as bio, geo, astro, chem, eco, or anything; Agresti 2003; Gray 2003; Borne 2010). This data-oriented approach to science is now recognized by some (e.g., Mahootian and Eastman 2009; Hey et al. 2009) as the fourth paradigm of research, following (historically) experiment/observation, modeling/analysis, and computational science.
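The growth rates quoted above are easy to put in concrete terms: at annual doubling, a petabyte-scale archive reaches exabyte scale in about a decade. A quick arithmetic check (the archive sizes are illustrative, not a measured projection):

```python
import math

# Annual data-doubling: how long until a 1 PB archive reaches 1 EB?
factor = 1000.0                        # 1 EB / 1 PB
years_annual = math.log2(factor)       # doublings needed = years at 1/yr
print(f"{years_annual:.1f} years at annual doubling")

# Compare with an 18-month, Moore's-law-style doubling
years_moore = math.log2(factor) * 1.5
print(f"{years_moore:.1f} years at 18-month doubling")
```

Roughly ten doublings cover three orders of magnitude, which is why annual data-doubling outpaces even Moore's-law hardware growth.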

  14. Hybrid Systems: Computation and Control.

    DTIC Science & Technology

    1999-02-17

    computer science; Vol. 1386) ISBN 3-540-64358-3 CR Subject Classification (1991): C.1.m, C.3, D.2.1, F.3.1, F.1.2, J.2 ISSN 0302-9743 ISBN 3-540-64358-3 Springer-Verlag Berlin Heidelberg New York. This work is subject to copyright. All rights are reserved, whether the whole or part of the material ... Printed on acid-free paper. Preface. This volume contains the proceedings of the First International Workshop on Hybrid Systems

  15. 2005 Science and Technology for Chem-Bio Information Systems (S and T CBIS) volume 3 Thursday

    DTIC Science & Technology

    2005-10-28

    radar, lidar, or sodar with computer on-board. Temperature and moisture MW radiometer with computer on-board. Portable meteorological sensors ... Wireless on the go is a way of life now – my cell phone, my PDA, my iPod (look, I’m “podcasting”!) and dock it when I’m at home – Same components ... Team. Other specifications will follow… Standardization of the interfaces across all CBRN sensors/devices! JPEO-CBD Joint Program Executive Office

  16. A Research Agenda and Vision for Data Science

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2014-12-01

    Big Data has emerged as a first-class citizen in the research community spanning disciplines in the domain sciences - Astronomy is pushing velocity with new ground-based instruments such as the Square Kilometre Array (SKA) and its unprecedented data rates (700 TB/sec!); Earth science is pushing the boundaries of volume, with increasing experiments in the international Intergovernmental Panel on Climate Change (IPCC) and the climate modeling and remote sensing communities growing total archives toward the exabyte scale; airborne missions from NASA such as the JPL Airborne Snow Observatory (ASO) are increasing velocity while decreasing the overall turnaround time required to receive products and make them available to water managers and decision makers. Proteomics and the computational biology community are sequencing genomes and providing near-real-time answers to clinicians, researchers, and ultimately to patients, helping to process and understand data and create diagnoses. Data complexity is on the rise, and the norm is no longer hundreds of metadata attributes but thousands to hundreds of thousands, including complex interrelationships between data, metadata, and knowledge. I published a vision for data science in Nature in 2013 that encapsulates four thrust areas and foci that I believe the computer science, Big Data, and data science communities need to attack over the next decade to make fundamental progress on the data volume, velocity, and complexity challenges arising from domain sciences such as those described above. These areas include: (1) rapid and unobtrusive algorithm integration; (2) intelligent and automatic data movement; (3) automated and rapid extraction of text, metadata, and language from heterogeneous file formats; and (4) participation and people power via open source communities.
In this talk I will revisit these four areas and describe current progress; future work and challenges ahead as we move forward in this exciting age of Data Science.

  17. Cumulative index to NASA Tech Briefs, 1986-1990, volumes 10-14

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Tech Briefs are short announcements of new technology derived from the R&D activities of the National Aeronautics and Space Administration. These briefs emphasize information considered likely to be transferable across industrial, regional, or disciplinary lines and are issued to encourage commercial application. This cumulative index of Tech Briefs contains abstracts and four indexes (subject, personal author, originating center, and Tech Brief number) and covers the period 1986 to 1990. The abstract section is organized by the following subject categories: electronic components and circuits, electronic systems, physical sciences, materials, computer programs, life sciences, mechanics, machinery, fabrication technology, and mathematics and information sciences.

  18. 3D medical volume reconstruction using web services.

    PubMed

    Kooper, Rob; Shirk, Andrew; Lee, Sang-Chul; Lin, Amy; Folberg, Robert; Bajcsy, Peter

    2008-04-01

    We address the problem of 3D medical volume reconstruction using web services. The use of the proposed web services is motivated by the fact that the problem of 3D medical volume reconstruction requires significant computer resources and human expertise in the medical and computer science areas. Web services are implemented as an additional layer to a dataflow framework called Data to Knowledge. In the collaboration between UIC and NCSA, pre-processed input images at NCSA are made accessible to medical collaborators for registration. Every time the UIC medical collaborators inspect images and select corresponding features for registration, the web service at NCSA is contacted and the registration processing query is executed using the Image to Knowledge library of registration methods. Co-registered frames are returned for verification by the medical collaborators in a new window. In this paper, we present the 3D volume reconstruction problem requirements and the architecture of the developed prototype system at http://isda.ncsa.uiuc.edu/MedVolume. We also explain the tradeoffs of our system design and provide experimental data to support our system implementation. The prototype system has been used for multiple 3D volume reconstructions of blood vessels and vasculogenic mimicry patterns in histological sections of uveal melanoma studied by fluorescent confocal laser scanning microscopy.

  19. Global Weather Prediction and High-End Computing at NASA

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; Atlas, Robert; Yeh, Kao-San

    2003-01-01

    We demonstrate the current capabilities of the NASA finite-volume General Circulation Model for high-resolution global weather prediction, and discuss its development path for the foreseeable future. This model can be regarded as a prototype of a future NASA Earth modeling system intended to unify development activities cutting across various disciplines within the NASA Earth Science Enterprise.

  20. Human-Computer Interaction: A Journal of Theoretical, Empirical and Methodological Issues of User Science and of System Design. Volume 7, Number 1

    DTIC Science & Technology

    1992-01-01

    Norman .................................... University of California, San Diego, CA Dan R . Olsen, Jr ........................................ Brigham...Peter G. Polson .............................................. University of Colorado, Boulder, CO James R . Rhyne ................. IBM T J Watson...and artificial intelligence, among which are: * reasoning about concurrent systems, including program verification ( Barringer , 1985), operating

  1. Control of Chaos: New Perspectives in Experimental and Theoretical Science. International Journal of Bifurcation and Chaos in Applied Sciences and Engineering. Theme Issue. Part 2, Volume 8, Number 9, September 1998.

    DTIC Science & Technology

    1998-09-01

    discharges in the Onchidium pacemaker neu- "Episodic multiregional cortical coherence at multiple ron," J. Theor. Biol. 156, 269-291. frequencies during...with delay: A model of synchronization of Sepulchre, J. A. & Babloyantz, A. [1993] "Controlling cortical tissue," Neural Comput. 6, 1141-1154...generating circuit of different 363, 411 417. networks," Nature 351, 60-63. Singer, W. [1993] "Synchronization of cortical activity Mpitsos, G. J., Burton, R

  2. Why the Petascale era will drive improvements in the management of the full lifecycle of earth science data.

    NASA Astrophysics Data System (ADS)

    Wyborn, L.

    2012-04-01

    The advent of the petascale era, in both storage and compute facilities, will offer new opportunities for earth scientists to transform the way they do their science and to undertake cross-disciplinary science at a global scale. No longer will data have to be averaged and subsampled: they can be analysed at their fullest resolution at national or even global scales. Much larger data volumes can be analysed in single passes and at higher resolution: large-scale cross-domain science is now feasible. In general, however, the earth sciences have been slow to capitalise on the potential of these new petascale compute facilities: many struggle even to use terascale facilities. Taking advantage of the new facilities will require a vast improvement in the management of the full life cycle of data: in reality, that management will need to be transformed. Many of our current issues with earth science data are historic and stem from the limitations of early data storage systems. Because storage was so expensive, metadata were usually stored separately from the data and attached as a readme file. Likewise, attributes that define uncertainty, reliability and traceability were recorded in lab notebooks and rarely stored with the data. Data were routinely transferred as files. The new opportunities expose the traditional discover, display, download and locally process paradigm as too limited. For data access and assimilation to improve, data will need to be self-describing. For heterogeneous data to be rapidly integrated, attributes such as reliability, uncertainty and traceability will need to be systematically recorded with each observation. The petascale era also requires that individual data files be transformed and aggregated into calibrated data arrays or data cubes. Standards become critical and are the enablers of integration. These changes are common to almost every science discipline.
What makes the earth sciences unique is that many domains record time-series data, particularly in the environmental geosciences (weathering, soil changes, climate change). The data life cycle will be measured in decades and centuries, not years. Preservation over such time spans is quite a challenge, as data will have to be managed over many evolutions of software and hardware. The focus has to be on managing the data, not the media. Currently storage is not an issue, but it is predicted that data volumes will soon exceed the effective storage media that can be physically manufactured. This means that organisations will have to think about disposal and destruction of data; for the earth sciences, this will be a particularly sensitive issue. Petascale computing offers many new opportunities to the earth sciences, and by 2020 exascale computers will be a reality. To fully realise these opportunities, the earth sciences need to actively and systematically rethink the ramifications these new systems will have for current practices in data storage, discovery, access and assimilation.
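    The call for self-describing data can be illustrated with a minimal sketch. The attribute names below are hypothetical (not drawn from any particular standard such as CF conventions): the point is only that each observation carries its uncertainty and provenance with it, rather than leaving them in a separate readme or lab notebook.

```python
import json

def make_observation(value, units, uncertainty, source, method):
    """Bundle a measurement with the attributes the text argues should
    travel with every observation: uncertainty, reliability, traceability."""
    return {
        "value": value,
        "units": units,
        "uncertainty": uncertainty,   # e.g. one standard deviation
        "provenance": {
            "source": source,         # instrument or lab that produced it
            "method": method,         # how the value was derived
        },
    }

# A soil-moisture reading that describes itself, ready for aggregation
# into a larger data array or cube without consulting external records.
obs = make_observation(0.23, "m3/m3", 0.02, "sensor_A17", "TDR probe")
print(json.dumps(obs, indent=2))
```

    Records of this shape can be aggregated mechanically because every field needed for integration travels with the value itself.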

  3. EOS Laser Atmosphere Wind Sounder (LAWS) investigation

    NASA Technical Reports Server (NTRS)

    Emmitt, George D.

    1991-01-01

    The related activities of the contract are outlined for the first year. These include: (1) attend team member meetings; (2) support the EOS Project with science-related activities; (3) prepare an Execution Phase plan; and (4) support LAWS and EOSDIS related work. Attached to the report is an appendix, 'LAWS Algorithm Development and Evaluation Laboratory (LADEL)'. Also attached is a copy of a proposal to the NASA EOS for 'LAWS Sampling Strategies and Wind Computation Algorithms -- Storm-Top Divergence Studies. Volume I: Investigation and Technical Plan, Data Plan, Computer Facilities Plan, Management Plan.'

  4. Big Computing in Astronomy: Perspectives and Challenges

    NASA Astrophysics Data System (ADS)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy, for instance, the current generation of antenna arrays produces data at terabits per second, and forthcoming instruments will push these rates much further. As instruments increasingly become software-based, astronomers will be more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome them. Major problems are emerging because data rates are growing much faster than storage and transmission capacity, and because humans are cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing.
Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing, to enhance the ALMA Observatory with Very Long Baseline Interferometry capabilities, the Event Horizon Telescope, and the Radio Array of Portable Interferometric Detectors (RAPID), which aims to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle.
He holds a distinguished doctorate and a Habilitation degree in Computer Science from the University of Karlsruhe. Contact him at pankrat@mit.edu, victorpankratius.com, or Twitter @vpankratius.

  5. 500 Contractors Receiving the Largest Dollar Volume of Prime Contract Awards for RDT&E Fiscal Year 1989

    DTIC Science & Technology

    1989-01-01

    COMPUTER SCIENCES CORPORATION 84 B AVCO RESEARCH LABORATORY INC 109 B CONTEL FEDERAL SYSTEMS INC 439 B B D SYSTEMS INC 427 B CONTRAVES GOERZ CORPORATION...2,121 Costa Mesa California 2,026 Santa Ana California 52 450 COMPRHENSIVE TECHNOLOGIES INTL S 2,109 * Chant illy Virginia 2,109 427 CONTRAVES GOERZ

  6. Discrete Mathematics in the Schools. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Volume 36.

    ERIC Educational Resources Information Center

    Rosenstein, Joseph G., Ed.; Franzblau, Deborah S., Ed.; Roberts, Fred S., Ed.

    This book is a collection of articles by experienced educators and explains why and how discrete mathematics should be taught in K-12 classrooms. It includes evidence for "why" and practical guidance for "how" and also discusses how discrete mathematics can be used as a vehicle for achieving the broader goals of the major…

  7. ONR (Office of Naval Research) Far East Scientific Bulletin. Volume 9, Number 1, January to March 1984.

    DTIC Science & Technology

    1984-03-01

    science are Professor Toyomi Ohta, who holds the chair in Electronic Computation, and Professor Yoshiaki Koga, who holds the chair in Communication...and N. Imanaka, G. Adachi, and J. Shiokawa . reported on sodium sulfate doped with NaVO3 and/or Ln2 (SO4 )3 (Ln = rare earths). In aninteresting

  8. Python in the NERSC Exascale Science Applications Program for Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronaghi, Zahra; Thomas, Rollin; Deslippe, Jack

    We describe a new effort at the National Energy Research Scientific Computing Center (NERSC) in performance analysis and optimization of scientific Python applications targeting the Intel Xeon Phi (Knights Landing, KNL) many-core architecture. The Python-centered work outlined here is part of a larger effort called the NERSC Exascale Science Applications Program (NESAP) for Data. NESAP for Data focuses on applications that process and analyze high-volume, high-velocity data sets from experimental/observational science (EOS) facilities supported by the US Department of Energy Office of Science. We present three case study applications from NESAP for Data that use Python. These codes vary in terms of “Python purity” from applications developed in pure Python to ones that use Python mainly as a convenience layer for scientists without expertise in lower-level programming languages like C, C++ or Fortran. The science case, requirements, constraints, algorithms, and initial performance optimizations for each code are discussed. Our goal with this paper is to contribute to the larger conversation around the role of Python in high-performance computing today and tomorrow, highlighting areas for future work and emerging best practices.
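    As a toy illustration of the kind of optimization such efforts pursue (not an example taken from the paper), even moving work from an interpreted Python loop into a single built-in call shows the pattern that vectorized and compiled kernels extend on many-core hardware: push the inner loop out of the interpreter.

```python
import timeit

def slow_total(values):
    # Interpreted loop: one bytecode dispatch per element.
    total = 0.0
    for v in values:
        total += v
    return total

def fast_total(values):
    # Single call into C; the loop runs outside the interpreter.
    return sum(values)

data = [0.5] * 100_000
assert slow_total(data) == fast_total(data)

t_slow = timeit.timeit(lambda: slow_total(data), number=20)
t_fast = timeit.timeit(lambda: fast_total(data), number=20)
print(f"loop: {t_slow:.4f}s  builtin: {t_fast:.4f}s")
```

    NumPy arrays, Cython, and threaded libraries apply the same principle at larger scale, which is why "Python purity" matters for performance work.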

  9. Econophysics and evolutionary economics (Scientific session of the Physical Sciences Division of the Russian Academy of Sciences, 2 November 2010)

    NASA Astrophysics Data System (ADS)

    2011-07-01

    The scientific session "Econophysics and evolutionary economics" of the Division of Physical Sciences of the Russian Academy of Sciences (RAS) took place on 2 November 2010 in the conference hall of the Lebedev Physical Institute, Russian Academy of Sciences. The session agenda announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Maevsky V I (Institute of Economics, RAS, Moscow) "The transition from simple reproduction to economic growth"; (2) Yudanov A Yu (Financial University of the Government of the Russian Federation, Moscow) "Experimental data on the development of fast-growing innovative companies in Russia"; (3) Pospelov I G (Dorodnitsyn Computation Center, RAS, Moscow) "Why is it sometimes possible to successfully model an economy?"; (4) Chernyavskii D S (Lebedev Physical Institute, RAS, Moscow) "Theoretical economics"; (5) Romanovskii M Yu (Prokhorov Institute of General Physics, RAS, Moscow) "Nonclassical random walks and the phenomenology of fluctuations of the yield of securities in the securities market"; (6) Dubovikov M M, Starchenko N V (INTRAST Management Company, Moscow Engineering Physics Institute, Moscow) "Fractal analysis of financial time series and the prediction problem". Papers written on the basis of these reports are published below.
• The transition from simple reproduction to economic growth, V I Maevsky, S Yu Malkov Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 729-733 • High-growth firms in Russia: experimental data and prospects for the econophysical simulation of economic modernization, A Yu Yudanov Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 733-737 • Equilibrium models of economics in the period of a global financial crisis, I G Pospelov Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 738-742 • On econophysics and its place in modern theoretical economics, D S Chernavskii, N I Starkov, S Yu Malkov, Yu V Kosse, A V Shcherbakov Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 742-749 • Nonclassical random walks and the phenomenology of fluctuations of securities returns in the stock market, P V Vidov, M Yu Romanovsky Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 749-753 • Econophysics and the fractal analysis of financial time series, M M Dubovikov, N V Starchenko Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 754-761

  10. Non-invasive imaging methods applied to neo- and paleo-ontological cephalopod research

    NASA Astrophysics Data System (ADS)

    Hoffmann, R.; Schultz, J. A.; Schellhorn, R.; Rybacki, E.; Keupp, H.; Gerden, S. R.; Lemanis, R.; Zachow, S.

    2014-05-01

    Several non-invasive methods are common practice in the natural sciences today. Here we present how they can be applied to, and contribute to, current topics in cephalopod (paleo-)biology. Different methods are compared in terms of the time necessary to acquire the data, the amount of data, accuracy/resolution, the minimum/maximum size of objects that can be studied, the degree of post-processing needed, and availability. The main application of the methods is in the morphometry and volumetry of cephalopod shells. In particular, we present a method for precise buoyancy calculation. To this end, cephalopod shells were scanned together with different reference bodies, an approach developed in the medical sciences. It is necessary to know the volume of the reference bodies, which should have absorption properties similar to those of the object of interest. Exact volumes can be obtained from surface scanning. Depending on the dimensions of the study object, different computed tomography techniques were applied.
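    The reference-body idea reduces to a simple proportionality, sketched here with made-up numbers (not the authors' code or data): when the reference body and the shell are segmented from the same CT scan, both are counted on the same voxel grid, so the shell volume follows from voxel counts and the independently measured reference volume.

```python
def volume_from_reference(object_voxels, reference_voxels, reference_volume_cm3):
    """Estimate an object's volume from a CT scan that also contains a
    reference body of known volume: volumes scale with voxel counts."""
    voxel_volume = reference_volume_cm3 / reference_voxels
    return object_voxels * voxel_volume

# Hypothetical numbers: a reference sphere of 10 cm^3 occupies 40,000
# voxels; the segmented shell occupies 260,000 voxels in the same scan.
shell_volume = volume_from_reference(260_000, 40_000, 10.0)
print(f"estimated shell volume: {shell_volume:.1f} cm^3")  # 65.0 cm^3
```

    Matching the reference body's absorption properties to the shell, as the abstract notes, keeps the segmentation thresholds comparable for both objects.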

  11. Toward an in-situ analytics and diagnostics framework for earth system models

    NASA Astrophysics Data System (ADS)

    Anantharaj, Valentine; Wolf, Matthew; Rasch, Philip; Klasky, Scott; Williams, Dean; Jacob, Rob; Ma, Po-Lun; Kuo, Kwo-Sen

    2017-04-01

    The development roadmaps for many earth system models (ESMs) aim for a globally cloud-resolving model targeting the pre-exascale and exascale systems of the future. The ESMs will also incorporate more complex physics, chemistry and biology, thereby vastly increasing the fidelity of the information content simulated by the model. We will then be faced with an unprecedented volume of simulation output that would need to be processed and analyzed concurrently in order to derive valuable scientific results. We are already at this threshold with higher-resolution simulations in our current generation of ESMs. Currently, the nominal I/O throughput in the Community Earth System Model (CESM) via the Parallel IO (PIO) library is around 100 MB/s. High-frequency I/O would require an additional 1 GB per simulated hour, translating to roughly 4 minutes of wallclock per simulated day, about 24.33 wallclock hours per simulated model year, and 1,752,000 core-hours of charge per simulated model year on the Titan supercomputer at the Oak Ridge Leadership Computing Facility. There is also a pending need for 3X more simulation output volume. Meanwhile, many ESMs use instrument simulators to run forward models that compare model simulations against satellite and ground-based instruments, such as radars and radiometers. The CFMIP Observation Simulator Package (COSP) is used in CESM as well as in the Accelerated Climate Model for Energy (ACME), one of the ESMs specifically targeting current and emerging leadership-class computing platforms. These simulators can be computationally expensive, accounting for as much as 30% of the computational cost, so the data are often written to output files that are then used for offline calculations. Again, the I/O bottleneck becomes a limitation.
Detection and attribution studies also use large volumes of data for pattern recognition and feature extraction to analyze weather and climate phenomena such as tropical cyclones, atmospheric rivers, blizzards, etc. It is evident that ESMs need an in-situ framework to decouple the diagnostics and analytics from the prognostics and physics computations of the models, so that the diagnostic computations can be performed concurrently without limiting model throughput. We are designing a science-driven online analytics framework for earth system models. Our approach is to adopt several data workflow technologies, such as the Adaptable IO System (ADIOS) being developed under the U.S. Exascale Computing Project (ECP), and integrate them to allow extreme-performance I/O, in-situ workflow integration, and science-driven analytics and visualization, all in an easy-to-use computational framework. This will allow science teams to write data 100-1000 times faster and to move seamlessly from post-processing the output for validation and verification purposes to performing these calculations in situ. We can readily envision a near-term future where earth system models like ACME and CESM will have to address not only the volume of data but also the velocity of the data. Earth system models of the exascale era, as they incorporate more complex physics at higher resolutions, will be able to analyze more simulation content without having to compromise targeted model throughput.
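    The wallclock arithmetic in the abstract can be checked directly: at the quoted 100 MB/s PIO throughput (taking 1 GB as 1000 MB), an extra 1 GB per simulated hour costs 10 seconds, hence 4 minutes per simulated day and about 24.33 hours per simulated year. The core count in the last step is our inference from the quoted charge, not a figure stated in the text.

```python
# I/O cost of high-frequency output at the quoted PIO throughput.
throughput_mb_s = 100          # nominal CESM PIO rate from the text
output_gb_per_sim_hour = 1     # extra high-frequency output

seconds_per_sim_hour = output_gb_per_sim_hour * 1000 / throughput_mb_s  # 10 s
minutes_per_sim_day = seconds_per_sim_hour * 24 / 60                    # 4 min
hours_per_sim_year = minutes_per_sim_day * 365 / 60                     # ~24.33 h

print(f"{minutes_per_sim_day:.0f} min/simulated day, "
      f"{hours_per_sim_year:.2f} h/simulated year")

# The quoted 1,752,000 core-hours per simulated year then implies a job
# spanning roughly this many cores (our inference, not from the text):
cores = 1_752_000 / hours_per_sim_year
print(f"~{cores:,.0f} cores")
```

    The numbers reproduce the abstract's figures exactly, which is why decoupling diagnostics from the model's I/O path matters for throughput.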

  12. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five-year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.
The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  13. High Energy Physics Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and High Energy Physics, June 10-12, 2015, Bethesda, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; Gerber, Richard

    The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs.
To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  14. Hadron electric polarizability from lattice QCD

    NASA Astrophysics Data System (ADS)

    Alexandru, Andrei

    2017-09-01

    Electromagnetic polarizabilities are important parameters of hadron structure, describing the response of the charge and current distributions inside a hadron to an external electromagnetic field. For most hadrons these quantities are poorly constrained experimentally, since they can only be measured indirectly. Lattice QCD can be used to compute these quantities directly in terms of quark and gluon degrees of freedom, using the background field method. We present results for the neutron electric polarizability for two different quark masses, light enough to connect to chiral perturbation theory. These are currently the lightest quark masses used in polarizability studies. For each pion mass we compute the polarizability at four different volumes and perform an infinite-volume extrapolation. We also discuss the effect of turning on the coupling between the background field and the sea quarks. A.A. is supported in part by National Science Foundation CAREER Grant PHY-1151648 and by U.S. DOE Grant No. DE-FG02-95ER40907.
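    The four-volume, infinite-volume extrapolation can be sketched as a least-squares fit of a finite-volume ansatz. The form alpha(L) = alpha_inf + c/L used below is a hypothetical illustration; the abstract does not state the functional form actually used in the study, and the data here are synthetic.

```python
def fit_infinite_volume(Ls, alphas):
    """Linear least-squares fit of alpha(L) = a_inf + c / L.
    Returns (a_inf, c): the L -> infinity extrapolation and the slope."""
    xs = [1.0 / L for L in Ls]
    n = len(xs)
    sx, sy = sum(xs), sum(alphas)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, alphas))
    c = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a_inf = (sy - c * sx) / n
    return a_inf, c

# Synthetic data following alpha(L) = 2.0 + 8.0 / L at four volumes,
# mirroring the four-volume extrapolation described in the abstract.
Ls = [24, 32, 40, 48]
alphas = [2.0 + 8.0 / L for L in Ls]
a_inf, c = fit_infinite_volume(Ls, alphas)
print(f"alpha(inf) = {a_inf:.3f}, c = {c:.3f}")
```

    With exact synthetic input the fit recovers the generating parameters; with real lattice data one would propagate statistical errors through the fit as well.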

  15. Shock Interaction with Random Spherical Particle Beds

    NASA Astrophysics Data System (ADS)

    Neal, Chris; Mehta, Yash; Salari, Kambiz; Jackson, Thomas L.; Balachandar, S. "Bala"; Thakur, Siddharth

    2016-11-01

    In this talk we present results from fully resolved simulations of shock interaction with a randomly distributed bed of particles. Multiple simulations were carried out, varying the number of particles to isolate the effect of volume fraction. The major focus of these simulations was to understand (1) the effect of the shockwave and volume fraction on the forces experienced by the particles, (2) the effect of the particles on the shock wave, and (3) fluid-mediated particle-particle interactions. The peak drag force on particles at different volume fractions shows a downward trend as the depth of the bed increases, which can be attributed to dissipation of energy as the shockwave travels through the bed of particles. One of the fascinating observations from these simulations was the fluctuations in different quantities due to the presence of multiple particles and their random distribution. These are large simulations with hundreds of particles, resulting in a large amount of data. We present a statistical analysis of the data and make relevant observations. The average pressure in the computational domain is computed to characterize the strengths of the reflected and transmitted waves. We also present flow field contour plots to support our observations. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.
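    The volume fraction being varied across these simulations has a simple definition, sketched here with made-up numbers (not the study's actual parameters): the fraction of the bed region occupied by particle volume.

```python
import math

def particle_volume_fraction(n_particles, radius, bed_volume):
    """Volume fraction of a bed of equal spheres: total sphere volume
    divided by the volume of the region containing them."""
    sphere_volume = (4.0 / 3.0) * math.pi * radius**3
    return n_particles * sphere_volume / bed_volume

# Hypothetical bed: 500 spheres of radius 0.05 mm in a 1 mm^3 region.
phi = particle_volume_fraction(500, 0.05, 1.0)
print(f"volume fraction: {phi:.3f}")
```

    Holding the bed volume fixed while changing the particle count, as described above, sweeps this fraction directly.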

  16. Microgravity Science Glovebox

    NASA Technical Reports Server (NTRS)

    2001-01-01

    A computer-generated drawing shows the relative scale and working space of the Microgravity Science Glovebox (MSG), being developed by NASA and the European Space Agency to provide a large working volume for hands-on science experiments aboard the International Space Station (ISS). The person at the glovebox represents a 95th-percentile American male. The MSG will be deployed first to the Destiny laboratory module and later moved to ESA's Columbus Attached Payload Module. Each module will be filled with International Standard Payload Racks (green) attached to standoff fittings (yellow) that hold the racks in position; Destiny is six racks in length. Scientists will use the MSG to carry out multidisciplinary studies in combustion science, fluid physics and materials science. The MSG is managed by NASA's Marshall Space Flight Center. (Credit: NASA/Marshall)

  17. Aspects of Unstructured Grids and Finite-Volume Solvers for the Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    1992-01-01

    One of the major achievements in engineering science has been the development of computer algorithms for solving nonlinear differential equations such as the Navier-Stokes equations. In the past, limited computer resources motivated the development of efficient numerical schemes in computational fluid dynamics (CFD) utilizing structured meshes. The use of structured meshes greatly simplifies the implementation of CFD algorithms on conventional computers. Unstructured grids, on the other hand, offer an alternative for modeling complex geometries. Unstructured meshes have irregular connectivity and usually contain combinations of triangles, quadrilaterals, tetrahedra, and hexahedra. The generation and use of unstructured grids pose new challenges in CFD. The purpose of this note is to present recent developments in unstructured grid generation and flow solution technology.
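    The irregular connectivity mentioned above can be illustrated with a minimal data structure (an illustrative sketch, not code from the paper): cells are stored as variable-length lists of vertex indices, so triangles and quadrilaterals coexist in one mesh, and geometric quantities like cell areas are computed per cell from the connectivity.

```python
# A tiny 2-D unstructured mesh: vertex coordinates plus per-cell
# connectivity (lists of vertex indices, variable length).
vertices = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (2.0, 0.5)]
cells = [
    [0, 1, 2, 3],   # quadrilateral
    [1, 4, 2],      # triangle
]

def cell_area(cell):
    """Shoelace formula: works for any simple polygon, so the same code
    handles triangles and quadrilaterals alike."""
    area = 0.0
    for i in range(len(cell)):
        x1, y1 = vertices[cell[i]]
        x2, y2 = vertices[cell[(i + 1) % len(cell)]]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

print([cell_area(c) for c in cells])  # [1.0, 0.5]
```

    A finite-volume solver builds on exactly this kind of cell list, accumulating fluxes over each cell's faces regardless of element type.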

  18. [Introduction].

    PubMed

    Alberts, Gerard; van den Bogaard, Adrienne

    2008-01-01

    Along with international trends in the history of computing, Dutch contributions over the past twenty years have moved away from a focus on machinery to the broader scope of the use of computers, the appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such "home" interest started in 1987 with the work of Eda Kranakis--then active in The Netherlands--commissioned by the national bureau for technology assessment, and of Gerard Alberts, who turned a commemorative volume of the Mathematical Center into a history of that institute. The history of computing in The Netherlands made a major leap in the spring of 1994, when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, the history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation by a team led by Adrienne van den Bogaard. Dutch research reflected the international attention to professionalisation issues (Ensmenger, Haigh) very early on, in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation expected by the (thus outdated) sociology of professions. The history of computing not only takes use and users into consideration but finally, as one may say, confronts the technological side of putting the machine to use, software, head on.
The groundbreaking works of the 2000 Paderborn meeting and of Martin Campbell-Kelly resonate in work done in The Netherlands and, recently, in a major research project sponsored by the European Science Foundation: Software for Europe. The four contributions to this issue offer a true cross-section of ongoing history of computing in The Netherlands. Gerard Alberts and Huub de Beer return to the earliest computers at the Mathematical Center. As they do so from the perspective of using the machines, the result is, let us say, remarkable. Adrienne van den Bogaard compares the styles of software as practiced by Van der Poel and Dijkstra: so much did these two pioneers have in common, so different were the consequences they drew. Frank Veraart treats us to an excerpt from his recent dissertation on the domestication of microcomputer technology: the appropriation of computing technology is shown through the role of intermediate actors. Onno de Wit, finally, gives an account of the development, prior to the internet, of a national data communication network among large-scale users and its remarkable persistence under competition with new network technologies.

  19. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and the growing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves capturing data digitally, direct from instruments, processing it by computer, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is opening the processing of high volumes of data to an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These new technology developments require that scientists become more skilled in data management and/or more computer literate. In almost every science discipline there is now an X-informatics branch and a computational-X branch (e.g., geoinformatics and computational geoscience): both require a new breed of researcher with skills in the science fundamentals and also in some aspects of ICT (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. 
Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those tackling current grand challenges at an international level, on issues such as climate change, hazard prediction and the sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis takes months and requires painstaking attention to detail to ensure that the anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their careers advance. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments, and they are critical to the success of the open source software movement. The fact remains, however, that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dang, Liem X.; Schenter, Gregory K.

    To enhance our understanding of the solvent exchange mechanism in liquid methanol, we report a systematic study of this process using molecular dynamics simulations. We use transition state theory, the Impey-Madden-McDonald method, the reactive flux method, and Grote-Hynes theory to compute the rate constants for this process. Solvent coupling was found to dominate, resulting in a significantly small transmission coefficient. We predict a positive activation volume for the methanol exchange process. The essential features of the dynamics of the system, as well as the pressure dependence, are recovered from a Generalized Langevin Equation description of the dynamics. We find that the dynamics and response to anharmonicity can be decomposed into two time regimes, one corresponding to short-time response (< 0.1 ps) and one to long-time response (> 5 ps). An effective characterization of the process results from launching dynamics from the planar hypersurface corresponding to Grote-Hynes theory. This results in improved numerical convergence of correlation functions. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
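The relationship between the transition-state-theory estimate and the solvent-corrected rate mentioned above can be sketched with the standard Eyring expression scaled by a transmission coefficient, k = kappa * k_TST. The sketch below is illustrative only; the barrier height and kappa are made-up numbers, not values from this study.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # molar gas constant, J/(mol*K)

def tst_rate(delta_g_act, temperature):
    """Eyring transition-state-theory rate constant (1/s) for an
    activation free energy delta_g_act given in J/mol."""
    return (KB * temperature / H) * math.exp(-delta_g_act / (R * temperature))

def corrected_rate(delta_g_act, temperature, kappa):
    """TST rate scaled by a transmission coefficient kappa <= 1, as
    estimated from reactive-flux or Grote-Hynes calculations; strong
    solvent coupling shows up as a small kappa."""
    return kappa * tst_rate(delta_g_act, temperature)

# Illustrative numbers only (not from this study): a 40 kJ/mol barrier
# at 298 K with strong solvent coupling (kappa = 0.1).
k_tst = tst_rate(40.0e3, 298.0)
k_corr = corrected_rate(40.0e3, 298.0, 0.1)
print(k_tst, k_corr)
```

A small kappa, as reported in the abstract, means the true rate falls well below the pure TST estimate.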

  1. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    DTIC Science & Technology

    2007-10-28

    Software Engineering, FASE 2005, volume 3442 of Lecture Notes in Computer Science, pages 175--189. Springer, 2005. Andreas Bauer, Martin Leucker, and Jonathan ...of personnel receiving master's degrees NAME Markus Strohmeier Gerrit Hanselmann Jonathan Streit Ernst Sassen 4 Total Number: Names of personnel...developed and documented mainly within the master's thesis by Jonathan Streit [Str06]: • Jonathan Streit. Development of a programming language like tem

  2. dfnWorks: A discrete fracture network framework for modeling subsurface flow and transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyman, Jeffrey D.; Karra, Satish; Makedonska, Nataliia

    dfnWorks is a parallelized computational suite to generate three-dimensional discrete fracture networks (DFN) and simulate flow and transport. Developed at Los Alamos National Laboratory over the past five years, it has been used to study flow and transport in fractured media at scales ranging from millimeters to kilometers. The networks are created and meshed using dfnGen, which combines FRAM (the feature rejection algorithm for meshing) methodology to stochastically generate three-dimensional DFNs with the LaGriT meshing toolbox to create a high-quality computational mesh representation. The representation produces a conforming Delaunay triangulation suitable for high-performance-computing finite volume solvers in an intrinsically parallel fashion. Flow through the network is simulated in dfnFlow, which utilizes the massively parallel subsurface flow and reactive transport finite volume code PFLOTRAN. A Lagrangian approach to simulating transport through the DFN is adopted within dfnTrans to determine pathlines and solute transport through the DFN. Example applications of this suite in the areas of nuclear waste repository science, hydraulic fracturing and CO2 sequestration are also included.
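The stochastic generation step can be illustrated with a toy sketch: fractures modeled as randomly placed, randomly oriented disks whose radii follow a truncated power law, a distribution commonly used for fracture sizes. This is a minimal illustration, not the FRAM algorithm; production generators also reject fractures that would degrade mesh quality.

```python
import math
import random

def sample_power_law(r_min, r_max, alpha, rng):
    """Inverse-CDF sample of a truncated power-law radius,
    p(r) ~ r**(-alpha), a common model for fracture sizes."""
    u = rng.random()
    a = 1.0 - alpha
    return (r_min**a + u * (r_max**a - r_min**a)) ** (1.0 / a)

def random_unit_normal(rng):
    """Uniformly distributed fracture-plane normal on the unit sphere."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

def generate_dfn(n_fractures, domain=100.0, r_min=1.0, r_max=20.0,
                 alpha=2.5, seed=42):
    """Generate a toy DFN: each fracture is a disk with a random center
    in a cubic domain, a power-law radius, and a random orientation.
    (Real generators also apply rejection criteria before meshing.)"""
    rng = random.Random(seed)
    fractures = []
    for _ in range(n_fractures):
        center = tuple(rng.uniform(0.0, domain) for _ in range(3))
        fractures.append({
            "center": center,
            "radius": sample_power_law(r_min, r_max, alpha, rng),
            "normal": random_unit_normal(rng),
        })
    return fractures

network = generate_dfn(200)
print(len(network), min(f["radius"] for f in network))
```

In a full suite, the generated disks would then be intersected and meshed before being handed to a flow solver.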

  3. dfnWorks: A discrete fracture network framework for modeling subsurface flow and transport

    DOE PAGES

    Hyman, Jeffrey D.; Karra, Satish; Makedonska, Nataliia; ...

    2015-11-01

    dfnWorks is a parallelized computational suite to generate three-dimensional discrete fracture networks (DFN) and simulate flow and transport. Developed at Los Alamos National Laboratory over the past five years, it has been used to study flow and transport in fractured media at scales ranging from millimeters to kilometers. The networks are created and meshed using dfnGen, which combines FRAM (the feature rejection algorithm for meshing) methodology to stochastically generate three-dimensional DFNs with the LaGriT meshing toolbox to create a high-quality computational mesh representation. The representation produces a conforming Delaunay triangulation suitable for high-performance-computing finite volume solvers in an intrinsically parallel fashion. Flow through the network is simulated in dfnFlow, which utilizes the massively parallel subsurface flow and reactive transport finite volume code PFLOTRAN. A Lagrangian approach to simulating transport through the DFN is adopted within dfnTrans to determine pathlines and solute transport through the DFN. Example applications of this suite in the areas of nuclear waste repository science, hydraulic fracturing and CO2 sequestration are also included.

  4. The Space Technology 5 Avionics System

    NASA Technical Reports Server (NTRS)

    Speer, Dave; Jackson, George; Stewart, Karen; Hernandez-Pellerano, Amri

    2004-01-01

    The Space Technology 5 (ST5) mission is a NASA New Millennium Program project that will validate new technologies for future space science missions and demonstrate the feasibility of building, launching and operating multiple, miniature spacecraft that can collect research-quality in-situ science measurements. The three satellites in the ST5 constellation will be launched into a sun-synchronous Earth orbit in early 2006. ST5 fits into the 25-kilogram and 24-watt class of very small but fully capable spacecraft. The new technologies and design concepts for a compact power and command and data handling (C&DH) avionics system are presented. The 2-card ST5 avionics design incorporates new technology components while being tightly constrained in mass, power and volume. In order to hold down the mass and volume, and to qualify new technologies for future use in space, high-efficiency triple-junction solar cells and a lithium-ion battery were baselined into the power system design. The flight computer is co-located with the power system electronics in an integral spacecraft structural enclosure called the card cage assembly. The flight computer has a full set of uplink, downlink and solid-state recording capabilities, and it implements a new CMOS Ultra-Low Power Radiation Tolerant logic technology. There were a number of challenges imposed by the ST5 mission. Specifically, designing a micro-sat class spacecraft demanded that minimizing mass, volume and power dissipation drive the overall design. The result is a very streamlined approach that still strives to maintain a high level of capability. The mission's radiation requirements, along with the low-voltage DC power distribution, limited the selection of analog parts that can operate within these constraints. The challenge of qualifying new technology components for the space environment within a short development schedule was another hurdle. 
The mission requirements also demanded magnetic cleanliness in order to reduce the effect of stray (spacecraft-generated) magnetic fields on the science-grade magnetometer.

  5. Network-based approaches to climate knowledge discovery

    NASA Astrophysics Data System (ADS)

    Budich, Reinhard; Nyberg, Per; Weigel, Tobias

    2011-11-01

    Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.

  6. Big Data Ecosystems Enable Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, Terence J.; Kleese van Dam, Kerstin

    Over the past 5 years, advances in experimental, sensor and computational technologies have driven the exponential growth in the volumes, acquisition rates, variety and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology: data intensive science. Data intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized and visualized results, allowing them to carry out 'experiments' in data.

  7. Georgetown Institute for Cognitive and Computational Sciences

    DTIC Science & Technology

    2004-04-01

    lumbar DRG after formalin injection into the hindpaw. Dilute formalin (1.8%) was injected into the rat hindpaw and DRG were harvested 30 minutes later...staining (Figure 140, arrows) on the side ipsilateral to the nerve crush. In the lumbar spinal cord, the site of sciatic innervation, there was a dramatic...Proteases in traumatic brain injury. Proteases in Biology and Disease, Volume 3: Proteases in the Brain, Edited by Nigel Hooper and Uwe Lendeckel, in

  8. Designing an Advanced Instructional Design Advisor: Principles of Instructional Design. Volume 2

    DTIC Science & Technology

    1991-05-01

    ones contained in this paper would comprise a substantial part of the knowledge base for the AIDA...the classroom (e.g., computer simulation models can be used to enhance CBI). The Advanced Instructional Design Advisor is a project aimed at providing... model shares with its variations. Tennyson then identifies research-based prescriptions from the cognitive sciences which should become part of ISD in

  9. United States Air Force Research Initiation Program for 1988. Volume 2

    DTIC Science & Technology

    1990-04-01

    Specialty: Modeling and Simulation ENGINEERING AND SERVICES CENTER (Tyndall Air Force Base) Dr. Wayne A. Charlie Dr. Peter Jeffers (1987) Colorado State...Michael Sydor University of New Hampshire University of Minnesota Specialty: Systems Modeling & Controls Specialty: Optics, Material Science Dr. John...9MG-025 4 Modeling and Simulation on Microcomputers, 1989 760-7MG-070 5 Two Dimensional MHD Simulation of Dr. Manuel A

  10. Proceedings of Selected Research Paper Presentations at the Convention of the Association for Educational Communications and Technology and Sponsored by the Research and Theory Division (11th, Dallas, Texas, February 1-5, 1989).

    ERIC Educational Resources Information Center

    Simonson, Michael R., Ed.; Frey, Diane, Ed.

    1989-01-01

    The 46 papers in this volume represent some of the most current thinking in educational communications and technology. Individual papers address the following topics: gender differences in the selection of elective computer science courses and in the selection of non-traditional careers; instruction for individuals with different cognitive styles;…

  11. Computing Science and Statistics. Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-01

    the dough, turbulent fluid flow, the time between drips of water from a faucet, Brownian motion...behavior changes radically when the population growth...cookie which clearly is appropriate as after-dinner fun..."the discrete parameter analogue of continuous parameter time series analysis". I strongly...methods. One problem that statisticians traditionally seem to...Your fortune cookie of the night reads: "You have good friends who will come to your aid in

  12. Northeast Artificial Intelligence Consortium (NAIC). Volume 12. Computer Architecture for Very Large Knowledge Bases

    DTIC Science & Technology

    1990-12-01

    data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.

  13. Report on Technology Horizons: A Vision for Air Force Science and Technology During 2010-2030. Volume 1

    DTIC Science & Technology

    2010-05-15

    flow and decision processes across the air and space domains. It thus comprises traditional wired and fiber-optic computer networks based on...dual flow path design allows high volumetric efficiency, and high cruise speed provides significantly increased survivability. Vertical takeoff...emerging “third-stream engine architectures” can enable constant mass flow engines that provide further reductions in fuel consumption. A wide

  14. Journal of Naval Science. Volume 2, Number 2. April 1976

    DTIC Science & Technology

    1976-04-01

    with bold lines to permit reduction in block making. A recent photograph and biographical note of the Author(s) will also be welcomed. Views and...Research Laboratory and of the Naval Underwater Systems Center aboard. The U.S. National Aeronautics and Space Administration (NASA) provided...F. Garcia. Fault Isolation Computer Methods. NASA Contractor Report CR-1758. February 1971. P. A. Payne, D. R. Towill and K. J. Baker

  15. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also present a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3 years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.

  16. It's All About the Data: Workflow Systems and Weather

    NASA Astrophysics Data System (ADS)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. 
Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, e.g., captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real time information. If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
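A minimal sketch of the kind of provenance record described above (executable version, configuration parameters, input products, execution environment, owner) might look as follows; the field names and example values are hypothetical, not the LEAD system's actual schema.

```python
import hashlib
import json
import os
import platform
from datetime import datetime, timezone

def provenance_record(model_version, parameters, input_products):
    """Assemble a minimal provenance record: executable version,
    configuration parameters, input data products, execution
    environment, and owner. The schema is illustrative only."""
    return {
        "model_version": model_version,
        "parameters": parameters,
        "inputs": [
            {
                "name": name,
                # A content hash lets a later reader verify that exactly
                # these input bytes produced the result.
                "sha256": hashlib.sha256(data).hexdigest(),
            }
            for name, data in input_products
        ],
        "environment": {
            "python": platform.python_version(),
            "platform": platform.platform(),
        },
        "owner": os.environ.get("USER", "unknown"),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical forecast run: model name, configuration and input are made up.
record = provenance_record(
    "forecast-model-1.0",
    {"grid_km": 4, "lead_hours": 6},
    [("radar_volume_001", b"...raw observation bytes...")],
)
print(json.dumps(record, indent=2))
```

Automating exactly this kind of capture at each workflow step is what makes lineage tractable under high-volume conditions.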

  17. Climatic response variability and machine learning: development of a modular technology framework for predicting bio-climatic change in Pacific Northwest ecosystems

    NASA Astrophysics Data System (ADS)

    Seamon, E.; Gessler, P. E.; Flathers, E.

    2015-12-01

    The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis for large scientific computing efforts are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009; Clarke, 2009; Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011; Turner, 2014; Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge the gap between these perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) increasing growth in the volume and amount of data, 2) a growing data-intensive science base with challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research attempts to address this gap by developing a modular technology framework for data science integration efforts, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, IPython, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bio-climatic variation as they relate to Pacific Northwest ecosystem regions. 
Our preliminary results, using historical NetCDF climate data for calibration across the inland Pacific Northwest region (Abatzoglou and Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods applied to bioclimatic indicators.
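As a minimal illustration of a supervised classifier applied to bioclimatic indicator vectors, the sketch below trains a nearest-centroid classifier on made-up (temperature, precipitation) samples; the data, labels and indicator choices are hypothetical and are not taken from the study.

```python
import math
from collections import defaultdict

def train_centroids(samples):
    """Nearest-centroid training: average the indicator vectors
    (e.g. temperature, precipitation) seen for each ecosystem label."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for features, label in samples:
        if sums[label] is None:
            sums[label] = [0.0] * len(features)
        for i, x in enumerate(features):
            sums[label][i] += x
        counts[label] += 1
    return {label: [s / counts[label] for s in vec]
            for label, vec in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is nearest in Euclidean distance."""
    def dist(label):
        return math.sqrt(sum((a - b) ** 2
                             for a, b in zip(centroids[label], features)))
    return min(centroids, key=dist)

# Hypothetical bioclimatic indicators: (mean temp C, annual precip mm/100).
training = [
    ((7.0, 8.0), "maritime"), ((8.0, 9.0), "maritime"),
    ((10.0, 3.0), "steppe"),  ((11.0, 2.5), "steppe"),
]
centroids = train_centroids(training)
print(classify(centroids, (7.5, 8.5)))  # near the maritime centroid
```

Detecting an ecosystem "shift" then amounts to classifying the same grid cell's indicators at two points in time and comparing the labels.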

  18. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    NASA Technical Reports Server (NTRS)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-its-kind endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This contrasts with traditional NASA Earth Science missions, where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing, well into the 5-terabyte/day and tens-of-thousands-of-jobs range, both of which pose a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.
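The scale of the stated 5 terabyte/day collection rate can be put in perspective with a back-of-envelope calculation; the mission duration and replication factor below are assumptions for illustration, not mission parameters.

```python
# 5 TB/day is stated in the abstract; mission length and replication
# factor are illustrative assumptions.
TB_PER_DAY = 5
MISSION_YEARS = 3     # assumed mission duration
REPLICAS = 2          # assumed archive replication factor

days = MISSION_YEARS * 365
raw_tb = TB_PER_DAY * days             # raw collection over the mission
total_pb = raw_tb * REPLICAS / 1000.0  # 1 PB = 1000 TB

print(f"{raw_tb} TB raw over {days} days -> {total_pb:.2f} PB archived")
```

Even under these modest assumptions the archive lands in the double-digit petabyte range, which is what motivates the cloud-based trade studies.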

  19. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  20. NASA information sciences and human factors program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to increase substantially the capability for future missions of on-board processing and recording and to provide high-speed, high-volume computational systems that are anticipated for missions such as the evolutionary Space Station and Earth Observing System.

  1. Instrumentation for Airwake Measurements on the Flight Deck of a FFG-7

    DTIC Science & Technology

    1991-11-01

    volatile RAM to the computer hard disk with a unique file name based on time and date. At an opportune time the data file(s) are manually transferred...1967 6 Royal Air Force Manual (Volume D) AP3456D Al-i APPENDIX 1 GENERAL SPECIFICATION FOR VADAR VADAR was developed by the Instrumentation and Trials...TTCP HTP -6) N. Matheson N. Pollock DJ. Sherman Materials Research Laboratory Director/Library Defence Science & Technology Organisation Salisbury

  2. Tight-Binding Approach to Computational Materials Science, Symposium Held December 1-3, 1997, Boston, Massachusetts, USA. Volume 491

    DTIC Science & Technology

    1998-01-01

    to their large unit size and to experimental difficulties in determining geometries of carbon-based complex materials because of the weak X-ray...qualitative relationship between the calculated local density of states and the experimental X-ray photoelectron spectra (XPS) and the Bremsstrahlung...from interaction schemes and allows complete data sets from different sources (neutron or X-ray diffraction, chemical constraints) to be fitted. In

  3. United States Air Force Summer Research Program -- 1993. Volume 1. Program Management Report

    DTIC Science & Technology

    1993-12-01

    IEEE Spectrum and Physics Today. High school applicants can participate only in laboratories located no more than 20 miles from their residence. Tailored...faculty and $37/day for graduate students whose homes were more than 50 miles from the laboratory. Transportation to the laboratory at the beginning of...TX 78212- 7200 Branting, Luther Field: Dept of Computer Science Assistant Professor, PhD Laboratory: AL/HR PO Box 3682 University of Wyoming Vol-Page

  4. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies how to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway how to model these interactions effectively withmore » advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids have been developed that serve as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have lied on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. 
The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  5. Solid earth science in the 1990s. Volume 2: Panel reports

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This is the second volume of a three-volume report. Volume 2, Panel Reports, outlines a plan for solid Earth science research for the next decade. The science panels addressed the following fields: plate motion and deformation, lithospheric structure and evolution, volcanology, Earth structure and dynamics, Earth rotation and reference frames, and geopotential fields.

  6. Gridded Hourly Text Products: A TRMM Data Reduction Approach

    NASA Technical Reports Server (NTRS)

    Stocker, Erich; Kwiatkowski, John; Kelley, Owen; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    The quantity of precipitation data from satellite-based observations is a blessing and a curse. The sheer volume of the data makes it difficult for many researchers to use in targeted applications. This volume increases further as algorithm improvements lead to the reprocessing of mission data. In addition to the overall volume of data, the size and format complexity of orbital granules contribute to the difficulty in using all the available data. Finally, the number of different instruments available to measure rainfall and related parameters further contributes to the volume concerns. In summary, we have an embarrassment of riches. The science team of the Tropical Rainfall Measuring Mission (TRMM) recognized this dilemma and has developed a strategy to address it. The TRMM Science Data and Information System (TSDIS) produces, at the direction of the Joint TRMM Science Team, a number of instantaneous rainfall products. The TRMM Microwave Imager (TMI), the Precipitation Radar (PR), and a combined TMI/PR are the key "instruments" used in this production. Each of these products contains an entire orbit of data. The algorithm code computes not just rain rates but a large number of other physical parameters, as well as information needed for monitoring algorithm performance. That makes these products very large. For example, a single orbit of the TMI rain rate product is 99 MB, a single orbit of the combined product yields a granule that is 158 MB, while the 80 vertical levels of rain information from the PR yield an orbital product of 253 MB. These are large products that are often difficult for science users to transfer electronically to their sites, especially if they want a long period of time. Level 3 gridded products are much smaller, but their 5- or 30-day temporal resolution is insufficient for many researchers. In addition, TRMM standard products are produced in the HDF format.
While a large number of user-friendly tools are available to hide the details of the format (including a toolkit developed at TSDIS for the TRMM science team), many potential users shy away.
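
The data-reduction idea, averaging orbital rain-rate pixels into a coarse fixed grid and emitting plain text, can be sketched as follows. This is a minimal illustration with made-up (lat, lon, rain rate) pixel tuples and a hypothetical 0.5-degree cell size; it is not the actual TSDIS production algorithm.

```python
from collections import defaultdict

def grid_hourly(pixels, cell_deg=0.5):
    """Average orbital rain-rate pixels (lat, lon, rate) into a fixed
    lat/lon grid. A sketch of the data-reduction idea only."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, rate in pixels:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        sums[key] += rate
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

def to_text(grid, cell_deg=0.5):
    """Emit one 'lat lon mean_rate' line per occupied cell."""
    lines = []
    for (i, j), mean in sorted(grid.items()):
        lines.append(f"{i * cell_deg:.2f} {j * cell_deg:.2f} {mean:.2f}")
    return "\n".join(lines)
```

A full orbit of per-pixel parameters shrinks to a few kilobytes of human-readable text, at the cost of spatial and parameter resolution.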

  7. Volunteered Cloud Computing for Disaster Management

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity, usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity, if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern or trend detection, and large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context.
Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
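
The core of such a platform, distributing independent subtasks to participating machines while keeping an accountability ledger of who computed what, can be sketched with a shared work queue. The threads below stand in for volunteered virtual machines; this is a simplified illustration, not the authors' actual protocol.

```python
import queue
import threading

def run_volunteered(subtasks, worker_ids, work):
    """Distribute independent subtasks to 'volunteer' workers.
    Returns the results plus a ledger recording which worker
    handled each task (the accountability requirement)."""
    tasks = queue.Queue()
    for t in subtasks:
        tasks.put(t)
    results, ledger = {}, {}
    lock = threading.Lock()

    def volunteer(wid):
        while True:
            try:
                t = tasks.get_nowait()   # pull the next subtask
            except queue.Empty:
                return                   # no work left for this volunteer
            r = work(t)
            with lock:                   # record result and attribution
                results[t] = r
                ledger[t] = wid

    threads = [threading.Thread(target=volunteer, args=(w,)) for w in worker_ids]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results, ledger
```

Because each subtask is independent, stragglers or dropped volunteers only delay their own queue entries, which is what makes "embarrassingly parallel" workloads a good fit.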

  8. Data management and its role in delivering science at DOE BES user facilities - Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.; Jemian, Pete R.; Luitz, Steffen; Salnikov, Andrei A.; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Green, Mark L.

    2009-07-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research [1]. We trace back almost 30 years of history across selected user facilities, illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques, such as X-ray and neutron scattering, imaging, and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data, challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility-produced data. Trends indicate that this will continue to be the case for some time. Thus users face a quandary of how to manage today's data complexity and size, as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing, thereby providing users access to the resources they need [2]. Portal-based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next-tier cross-instrument, cross-facility scientific research fuelled by smart applications residing upon user computer resources.
We can learn from the medical imaging community, which has been working since the early 1990s to integrate data from across multiple modalities to achieve better diagnoses [3]; similarly, data fusion across BES facilities will lead to new scientific discoveries.

  9. Infrastructure Systems for Advanced Computing in E-science applications

    NASA Astrophysics Data System (ADS)

    Terzo, Olivier

    2013-04-01

    In the e-science field there is a growing need for computing infrastructure that is more dynamic and customizable, with an "on demand" model of use that follows the exact request in terms of resources and storage capacities. The integration of grid and cloud infrastructure solutions allows us to offer services that can adapt their availability by scaling resources up and down. The main challenge for e-science domains will be to implement infrastructure solutions for scientific computing that dynamically adapt to the demand for computing resources, with a strong emphasis on optimizing the use of computing resources to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis all contribute to increasing the complexity of applications that require high processing power and storage for a limited time, and that often exceed the computational resources available to the majority of laboratories and research units in an organization. Very often it is necessary to adapt, or even rethink, tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to adapt them to deployment on cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics (next-generation sequencing), computational electromagnetics, and radio occultation, the complexity of the analysis raises several issues, such as processing time, the scheduling of processing tasks, the storage of results, and multi-user environments. For these reasons, it is necessary to rethink the way e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the use of the IaaS, PaaS, and SaaS layers.
Another important focus is on creating and using hybrid infrastructure, typically a federation between private and public clouds: when all the resources owned by the organization are in use, a federated cloud infrastructure makes it easy to add resources from the public cloud to follow the needs in terms of computation and storage, and to release them when processing is finished. Following the hybrid model, the scheduling approach is important for managing both cloud models. Thanks to this infrastructure model, resources are always available for additional requests for IT capacity, which can be used "on demand" for a limited time without having to purchase additional servers.

  10. Terascale direct numerical simulations of turbulent combustion using S3D

    NASA Astrophysics Data System (ADS)

    Chen, J. H.; Choudhary, A.; de Supinski, B.; DeVries, M.; Hawkes, E. R.; Klasky, S.; Liao, W. K.; Ma, K. L.; Mellor-Crummey, J.; Podhorszki, N.; Sankaran, R.; Shende, S.; Yoo, C. S.

    2009-01-01

    Computational science is paramount to the understanding of underlying processes in internal combustion engines of the future that will utilize non-petroleum-based alternative fuels, including carbon-neutral biofuels, and burn in new combustion regimes that will attain high efficiency while minimizing emissions of particulates and nitrogen oxides. Next-generation engines will likely operate at higher pressures, with greater amounts of dilution, and utilize alternative fuels that exhibit a wide range of chemical and physical properties. Therefore, there is a significant role for high-fidelity simulations, direct numerical simulations (DNS), specifically designed to capture key turbulence-chemistry interactions in these relatively uncharted combustion regimes, and in particular, that can discriminate the effects of differences in fuel properties. In DNS, all of the relevant turbulence and flame scales are resolved numerically using high-order accurate numerical algorithms. As a consequence, terascale DNS are computationally intensive, require massive amounts of computing power, and generate tens of terabytes of data. Recent results from terascale DNS of turbulent flames are presented here, illustrating its role in elucidating flame stabilization mechanisms in a lifted turbulent hydrogen/air jet flame in a hot air coflow, and the flame structure of a fuel-lean turbulent premixed jet flame. Computing at this scale requires close collaborations between computer and combustion scientists to provide optimized scalable algorithms and software for terascale simulations, efficient collective parallel I/O, tools for volume visualization of multiscale, multivariate data, and automation of the combustion workflow. The enabling computer science, applied to combustion science, is also required in many other terascale physics and engineering simulations.
In particular, performance monitoring is used to identify the performance of key kernels in the DNS code, S3D, and especially memory-intensive loops in the code. Through the careful application of loop transformations, data reuse in cache is exploited, thereby reducing memory bandwidth needs and, hence, improving S3D's nodal performance. To enhance collective parallel I/O in S3D, an MPI-I/O caching design is used to construct a two-stage write-behind method for improving the performance of write-only operations. The simulations generate tens of terabytes of data requiring analysis. Interactive exploration of the simulation data is enabled by multivariate time-varying volume visualization. The visualization highlights spatial and temporal correlations between multiple reactive scalar fields using an intuitive user interface based on parallel coordinates and time histograms. Finally, an automated combustion workflow is designed using Kepler to manage large-scale data movement, data morphing, and archival, and to provide a graphical display of run-time diagnostics.
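
The write-behind idea behind the collective I/O optimization, accumulating many small writes in memory and issuing them as a few large ones, can be sketched independently of MPI. The toy buffer below illustrates the strategy only; the `sink` callable standing in for the expensive file-system write is an assumption of this sketch, not S3D's actual MPI-I/O caching layer.

```python
class WriteBehindBuffer:
    """Accumulate small writes and flush them as one large write,
    trading memory for far fewer trips to the slow sink."""

    def __init__(self, sink, threshold):
        self.sink = sink            # callable that performs the big write
        self.threshold = threshold  # flush once this many bytes are buffered
        self.buf = []
        self.size = 0

    def write(self, chunk):
        self.buf.append(chunk)
        self.size += len(chunk)
        if self.size >= self.threshold:
            self.flush()

    def flush(self):
        if self.buf:
            self.sink(b"".join(self.buf))  # one large write instead of many
            self.buf, self.size = [], 0
```

In the two-stage MPI-I/O design the same principle applies per process group: writes land in a cache first and reach the parallel file system later, so compute can proceed without waiting on each small write.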

  11. Fundamentals handbook of electrical and computer engineering. Volume 1 Circuits fields and electronics

    NASA Astrophysics Data System (ADS)

    Chang, S. S. L.

    State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.

  12. Ultrascale Visualization of Climate Data

    NASA Technical Reports Server (NTRS)

    Williams, Dean N.; Bremer, Timo; Doutriaux, Charles; Patchett, John; Williams, Sean; Shipman, Galen; Miller, Ross; Pugmire, David R.; Smith, Brian; Steed, Chad

    2013-01-01

    Fueled by exponential increases in the computational and storage capabilities of high-performance computing platforms, climate simulations are evolving toward higher numerical fidelity, complexity, volume, and dimensionality. These technological breakthroughs are coming at a time of exponential growth in climate data, with estimates of hundreds of exabytes by 2020. To meet the challenges and exploit the opportunities that such explosive growth affords, a consortium of four national laboratories, two universities, a government agency, and two private companies formed to explore the next wave in climate science. Working in close collaboration with domain experts, the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) project aims to provide high-level solutions to a variety of climate data analysis and visualization problems.

  13. An Innovative Infrastructure with a Universal Geo-spatiotemporal Data Representation Supporting Cost-effective Integration of Diverse Earth Science Data

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Rilee, M. L.

    2017-12-01

    Existing pathways for bringing together massive, diverse Earth Science datasets for integrated analyses burden end users with data packaging and management details irrelevant to their domain goals. The major data repositories focus on the archival, discovery, and dissemination of products (files) in a standardized manner. End users must download and then adapt these files using local resources and custom methods before analysis can proceed. This reduces scientific or other domain productivity, as scarce resources and expertise must be diverted to data processing. The Spatio-Temporal Adaptive Resolution Encoding (STARE) is a unifying scheme encoding geospatial and temporal information for organizing data on scalable computing/storage resources, minimizing expensive data transfers. STARE provides a compact representation that turns set-logic functions, e.g., conditional subsetting, into integer operations, and that takes into account the representative spatiotemporal resolutions of the data in the datasets, which is needed for the placement and alignment of geo-spatiotemporally diverse data on massively parallel resources. Automating important scientific functions (e.g., regridding) and computational functions (e.g., data placement) allows scientists to focus on domain-specific questions instead of expending their expertise on data processing. While STARE is not tied to any particular computing technology, we have used STARE for visualization and the SciDB array database to analyze Earth Science data on a 28-node compute cluster. STARE's automatic data placement and coupling of geometric and array indexing allow complicated data comparisons to be realized as straightforward database operations like "join." With STARE-enabled automation, SciDB+STARE provides a database interface, reducing costly data preparation, increasing the volume and variety of integrable data, and easing result sharing.
Using SciDB+STARE as part of an integrated analysis infrastructure, we demonstrate the ease of combining markedly different datasets, i.e., gridded (NMQ radar) vs. spacecraft swath (TRMM). SciDB+STARE is an important step towards a computational infrastructure for integrating and sharing diverse, complex Earth Science data and the science products derived from them.
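
The key trick of turning spatial set logic into integer operations can be illustrated with a toy hierarchical index. The quadrant encoding below is a simplified stand-in for STARE's actual scheme (which covers the sphere and also carries time and resolution information): a point is encoded by recursive quadrant subdivision, and containment of one cell in another reduces to an integer prefix test.

```python
def index(lat, lon, level):
    """Encode a point as an integer by recursive quadrant subdivision.
    Toy illustration only; not the real STARE encoding."""
    lat0, lat1, lon0, lon1 = -90.0, 90.0, -180.0, 180.0
    code = 1                         # leading 1 bit marks the level
    for _ in range(level):
        latm, lonm = (lat0 + lat1) / 2, (lon0 + lon1) / 2
        q = (lat >= latm) << 1 | (lon >= lonm)   # quadrant number, 0..3
        code = code << 2 | q
        lat0, lat1 = (latm, lat1) if lat >= latm else (lat0, latm)
        lon0, lon1 = (lonm, lon1) if lon >= lonm else (lon0, lonm)
    return code

def contains(coarse, fine):
    """Set logic as integer arithmetic: a coarse cell contains a finer
    one iff the coarse code is a prefix of the fine code."""
    while fine > coarse:
        fine >>= 2               # strip two bits = go up one level
    return fine == coarse
```

Because the codes sort hierarchically, conditional subsetting and joins on co-located data become integer range operations, which is what lets an array database execute them efficiently.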

  14. Spacelab Science Results Study. Volume 1; External Observations

    NASA Technical Reports Server (NTRS)

    Naumann, Robert J. (Compiler)

    1999-01-01

    Some of the 36 Spacelab missions were more or less dedicated to specific scientific disciplines, while others carried an eclectic mixture of experiments ranging from astrophysics to life sciences. However, the experiments can be logically classified into two general categories: those that make use of the Shuttle as an observing platform for external phenomena (including those which use the Shuttle in an interactive mode) and those which use the Shuttle as a microgravity laboratory. This first volume of the Spacelab Science Results Study is devoted to experiments of the first category. The disciplines included are Astrophysics, Solar Physics, Space Plasma Physics, Atmospheric Sciences, and Earth Sciences. Because of the large number of microgravity investigations, Volume 2 will be devoted to Microgravity Sciences, which includes Fluid Physics, Combustion Science, Materials Science, and Biotechnology, and Volume 3 will be devoted to Space Life Sciences, which studies the response and adaptability of living organisms to the microgravity environment.

  15. Web search queries can predict stock market volumes.

    PubMed

    Bordino, Ilaria; Battiston, Stefano; Caldarelli, Guido; Cristelli, Matthieu; Ukkonen, Antti; Weber, Ingmar

    2012-01-01

    We live in a computerized and networked society where many of our actions leave a digital trace and affect other people's actions. This has led to the emergence of a new data-driven research field: mathematical methods of computer science, statistical physics, and sociometry provide insights on a wide range of disciplines ranging from social science to human mobility. A recent important discovery is that search engine traffic (i.e., the number of requests submitted by users to search engines on the www) can be used to track and, in some cases, to anticipate the dynamics of social phenomena. Successful examples include unemployment levels, car and home sales, and the spreading of epidemics. A few recent works have applied this approach to stock prices and market sentiment. However, it remains unclear whether trends in financial markets can be anticipated by the collective wisdom of on-line users on the web. Here we show that daily trading volumes of stocks traded in NASDAQ-100 are correlated with daily volumes of queries related to the same stocks. In particular, query volumes anticipate in many cases peaks of trading by one day or more. Our analysis is carried out on a unique dataset of queries, submitted to an important web search engine, which enables us to also investigate user behavior. We show that the query volume dynamics emerge from the collective but seemingly uncoordinated activity of many users. These findings contribute to the debate on the identification of early warnings of financial systemic risk based on the activity of users of the www.
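
The reported anticipation effect is, at its core, a positive lagged correlation between two daily time series. The sketch below illustrates the measurement on synthetic data; it is not the paper's estimator, which is computed on a proprietary query log.

```python
def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = sum((a - mx) ** 2 for a in x) ** 0.5
    dy = sum((b - my) ** 2 for b in y) ** 0.5
    return num / (dx * dy)

def lagged_corr(queries, trades, lag):
    """Correlate query volume on day t with trading volume on day
    t + lag; a strong correlation at lag >= 1 is the 'queries
    anticipate trading' signal."""
    return pearson(queries[:len(queries) - lag], trades[lag:])
```

Scanning `lag` over a small window and locating the maximum correlation indicates by how many days one series leads the other.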

  16. Web Search Queries Can Predict Stock Market Volumes

    PubMed Central

    Bordino, Ilaria; Battiston, Stefano; Caldarelli, Guido; Cristelli, Matthieu; Ukkonen, Antti; Weber, Ingmar

    2012-01-01

    We live in a computerized and networked society where many of our actions leave a digital trace and affect other people's actions. This has led to the emergence of a new data-driven research field: mathematical methods of computer science, statistical physics, and sociometry provide insights on a wide range of disciplines ranging from social science to human mobility. A recent important discovery is that search engine traffic (i.e., the number of requests submitted by users to search engines on the www) can be used to track and, in some cases, to anticipate the dynamics of social phenomena. Successful examples include unemployment levels, car and home sales, and the spreading of epidemics. A few recent works have applied this approach to stock prices and market sentiment. However, it remains unclear whether trends in financial markets can be anticipated by the collective wisdom of on-line users on the web. Here we show that daily trading volumes of stocks traded in NASDAQ-100 are correlated with daily volumes of queries related to the same stocks. In particular, query volumes anticipate in many cases peaks of trading by one day or more. Our analysis is carried out on a unique dataset of queries, submitted to an important web search engine, which enables us to also investigate user behavior. We show that the query volume dynamics emerge from the collective but seemingly uncoordinated activity of many users. These findings contribute to the debate on the identification of early warnings of financial systemic risk based on the activity of users of the www. PMID:22829871

  17. SMV⊥: Simplex of maximal volume based upon the Gram-Schmidt process

    NASA Astrophysics Data System (ADS)

    Salazar-Vazquez, Jairo; Mendez-Vazquez, Andres

    2015-10-01

    In recent years, different algorithms for Hyperspectral Image (HI) analysis have been introduced. The high spectral resolution of these images allows the development of different algorithms for target detection, material mapping, and material identification, with applications in agriculture, security and defense, industry, etc. Therefore, from the computer science point of view, there is a fertile field of research for improving and developing algorithms in HI analysis. In some applications, the spectral pixels of a HI can be classified using laboratory spectral signatures. Nevertheless, for many others, there is not enough available prior information or spectral signatures, making any analysis a difficult task. One of the most popular algorithms for HI analysis is N-FINDR, because it is easy to understand and provides a way to unmix the original HI into the respective material compositions. However, N-FINDR is computationally expensive and its performance depends on a random initialization process. This paper proposes a novel idea to reduce the complexity of N-FINDR by implementing a bottom-up approach based on an observation from linear algebra and the use of the Gram-Schmidt process. The resulting Simplex of Maximal Volume Perpendicular (SMV⊥) algorithm is proposed for fast endmember extraction in hyperspectral imagery. This novel algorithm has complexity O(n) with respect to the number of pixels. In addition, the evidence shows that SMV⊥ calculates a bigger volume, and has lower computational time complexity than other popular algorithms, on synthetic and real scenarios.
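
The linear-algebra observation behind such bottom-up approaches can be made concrete: the volume of a simplex follows directly from Gram-Schmidt orthogonalization of its edge vectors, so the volume grows one vertex at a time. The function below is a minimal, dependency-free sketch of that fact, not the SMV⊥ implementation itself.

```python
import math

def simplex_volume(vertices):
    """k-volume of the simplex on the given vertices: Gram-Schmidt
    orthogonalize the edge vectors and take (prod of norms) / k!."""
    v0 = vertices[0]
    edges = [[a - b for a, b in zip(v, v0)] for v in vertices[1:]]
    basis, vol = [], 1.0
    for e in edges:
        w = list(e)
        for q in basis:                      # subtract projections onto basis
            d = sum(a * b for a, b in zip(w, q))
            w = [a - d * b for a, b in zip(w, q)]
        norm = math.sqrt(sum(a * a for a in w))
        vol *= norm / (len(basis) + 1)       # accumulates prod ||w_i|| / k!
        basis.append([a / norm for a in w])
    return vol
```

Adding a candidate vertex only requires orthogonalizing one new edge against the existing basis, so the volume can be updated incrementally rather than recomputed from scratch, which is the property a bottom-up endmember search exploits.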

  18. Nanotechnology at NASA Ames

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Meyyappan, Meyya; Yan, Jerry (Technical Monitor)

    2000-01-01

    Advanced miniaturization, a key thrust area to enable new science and exploration missions, provides ultrasmall sensors, power sources, communication, navigation, and propulsion systems with very low mass, volume, and power consumption. Revolutions in electronics and computing will allow reconfigurable, autonomous, 'thinking' spacecraft. Nanotechnology presents a whole new spectrum of opportunities to build device components and systems for entirely new space architectures: (1) networks of ultrasmall probes on planetary surfaces; (2) micro-rovers that drive, hop, fly, and burrow; and (3) collections of microspacecraft making a variety of measurements.

  19. Live-cell mass profiling: an emerging approach in quantitative biophysics.

    PubMed

    Zangle, Thomas A; Teitell, Michael A

    2014-12-01

    Cell mass, volume and growth rate are tightly controlled biophysical parameters in cellular development and homeostasis, and pathological cell growth defines cancer in metazoans. The first measurements of cell mass were made in the 1950s, but only recently have advances in computer science and microfabrication spurred the rapid development of precision mass-quantifying approaches. Here we discuss available techniques for quantifying the mass of single live cells with an emphasis on relative features, capabilities and drawbacks for different applications.

  20. NASA's Role in Aeronautics: A Workshop. Volume 6: Aeronautical research

    NASA Technical Reports Server (NTRS)

    1981-01-01

    While each aspect of its aeronautical technology program is important to the current preeminence of the United States in aeronautics, the most essential contributions of NASA derive from its research. Successes and challenges in NASA's efforts to improve civil and military aviation are discussed for the following areas: turbulence, noise, supercritical aerodynamics, computational aerodynamics, fuels, high temperature materials, composite materials, single crystal components, powder metallurgy, and flight controls. Spin offs to engineering and other sciences explored include NASTRAN, lubricants, and composites.

  1. Site Environmental Report for 2010, Volumes 1 & 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baskin, David; Bauters, Tim; Borglin, Ned

    2011-09-01

    LBNL is a multiprogram scientific facility operated by the UC for the DOE. LBNL's research is directed toward the physical, biological, environmental, and computational sciences, in order to deliver scientific knowledge and discoveries pertinent to DOE's missions. This annual Site Environmental Report covers activities conducted in CY 2010. The format and content of this report satisfy the requirements of DOE Order 231.1A, Environment, Safety, and Health Reporting, and the operating contract between UC and DOE.

  2. Proceedings of the Annual Acquisition Research Symposium (7th), Acquisition Research: Creating Synergy for Informed Change 12-13 May 2010. Volume 2

    DTIC Science & Technology

    2010-04-30

    delivered enhances both the teaching and learning processes. • The number of students engaged in focused acquisition research for their MBA projects...Meyers, US Navy—Lieutenant Nicholas Meyers is an MBA student in the Graduate School of Business & Public Policy at the Naval Postgraduate School . LT...Theoretic Computer Science Mathematics and Operations Research Werner Heisenberg-Weg 39 85577 Neubiberg, Germany Phone +49 89 6004 2400 Abstract

  3. Computing Science and Statistics: Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-20

    r, is set to 3.569, the population examples include: kneading ingredients into a bread eventually oscillates about 16 fixed values. However the dough ...34fun statistics". My goal is to offer leagues I said in jest "After all, regression analysis is you the equivalent of a fortune cookie which clearly is... cookie of the night reads: One problem that statisticians traditionally seem to "You have good friends who will come to your aid in have is that they

  4. An Ada Based Expert System for the Ada Version of SAtool II. Volume 1 and 2

    DTIC Science & Technology

    1991-06-06

    Integrated Computer-Aided Manufacturing (ICAM) (20). In fact, IDEF 0 stands for ICAM Definition Method Zero . IDEF0 defines a subset of SA that omits...reasoning that has been programmed). An expert’s knowledge is specific to one problem domain as opposed to knowledge about general problem-solving...techniques. General problem domains are medicine, finance, science or engineering and so forth in which an expert can solve specific problems very well

  5. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories, and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating the logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
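
The overlap technique can be sketched outside of Fortran: the illustration below uses Python threads standing in for coarray communication (all names are invented; this is not ECMWF's IFS code). It starts a halo exchange asynchronously, computes the interior while the exchange is in flight, and only then updates the cells that depend on the exchanged data.

```python
import threading
import time

def overlapped_step(halo_exchange, interior_update, boundary_update):
    """Overlap halo communication with interior computation.

    Starts the (simulated) halo exchange in a background thread,
    computes the interior cells meanwhile, then waits for the halo
    and finishes the boundary cells that depend on it.
    """
    comm = threading.Thread(target=halo_exchange)
    comm.start()
    interior_update()   # independent of the halo data
    comm.join()         # halo now available
    boundary_update()   # needs the exchanged halo

# Toy demonstration: record the order in which the phases complete.
log = []
overlapped_step(
    halo_exchange=lambda: (time.sleep(0.05), log.append("halo")),
    interior_update=lambda: log.append("interior"),
    boundary_update=lambda: log.append("boundary"),
)
print(log)  # the interior update finishes while the halo exchange is in flight
```

The payoff is that communication latency is hidden behind useful work, which is exactly what the coarray approach buys on a real interconnect.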

  6. Foundations of the Unity of Science, Toward an International Encyclopedia of Unified Science, Volume 2, Numbers 1-9.

    ERIC Educational Resources Information Center

    Neurath, Otto; And Others

    The monographs published from 1938 through 1970 under the general title of the International Encyclopedia of Unified Science are now published in two volumes (see also SE 012 543). The monographs included in this volume, and the philosophers who wrote them, are listed below. Foundations of the Social Sciences (Neurath); The Structure of Scientific…

  7. Probing the Natural World, Volume 3A, Environmental Science, Crusty Problems, and Why You're You.

    ERIC Educational Resources Information Center

    Florida State Univ., Tallahassee. Dept. of Science Education.

    This volume is the first of a three-volume, one-year program for use in junior high school, and consists of these three units: Environmental Science, Crusty Problems (earth science), and Why You're You (heredity). The environmental science unit is composed of chapters relating to these subjects: the black death (plague); energy, food chain, and…

  8. The Terra Data Fusion Project: An Update

    NASA Astrophysics Data System (ADS)

    Di Girolamo, L.; Bansal, S.; Butler, M.; Fu, D.; Gao, Y.; Lee, H. J.; Liu, Y.; Lo, Y. L.; Raila, D.; Turner, K.; Towns, J.; Wang, S. W.; Yang, K.; Zhao, G.

    2017-12-01

    Terra is the flagship of NASA's Earth Observing System. Launched in 1999, Terra's five instruments continue to gather data that enable scientists to address fundamental Earth science questions. By design, the strength of the Terra mission has always been rooted in its five instruments and the ability to fuse the instrument data together for obtaining greater quality of information for Earth Science compared to individual instruments alone. As the data volume grows and the central Earth Science questions move towards problems requiring decadal-scale data records, the need for data fusion and the ability for scientists to perform large-scale analytics with long records have never been greater. The challenge is particularly acute for Terra, given its growing volume of data (> 1 petabyte), the storage of different instrument data at different archive centers, the different file formats and projection systems employed for different instrument data, and the inadequate cyberinfrastructure for scientists to access and process whole-mission fusion data (including Level 1 data). Sharing newly derived Terra products with the rest of the world also poses challenges. As such, the Terra Data Fusion Project aims to resolve two long-standing problems: 1) How do we efficiently generate and deliver Terra data fusion products? 2) How do we facilitate the use of Terra data fusion products by the community in generating new products and knowledge through national computing facilities, and disseminate these new products and knowledge through national data sharing services? Here, we will provide an update on significant progress made in addressing these problems by working with NASA and leveraging national facilities managed by the National Center for Supercomputing Applications (NCSA). 
The problems that we faced in deriving and delivering Terra L1B2 basic, reprojected and cloud-element fusion products, such as data transfer, data fusion, processing on different computer architectures, science, and sharing, will be presented with quantitative specifics. Results from several science-specific drivers for Terra fusion products will also be presented. We demonstrate that the Terra Data Fusion Project itself provides an excellent use-case for the community addressing Big Data and cyberinfrastructure problems.

  9. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. 
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability in using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  10. Climate simulations and services on HPC, Cloud and Grid infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio

    2017-04-01

    Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially for the climate community. These paradigms are modifying the way climate applications are executed. By using these technologies, the number, variety and complexity of experiments and resources are increasing substantially. But although computational capacity is increasing, the traditional applications and tools used by the community are no longer adequate to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To solve those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework to manage a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the 7FP of the European Commission (grant agreement no. 312979); the European Regional Development Fund—ERDF and the Programa de Personal Investigador en Formación Predoctoral from Universidad de Cantabria and Government of Cantabria.

  11. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    NASA Astrophysics Data System (ADS)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

    By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons-learned for continuing development. 
Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and for storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file-systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel, distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment, where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services. In most of our work, we provisioned compute resources using a custom application called Wrangler. 
Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
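
Workflow management systems such as Pegasus execute a directed acyclic graph of tasks in dependency order. A minimal, serial sketch of that idea using Python's standard `graphlib` (the task names below are illustrative of a mosaic-style pipeline, not Montage's actual job names):

```python
from graphlib import TopologicalSorter

# A mosaic-style workflow as a DAG: each task maps to its prerequisites.
workflow = {
    "project_a":  [],
    "project_b":  [],
    "overlap":    ["project_a", "project_b"],
    "background": ["overlap"],
    "coadd":      ["background"],
}

def run_workflow(dag, run_task):
    """Execute tasks in dependency order (serially, for clarity)."""
    order = []
    for task in TopologicalSorter(dag).static_order():
        run_task(task)
        order.append(task)
    return order

order = run_workflow(workflow, run_task=lambda t: None)
print(order[-1])  # coadd runs last, after all of its ancestors
```

A real WMS adds what this sketch omits: parallel dispatch of independent tasks, data staging between sites, and retry on failure.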

  12. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. 
Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. 
A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. 
Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support. We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  13. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high data rate link between the LEO spacecraft and ground via relay satellites in geostationary orbit (GEO); second, a high data rate direct-to-ground link from LEO. Next, the paper presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
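
The link analysis described rests on a standard link-budget calculation: free-space path loss plus atmospheric loss subtracted from the transmitter EIRP, with the ground station G/T and Boltzmann's constant setting the noise floor. A minimal sketch (the numeric values below are invented for illustration, not taken from the paper):

```python
import math

BOLTZMANN_DBW = -228.6  # 10*log10(k), in dBW/(K*Hz)

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB, for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def cn0_dbhz(eirp_dbw, distance_km, freq_ghz, gt_dbk, atm_loss_db=0.0):
    """Carrier-to-noise-density ratio: C/N0 = EIRP - L_fs - L_atm + G/T - 10log10(k)."""
    return (eirp_dbw - fspl_db(distance_km, freq_ghz) - atm_loss_db
            + gt_dbk - BOLTZMANN_DBW)

# Illustrative Ka-band LEO direct-to-ground pass (made-up parameters).
cn0 = cn0_dbhz(eirp_dbw=20.0, distance_km=1000.0, freq_ghz=26.0,
               gt_dbk=30.0, atm_loss_db=3.0)
print(round(cn0, 1))  # ~ 94.9 dB-Hz
```

The achievable data rate then follows from C/N0 minus the Eb/N0 required by the chosen waveform and coding, which is where the paper's trade between relay and direct-to-ground links plays out.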

  14. National Synchrotron Light Source annual report 1991. Volume 1, October 1, 1990--September 30, 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulbert, S.L.; Lazarz, N.M.

    1992-04-01

    This report discusses the following research conducted at NSLS: atomic and molecular science; energy dispersive diffraction; lithography, microscopy and tomography; nuclear physics; UV photoemission and surface science; x-ray absorption spectroscopy; x-ray scattering and crystallography; x-ray topography; workshop on surface structure; workshop on electronic and chemical phenomena at surfaces; workshop on imaging; UV FEL machine reviews; VUV machine operations; VUV beamline operations; VUV storage ring parameters; x-ray machine operations; x-ray beamline operations; x-ray storage ring parameters; superconducting x-ray lithography source; SXLS storage ring parameters; the accelerator test facility; proposed UV-FEL user facility at the NSLS; global orbit feedback systems; and NSLS computer system.

  16. The Path from Large Earth Science Datasets to Information

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.

    2013-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the Science Mission Directorate's (SMD) major centers for archiving and distribution of Earth Science remote sensing data, products and services. This virtual portal provides convenient access to Atmospheric Composition and Dynamics, Hydrology, Precipitation, Ozone, and model derived datasets (generated by GSFC's Global Modeling and Assimilation Office), the North American Land Data Assimilation System (NLDAS) and the Global Land Data Assimilation System (GLDAS) data products (both generated by GSFC's Hydrological Sciences Branch). This presentation demonstrates various tools and computational technologies developed in the GES DISC to manage the huge volume of data and products acquired from various missions and programs over the years. It explores approaches to archive, document, distribute, access and analyze Earth Science data and information as well as addresses the technical and scientific issues, governance and user support problems faced by scientists in need of multi-disciplinary datasets. It also discusses data and product metrics, user distribution profiles and lessons learned through interactions with the science communities around the world. Finally it demonstrates some of the most used data and product visualization and analyses tools developed and maintained by the GES DISC.

  17. Addendum report to atmospheric science facility pallet-only mode space transportation system payload feasibility study, volume 3, revision A

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of accomplishing selected atmospheric science missions using a pallet-only mode was studied. Certain unresolved issues were identified. The first issue was that of assuring that the on-board computer facility was adequate to process scientific data, control subsystems such as instrument pointing, provide mission operational program capability, and accomplish display and control. The second issue evolved from an investigation of the availability of existing substitute instruments that could be used instead of the prime instrumentation where the development tests and schedules are incompatible with the realistic budgets and shuttle vehicle schedules. Some effort was expended on identifying candidate substitute instruments, and the performance, cost, and development schedule trade-offs found during that effort were significant enough to warrant a follow-on investigation. This addendum documents the results of that follow-on effort, as it applies to the Atmospheric Sciences Facility.

  18. Matter, Motion, and Man, Volume III.

    ERIC Educational Resources Information Center

    Montag, Betty Jo

    Volume Three of the three-volume experimental program in general science attempts to provide preparation for the new approaches in biology, chemistry, and physics and to give those who will not continue in science a realistic way of understanding themselves, the world, and the role of science in society. Chapters on embryology, the body systems,…

  19. Higher-order accurate space-time schemes for computational astrophysics—Part I: finite volume methods

    NASA Astrophysics Data System (ADS)

    Balsara, Dinshaw S.

    2017-12-01

    As computational astrophysics comes under pressure to become a precision science, there is an increasing need to move to high accuracy schemes for computational astrophysics. The algorithmic needs of computational astrophysics are indeed very special. The methods need to be robust and preserve the positivity of density and pressure. Relativistic flows should remain sub-luminal. These requirements place additional pressures on a computational astrophysics code, which are usually not felt by a traditional fluid dynamics code. Hence the need for a specialized review. The focus here is on weighted essentially non-oscillatory (WENO) schemes, discontinuous Galerkin (DG) schemes and PNPM schemes. WENO schemes are higher order extensions of traditional second order finite volume schemes. At third order, they are most similar to piecewise parabolic method schemes, which are also included. DG schemes evolve all the moments of the solution, with the result that they are more accurate than WENO schemes. PNPM schemes occupy a compromise position between WENO and DG schemes. They evolve an Nth order spatial polynomial, while reconstructing higher order terms up to Mth order. As a result, the timestep can be larger. Time-dependent astrophysical codes need to be accurate in space and time with the result that the spatial and temporal accuracies must be matched. This is realized with the help of strong stability preserving Runge-Kutta schemes and ADER (Arbitrary DERivative in space and time) schemes, both of which are also described. The emphasis of this review is on computer-implementable ideas, not necessarily on the underlying theory.
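
As a baseline for the higher-order methods surveyed above, even a minimal finite-volume scheme (first-order upwind fluxes advanced with the two-stage strong-stability-preserving Runge-Kutta method) already exhibits the properties the review emphasizes: conservation of cell averages and no creation of new extrema. A sketch for linear advection on a periodic grid:

```python
import numpy as np

def advect_fv(u, c, dx, dt, nsteps):
    """First-order upwind finite-volume update for u_t + c u_x = 0 (c > 0),
    advanced with two-stage SSP Runge-Kutta (Heun) time stepping on a
    periodic domain. Cell averages are updated from face fluxes, so the
    total 'mass' sum(u)*dx is conserved to round-off."""
    def rhs(v):
        flux = c * v  # upwind flux: cell i supplies the flux at its right face
        return -(flux - np.roll(flux, 1)) / dx
    for _ in range(nsteps):
        u1 = u + dt * rhs(u)                      # stage 1: forward Euler
        u = 0.5 * u + 0.5 * (u1 + dt * rhs(u1))   # stage 2: SSP convex average
    return u

n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = 1.0 / n
u0 = np.exp(-200.0 * (x - 0.5) ** 2)   # smooth pulse as initial cell averages
u = advect_fv(u0.copy(), c=1.0, dx=dx, dt=0.5 * dx, nsteps=50)
print(abs(u.sum() - u0.sum()) < 1e-10)  # mass conserved
```

WENO, DG and PNPM schemes replace the first-order flux and low-order time integrator here with high-order reconstructions and matched high-order time stepping, while retaining exactly this conservative flux-difference structure.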

  20. A Novel Method to Compute Breathing Volumes via Motion Capture Systems: Design and Experimental Trials.

    PubMed

    Massaroni, Carlo; Cassetta, Eugenio; Silvestri, Sergio

    2017-10-01

    Respiratory assessment can be carried out using motion capture systems. A geometrical model is mandatory in order to compute the breathing volume as a function of time from the markers' trajectories. This study describes a novel model to compute volume changes and calculate respiratory parameters using a motion capture system. The novel method, i.e., the prism-based method, computes the volume enclosed within the chest by defining 82 prisms from the 89 markers attached to the subject's chest. Volumes computed with this method are compared to spirometry volumes and to volumes computed by a conventional method based on tetrahedral decomposition of the chest wall, integrated in a commercial motion capture system. Eight healthy volunteers were enrolled and 30 seconds of quiet breathing data were collected from each of them. Results show a better agreement between volumes computed by the prism-based method and the spirometry (discrepancy of 2.23%, R² = 0.94) compared to the agreement between volumes computed by the conventional method and the spirometry (discrepancy of 3.56%, R² = 0.92). The proposed method also showed better performance in the calculation of respiratory parameters. Our findings open up prospects for the further use of the new method in breathing assessment via motion capture systems.
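
The prism idea can be illustrated with a toy computation: each surface triangle, projected onto a reference plane, defines a vertical prism whose volume is the projected base area times the mean vertex height. The geometry below is a made-up sanity case (a flat box top), not the paper's 82-prism chest model:

```python
def prism_volume(tri):
    """Volume of the vertical prism between a triangle of (x, y, z) vertices
    and the z = 0 reference plane: projected base area times mean height."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
    base = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    return base * (z1 + z2 + z3) / 3.0

def surface_volume(triangles):
    """Sum the prisms under a triangulated surface."""
    return sum(prism_volume(t) for t in triangles)

# Sanity check: a 1 x 1 x 0.5 box whose flat top at z = 0.5 is split into
# two triangles must enclose a volume of 0.5.
top = [
    [(0, 0, 0.5), (1, 0, 0.5), (1, 1, 0.5)],
    [(0, 0, 0.5), (1, 1, 0.5), (0, 1, 0.5)],
]
print(surface_volume(top))  # 0.5
```

Summing such prisms over a marker-derived triangulation, frame by frame, yields the chest volume as a function of time, from which tidal volume and breathing rate follow.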

  1. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies continue to grow rapidly, not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information that not only humans but also computers can semantically process at large scale. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.
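
At the core of the Semantic Web stack is the RDF triple model (subject, predicate, object), queried with SPARQL basic graph patterns. A self-contained sketch of the idea in plain Python (the resource names are invented; a real application would use an RDF library such as rdflib and actual ontology IRIs):

```python
# In-memory triple store with a SPARQL-like single-pattern match.
triples = {
    ("gene:BRCA1", "rdf:type",     "obo:Gene"),
    ("gene:BRCA1", "ex:locatedOn", "chr:17"),
    ("gene:TP53",  "rdf:type",     "obo:Gene"),
    ("gene:TP53",  "ex:locatedOn", "chr:17"),
    ("chr:17",     "rdf:type",     "obo:Chromosome"),
}

def query(pattern):
    """Match one (s, p, o) pattern; None plays the role of a SPARQL variable."""
    s, p, o = pattern
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# "Which resources are genes?" -- like SELECT ?s WHERE { ?s rdf:type obo:Gene }
genes = [s for s, _, _ in query((None, "rdf:type", "obo:Gene"))]
print(genes)  # ['gene:BRCA1', 'gene:TP53']
```

Because every data source is reduced to the same triple shape with shared vocabularies, queries can join across heterogeneous life-science datasets without format-specific parsers, which is the integration property the survey highlights.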

  2. Federated data storage system prototype for LHC experiments and data intensive science

    NASA Astrophysics Data System (ADS)

    Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.

    2017-10-01

    The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area face the task of uniting their resources for future productive work, while at the same time supporting large physics collaborations. In our project we address the fundamental problem of designing a computing architecture that integrates distributed storage resources for LHC experiments and other data-intensive science applications and provides access to data from heterogeneous computing facilities. Studies include the development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations, such as read/write/transfer, and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and changes in computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.

  3. Metacognition: computation, biology and function

    PubMed Central

    Fleming, Stephen M.; Dolan, Raymond J.; Frith, Christopher D.

    2012-01-01

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  4. Study for Teaching Behavioral Sciences in Schools of Medicine, Volume III: Behavioral Science Perspectives in Medical Education.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC. Medical Sociology Council.

    Volume III of a study of teaching behavioral sciences in medical school presents perspectives on medical behavioral science from the viewpoints of the several behavioral disciplines (anthropology, psychology, sociology, political science, economics, behavioral biology and medical education). In addition, there is a discussion of translating…

  5. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    PubMed

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Science in History, Volume 4: The Social Sciences, Conclusion.

    ERIC Educational Resources Information Center

    Bernal, J. D.

    This volume, the last of four, includes parts seven and eight of the eight parts in the series. Part Seven deals with the sciences of society which are described as the latest and most imperfect of the sciences. It is doubtful if, in their present form, they can be called sciences at all. The historical development of the social sciences is traced…

  7. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were less involved in computing activities whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  8. Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the International Nuclear Science and Technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV. Research Center (GRC) service. GRC is a single online web-based search service to well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  9. Interactive information processing for NASA's mesoscale analysis and space sensor program

    NASA Technical Reports Server (NTRS)

    Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.

    1985-01-01

    The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.

  10. Railroad Classification Yard Technology Manual: Volume II : Yard Computer Systems

    DOT National Transportation Integrated Search

    1981-08-01

    This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...

  11. Science Learning: Processes and Applications.

    ERIC Educational Resources Information Center

    Santa, Carol Minnick, Ed.; Alvermann, Donna E., Ed.

    Reflecting a collaboration in terms of content areas, levels, and audience, this volume represents the efforts of science teachers and reading teachers to understand and help one another fine tune their craft. Chapters in the volume include: (1) "Metacognition, Reading and Science Education" (Linda Baker); (2) "Science and Reading:…

  12. Digest of Key Science and Engineering Indicators, 2008. NSB-08-2

    ERIC Educational Resources Information Center

    National Science Foundation, 2008

    2008-01-01

    This digest of key science and engineering indicators draws primarily from the National Science Board's two-volume "Science and Engineering Indicators, 2008" report. The digest serves two purposes: (1) to draw attention to important trends and data points from across the chapters and volumes of "Science and Engineering Indicators, 2008," and (2)…

  13. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  14. Data stewardship - a fundamental part of the scientific method (Invited)

    NASA Astrophysics Data System (ADS)

    Foster, C.; Ross, J.; Wyborn, L. A.

    2013-12-01

    This paper emphasises the importance of data stewardship as a fundamental part of the scientific method, and the need to effect cultural change to ensure engagement by earth scientists. It is differentiated from the science of data stewardship per se. Earth System science generates vast quantities of data, and in the past, data analysis has been constrained by compute power, such that sub-sampling of data often provided the only way to reach an outcome. This is analogous to Kahneman's System 1 heuristic, with its simplistic and often erroneous outcomes. The development of HPC has liberated earth sciences such that the complexity and heterogeneity of natural systems can be utilised in modelling at any scale, global, or regional, or local; for example, movement of crustal fluids. Paradoxically, now that compute power is available, it is the stewardship of the data that is presenting the main challenges. There is a wide spectrum of issues: from effectively handling and accessing acquired data volumes [e.g. satellite feeds per day/hour]; through agreed taxonomy to effect machine to machine analyses; to idiosyncratic approaches by individual scientists. Except for the latter, most agree that data stewardship is essential. Indeed it is an essential part of the science workflow. As science struggles to engage and inform on issues of community importance, such as shale gas and fraccing, all parties must have equal access to data used for decision making; without that, there will be no social licence to operate or indeed access to additional science funding (Heidorn, 2008). The stewardship of scientific data is an essential part of the science process; but often it is regarded, wrongly, as entirely in the domain of data custodians or stewards. 
Geoscience Australia has developed a set of six principles that apply to all science activities within the agency: relevance to Government; collaborative science; quality science; transparent science; communicated science; and sustained science capability. Every principle includes data stewardship: this is to effect cultural change at both collective and individual levels to ensure that our science outcomes and technical advice are effective for the Government and community.

  15. Large-scale deep learning for robotically gathered imagery for science

    NASA Astrophysics Data System (ADS)

    Skinner, K.; Johnson-Roberson, M.; Li, J.; Iscar, E.

    2016-12-01

    With the explosion of computing power, the intelligence and capability of mobile robotics have dramatically increased over the last two decades. Today, we can deploy autonomous robots to carry out observations in a variety of environments ripe for scientific exploration. These platforms are capable of gathering a volume of data previously unimaginable. Additionally, optical cameras, driven by mobile phones and consumer photography, have rapidly improved in size, power consumption, and quality, making their deployment cheaper and easier. Finally, in parallel, we have seen the rise of large-scale machine learning approaches, particularly deep neural networks (DNNs), increasing the quality of the semantic understanding that can be automatically extracted from optical imagery. In concert, these advances enable new science using a combination of machine learning and robotics. This work will discuss the application of new low-cost high-performance computing approaches and the associated software frameworks to enable scientists to rapidly extract useful science data from millions of robotically gathered images. The automated analysis of imagery on this scale opens up new avenues of inquiry unavailable using more traditional manual or semi-automated approaches. We will use a large archive of millions of benthic images gathered with an autonomous underwater vehicle to demonstrate how these tools enable new scientific questions to be posed.

  16. Atmospheric and Space Sciences: Ionospheres and Plasma Environments

    NASA Astrophysics Data System (ADS)

    Yiǧit, Erdal

    2018-01-01

    The SpringerBriefs on Atmospheric and Space Sciences, in two volumes, present a concise and interdisciplinary introduction to the basic theory, observation, and modeling of atmospheric and ionospheric coupling processes on Earth. The goal is to contribute toward bridging the gap between meteorology, aeronomy, and planetary science. In addition, recent progress in several related research topics, such as atmospheric wave coupling and variability, is discussed. Volume 1 focuses on the atmosphere, while Volume 2 presents the ionospheres and plasma environments. Volume 2 is aimed primarily at (research) students and young researchers who would like to gain quick insight into the basics of space sciences and current research. In combination with the first volume, it is also a useful tool for professors who would like to develop a course in atmospheric and space physics.

  17. U.S., Soviets Face Common Science Problems.

    ERIC Educational Resources Information Center

    Lepkowski, Wil

    1981-01-01

    Summarizes recent findings reported in a two-volume publication, "Science Policy: USA/USSR," issued by the National Science Foundation. Volumes I and II review U.S. and Soviet science policy in research and development, respectively. Comparisons are made concerning common problems around energy, environment, and the meaning of security.…

  18. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the voluminous data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise in processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud?
We will also present some of our findings from applying machine learning and data analytics to the processed SAR data streams, as well as lessons learned on how to ease the SAR community into interfacing with these cloud-based SAR science data systems.

  19. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects their ability to succeed in CS-I.

  20. Salary-Trend Studies of Faculty of the Years 1988-89 and 1991-92 in the Following Academic Disciplines/Major Fields: Accounting; Agribusiness and Agriproduction; Anthropology; Area and Ethnic Studies; Business Administration and Management; Business and Management; Business Economics; Chemistry; Communication Technologies; Communications; Computer and Information Sciences; Dramatic Arts; Drawing; Education; and Engineering.

    ERIC Educational Resources Information Center

    Howe, Richard D.; And Others

    This volume provides comparative data for faculty salaries in public and private colleges, based on an annual survey of over 600 colleges and universities. Data cover the following disciplines: Accounting, Agribusiness and Agriproduction, Anthropology, Area and Ethnic Studies, Business Administration and Management, Business and Management,…

  1. Modeling of scattering from ice surfaces

    NASA Astrophysics Data System (ADS)

    Dahlberg, Michael Ross

    Theoretical research is proposed to study electromagnetic wave scattering from ice surfaces. A mathematical formulation that is more representative of the electromagnetic scattering from ice, with volume mechanisms included, and capable of handling multiple scattering effects is developed. This research is essential to advancing the field of environmental science and engineering by enabling more accurate inversion of remote sensing data. The results of this research contributed towards a more accurate representation of the scattering from ice surfaces, that is computationally more efficient and that can be applied to many remote-sensing applications.

  2. System of Programmed Modules for Measuring Photographs with a Gamma-Telescope

    NASA Technical Reports Server (NTRS)

    Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.

    1978-01-01

    Physical experiments using tracking cameras have produced hundreds of thousands of stereo photographs of events. Processing such a large volume of information requires automatic and semiautomatic measuring systems. At the Institute of Space Research of the Academy of Sciences of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector connected on-line to an Elektronika 1001 minicomputer. The report describes this system and discusses the various computer programs available to the operators.

  3. Generation of intra-oral-like images from cone beam computed tomography volumes for dental forensic image comparison.

    PubMed

    Trochesset, Denise A; Serchuk, Richard B; Colosi, Dan C

    2014-03-01

    Identification of unknown individuals using dental comparison is well established in the forensic setting. The identification technique can be time- and resource-consuming if many individuals need to be identified at once. Medical CT (MDCT) for dental profiling has had limited success, mostly due to artifact from metal-containing dental restorations and implants. The authors describe a CBCT reformatting technique that creates images which closely approximate conventional dental images. Using an i-CAT Platinum CBCT unit and standard-issue i-CAT Vision software, a protocol was developed to reproducibly and reliably reformat CBCT volumes. The reformatted images are presented with conventional digital images from the same anatomic area for comparison. The authors conclude that images derived from CBCT volumes following this protocol are similar enough to conventional dental radiographs to allow for dental forensic comparison/identification and that CBCT offers a superior option over MDCT for this purpose. © 2013 American Academy of Forensic Sciences.

  4. Final report: Prototyping a combustion corridor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.; Leach, Joshua

    2001-12-15

    The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratory, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations so we were able to provide real-world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high bandwidth based DOE National Laboratory connections to universities.

  5. Analytical techniques for retrieval of atmospheric composition with the quadrupole mass spectrometer of the Sample Analysis at Mars instrument suite on Mars Science Laboratory

    NASA Astrophysics Data System (ADS)

    Franz, Heather B.; Trainer, Melissa G.; Wong, Michael H.; Manning, Heidi L. K.; Stern, Jennifer C.; Mahaffy, Paul R.; Atreya, Sushil K.; Benna, Mehdi; Conrad, Pamela G.; Harpold, Dan N.; Leshin, Laurie A.; Malespin, Charles A.; McKay, Christopher P.; Nolan, J. Thomas; Raaen, Eric

    2014-06-01

    The Sample Analysis at Mars (SAM) instrument suite is the largest scientific payload on the Mars Science Laboratory (MSL) Curiosity rover, which landed in Mars' Gale Crater in August 2012. As a miniature geochemical laboratory, SAM is well-equipped to address multiple aspects of MSL's primary science goal, characterizing the potential past or present habitability of Gale Crater. Atmospheric measurements support this goal through compositional investigations relevant to martian climate evolution. SAM instruments include a quadrupole mass spectrometer, a tunable laser spectrometer, and a gas chromatograph that are used to analyze martian atmospheric gases as well as volatiles released by pyrolysis of solid surface materials (Mahaffy et al., 2012). This report presents analytical methods for retrieving the chemical and isotopic composition of Mars' atmosphere from measurements obtained with SAM's quadrupole mass spectrometer. It provides empirical calibration constants for computing volume mixing ratios of the most abundant atmospheric species and analytical functions to correct for instrument artifacts and to characterize measurement uncertainties. Finally, we discuss differences in volume mixing ratios of the martian atmosphere as determined by SAM (Mahaffy et al., 2013) and Viking (Owen et al., 1977; Oyama and Berdahl, 1977) from an analytical perspective. Although the focus of this paper is atmospheric observations, much of the material concerning corrections for instrumental effects also applies to reduction of data acquired with SAM from analysis of solid samples. The Sample Analysis at Mars (SAM) instrument measures the composition of the martian atmosphere. Rigorous calibration of SAM's mass spectrometer was performed with relevant gas mixtures. Calibration included derivation of a new model to correct for electron multiplier effects. Volume mixing ratios for Ar and N2 obtained with SAM differ from those obtained with Viking. 
Differences between SAM and Viking volume mixing ratios are under investigation.
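
    The retrieval step the SAM abstract describes, applying empirical calibration constants to raw quadrupole mass spectrometer signals to obtain volume mixing ratios, can be sketched as a sensitivity-corrected normalization. The signal counts, sensitivity constants, and function name below are illustrative assumptions, not actual SAM calibration values.

```python
# Hedged sketch: converting raw mass spectrometer signals into volume mixing
# ratios with per-species calibration (sensitivity) constants. All numbers
# below are illustrative, not SAM calibration data.

signals = {"CO2": 9.5e5, "N2": 2.8e4, "Ar": 2.6e4, "O2": 1.7e3}  # counts/s (hypothetical)
sensitivity = {"CO2": 1.00, "N2": 1.10, "Ar": 0.95, "O2": 1.05}  # relative response (hypothetical)

def volume_mixing_ratios(signals, sensitivity):
    """Correct each signal by its sensitivity, then normalize so the ratios sum to 1."""
    corrected = {gas: s / sensitivity[gas] for gas, s in signals.items()}
    total = sum(corrected.values())
    return {gas: v / total for gas, v in corrected.items()}

vmr = volume_mixing_ratios(signals, sensitivity)
print({gas: round(v, 4) for gas, v in vmr.items()})  # ratios sum to 1 up to rounding
```

    Corrections for instrument artifacts (e.g., the electron multiplier model mentioned in the abstract) would enter as additional per-species terms before the normalization.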

  6. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  7. Pre-College Science Curriculum Activities of the National Science Foundation. Report of Science Curriculum Review Team, Volume II-Appendix.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    Presented is a detailed study of National Science Foundation (NSF) programs in pre-college science education. The development of policies and operational procedures are traced over the past quarter century and their impact on management practice analyzed. The report is presented in two parts: Volume 1, the findings and recommendations, and Volume…

  8. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here to the equations describing elastic wave propagation) with carefully chosen elementary ingredients of simulation technologies (e.g., finite-differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, and the finite-volume and discontinuous Galerkin methods. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
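
    The elementary ingredients named above can be illustrated with a minimal 1-D acoustic wave finite-difference scheme of the kind such notebooks teach. The grid size, time step, and wave speed below are illustrative choices, not values from the seismo-live notebooks.

```python
# Minimal 1-D acoustic wave finite-difference sketch; parameters are illustrative.
import math

nx, dx, dt, c = 200, 1.0, 0.004, 100.0  # grid points, spacing [m], time step [s], speed [m/s]
nt = 300                                # number of time steps
cfl = c * dt / dx                       # CFL number; the explicit scheme needs cfl <= 1

# Initial condition: a narrow Gaussian pulse at the centre, zero initial velocity.
p = [math.exp(-0.1 * (i - nx // 2) ** 2) for i in range(nx)]
p_old = p[:]

for _ in range(nt):
    p_new = [0.0] * nx  # fixed (zero) boundaries
    for i in range(1, nx - 1):
        # Second-order centred differences in both space and time.
        p_new[i] = 2 * p[i] - p_old[i] + cfl ** 2 * (p[i + 1] - 2 * p[i] + p[i - 1])
    p_old, p = p, p_new

print("CFL:", cfl, "max |p| after", nt, "steps:", max(abs(v) for v in p))
```

    With cfl = 0.4 the scheme is stable, so the pulse splits, propagates, and reflects without growing; choosing dt so that cfl > 1 makes the amplitude blow up, which is exactly the kind of numerical trap the training aims to expose.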

  9. 75 FR 63724 - Raisins Produced From Grapes Grown in California; Use of Estimated Trade Demand To Compute Volume...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... figure to compute volume regulation percentages for 2010- 11 crop Natural (sun-dried) Seedless (NS... compute volume regulation percentages for 2010-11 crop Natural (sun-dried) Seedless (NS) raisins covered...

  10. Light scattering by planetary-regolith analog samples: computational results

    NASA Astrophysics Data System (ADS)

    Väisänen, Timo; Markkanen, Johannes; Hadamcik, Edith; Levasseur-Regourd, Anny-Chantal; Lasue, Jeremie; Blum, Jürgen; Penttilä, Antti; Muinonen, Karri

    2017-04-01

    We compute light scattering by a planetary-regolith analog surface. The corresponding experimental work is from Hadamcik et al. [1] with the PROGRA2-surf [2] device measuring the polarization of dust particles. The analog samples are low density (volume fraction 0.15 ± 0.03) agglomerates produced by random ballistic deposition of almost equisized silica spheres (refractive index n=1.5 and diameter 1.45 ± 0.06 µm). Computations are carried out with the recently developed codes entitled Radiative Transfer with Reciprocal Transactions (R2T2) and Radiative Transfer Coherent Backscattering with incoherent interactions (RT-CB-ic). Both codes incorporate the so-called incoherent treatment which enhances the applicability of the radiative transfer as shown by Muinonen et al. [3]. As a preliminary result, we have computed scattering from a large spherical medium with the RT-CB-ic using equal-sized particles with diameters of 1.45 microns. The preliminary results have shown that the qualitative characteristics are similar for the computed and measured intensity and polarization curves but that there are still deviations between the characteristics. We plan to remove the deviations by incorporating a size distribution of particles (1.45 ± 0.02 microns) and detailed information about the volume density profile within the analog surface. Acknowledgments: We acknowledge the ERC Advanced Grant no. 320773 entitled Scattering and Absorption of Electromagnetic Waves in Particulate Media (SAEMPL). Computational resources were provided by CSC - IT Centre for Science Ltd, Finland. References: [1] Hadamcik E. et al. (2007), JQSRT, 106, 74-89 [2] Levasseur-Regourd A.C. et al. (2015), Polarimetry of stars and planetary systems, CUP, 61-80 [3] Muinonen K. et al. (2016), extended abstract for EMTS.

  11. A Big Data Approach to Analyzing Market Volatility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Bethel, E. Wes; Gu, Ming

    2013-06-05

    Understanding the microstructure of the financial market requires the processing of a vast amount of data related to individual trades, and sometimes even multiple levels of quotes. Analyzing such a large volume of data requires tremendous computing power that is not easily available to financial academics and regulators. Fortunately, publicly funded High Performance Computing (HPC) power is widely available at the National Laboratories in the US. In this paper we demonstrate that HPC resources and the techniques of data-intensive science can be used to greatly accelerate the computation of an early warning indicator called the Volume-synchronized Probability of Informed Trading (VPIN). The test data used in this study contain five and a half years' worth of trading data for about 100 of the most liquid futures contracts, include about 3 billion trades, and take 140 GB as text files. By using (1) a more efficient file format for storing the trading records, (2) more effective data structures and algorithms, and (3) parallelized computations, we are able to explore 16,000 different ways of computing VPIN in less than 20 hours on a 32-core IBM DataPlex machine. Our test demonstrates that a modest computer is sufficient to monitor a vast number of trading activities in real time, an ability that could be valuable to regulators. Our test results also confirm that VPIN is a strong predictor of liquidity-induced volatility. With appropriate parameter choices, the false positive rates are about 7% averaged over all the futures contracts in the test data set. More specifically, when VPIN values rise above a threshold (CDF > 0.99), the volatility in the subsequent time windows is higher than the average in 93% of the cases.
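    The VPIN indicator that the paper accelerates lends itself to a compact illustration: trades are grouped into equal-volume buckets, and the buy/sell order-flow imbalance is averaged over a rolling window of buckets. The sketch below is a hypothetical toy version using simple tick-rule classification (the paper explores many variants); it is not the authors' optimized implementation:

```python
from collections import deque

def vpin(trades, bucket_volume, window):
    """Toy VPIN: trades is a sequence of (price, volume) tuples in time order.
    Volume is classified as buy or sell with the simple tick rule; this is
    one illustrative choice among the variants the paper evaluates."""
    buckets = deque(maxlen=window)      # recent |buy - sell| imbalances
    buy = sell = filled = 0.0
    last_price = None
    direction = 1                       # assume the first trade is a buy
    for price, vol in trades:
        if last_price is not None and price != last_price:
            direction = 1 if price > last_price else -1  # tick rule
        last_price = price
        remaining = vol
        while remaining > 0:
            take = min(remaining, bucket_volume - filled)
            if direction > 0:
                buy += take
            else:
                sell += take
            filled += take
            remaining -= take
            if filled >= bucket_volume:  # bucket complete: record imbalance
                buckets.append(abs(buy - sell))
                buy = sell = filled = 0.0
    if not buckets:
        return None
    return sum(buckets) / (len(buckets) * bucket_volume)
```

    The "16,000 different ways of computing VPIN" in the study correspond to sweeping parameters such as the bucket volume, the window length, and the classification rule.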

  12. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  13. Computer Program (HEVSIM) for Heavy Duty Vehicle Fuel Economy and Performance Simulation. Volume III.

    DOT National Transportation Integrated Search

    1981-09-01

    Volume III is the third and last volume of a three-volume document describing the computer program HEVSIM. This volume includes appendices which list the HEVSIM program, sample part data, some typical outputs and updated nomenclature.

  14. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell.
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.

  15. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  16. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  17. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    NASA Astrophysics Data System (ADS)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have conducted for TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing and pointing block parameters calculated from observation geometry. This approach is an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software helps us handle the high density of planned orbits with an increasing volume of scientific data and successfully meet opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs with the flight dynamics products to generate the Pointing Requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.
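    The kind of resource-envelope check described above can be sketched in a few lines. The function name, units, and constraint model below are invented for illustration; they are not taken from the TGO planning tools:

```python
# Hypothetical sketch of a data-volume envelope check: accumulate the data
# volume of each planned observation, drain onboard storage by a fixed
# downlink capacity between orbits, and flag storage-limit violations.
def check_data_volume(observations, downlink_per_orbit, storage_limit):
    """observations: list of (orbit_number, data_volume_Gb), sorted by orbit."""
    stored = 0.0
    violations = []
    current_orbit = None
    for orbit, volume in observations:
        if current_orbit is not None and orbit != current_orbit:
            # downlink between orbits drains the onboard store
            stored = max(0.0, stored - downlink_per_orbit * (orbit - current_orbit))
        current_orbit = orbit
        stored += volume
        if stored > storage_limit:
            violations.append((orbit, stored))
    return violations
```

    A real planning tool would evaluate several such envelopes (data rate, power, data volume) per timeline before the commanding products are released to the MOC.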

  18. 29 CFR 794.123 - Method of computing annual volume of sales.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Method of computing annual volume of sales. 794.123 Section... of Sales § 794.123 Method of computing annual volume of sales. (a) Where the enterprise, during the... gross volume of sales in excess of the amount specified in the statute, it is plain that its annual...

  19. A Financial Technology Entrepreneurship Program for Computer Science Students

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  20. Will our Current Data Rescue, Curation and Preservation Practices bring us out of the Digital Dark Ages and into the Renaissance of Multi-Source Science? (Invited)

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2013-12-01

    The emergence of the fourth paradigm of data intensive science in 2007 showed great promise: it offered a new fundamental methodology in scientific exploration in which researchers would be able to harness the huge increase in data volumes coming from new and more powerful instruments that were collecting data at unprecedented rates and at ever increasing resolutions. Given the potential this new methodology offered, decadal challenges were issued to the Earth and Space Science community to come together and work on problems such as impacts of climate change; sustainably exploiting scarce water, mineral and petroleum resources; and protecting our communities through better prediction of the behaviour of natural hazards. Such challenges require the capability to integrate heterogeneous data sets, from multiple sources, across multiple domains and at low transactional cost. To help realise these visions, significant investments were made globally in cyberinfrastructures (computer centres, research clouds, data stores, high speed networks, etc.). Combined, these infrastructures are now capable of analysing petabyte size chunks of data, and the climate community is close to operating at exascale. But have we actually realised the vision of data intensive science? The simple reality is that data intensive science requires the capability to find and analyse large volumes of data in real time via machine to machine interactions. It is not necessarily just about 'Big Data' sets collected from remote instruments such as satellites or sensor networks. 'Long Tail' data sets, traditionally the output of small science campaigns, are vital to calibrating large data sets and need to be stored so that they can be reused and repurposed in ways beyond what the original collector of the data intended they be used for. 
Particularly for meaningful time series analysis in environmental sciences, there is the additional challenge of storing and managing data through decades of multiple evolutions of both hardware and software. The move to data intensive science has driven the realisation that we need to put more effort and resources into rescuing, curating and preserving data, and properly preserved data sets are now being used to resolve the real world issues of today. However, as the capacity of computational systems increases relentlessly, we need to question whether our current efforts in data curation and preservation will scale to these ever growing systems. For Earth and Space Sciences to come out of the digital dark ages and into the renaissance of multi-source science, it is time to take stock and question our current data rescue, curation and preservation initiatives. Will the data store I am using be around in 50 years' time? What measures is this data store taking to avoid bit-rot and/or deal with software and hardware obsolescence? Is my data self-describing? Have I paid enough attention to cross domain data standards so my data can be reused and repurposed for the current decadal challenges? More importantly, as the capacity of computational systems scales beyond exascale to zettascale and yottascale, will the data sets that I have rescued, curated and preserved in my lifetime, no matter whether they are small or large, be able to contribute to addressing the decadal challenges that are as yet undefined?

  1. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    ERIC Educational Resources Information Center

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  2. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thickness. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number of segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was 1/2 to 2/3 that of semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method.
Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
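    The underlying volumetry principle, counting voxels inside a lobe mask and those below the −950 HU emphysema threshold and then scaling by the voxel size, can be sketched as follows. This is a minimal illustration of the principle only, not the computer-aided diagnosis software evaluated in the study:

```python
import numpy as np

# Minimal sketch: lobe volume = masked voxel count * voxel volume;
# emphysematous volume = masked voxels below -950 HU * voxel volume.
def lobar_volumes(ct_hu, lobe_mask, voxel_volume_mm3):
    """ct_hu: 3D array of Hounsfield units; lobe_mask: boolean array of the
    same shape selecting one lobe; returns (lobe_ml, emphysema_ml)."""
    lobe_voxels = int(lobe_mask.sum())
    emphysema_voxels = int((ct_hu[lobe_mask] < -950).sum())
    lobe_ml = lobe_voxels * voxel_volume_mm3 / 1000.0       # 1 ml = 1000 mm^3
    emphysema_ml = emphysema_voxels * voxel_volume_mm3 / 1000.0
    return lobe_ml, emphysema_ml
```

    The hard part in practice, and what the compared methods differ on, is producing the lobe mask itself (slice-by-slice tracing versus segment counting versus automatic segmentation).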

  3. Three dimensional adaptive mesh refinement on a spherical shell for atmospheric models with lagrangian coordinates

    NASA Astrophysics Data System (ADS)

    Penner, Joyce E.; Andronova, Natalia; Oehmke, Robert C.; Brown, Jonathan; Stout, Quentin F.; Jablonowski, Christiane; van Leer, Bram; Powell, Kenneth G.; Herzog, Michael

    2007-07-01

    One of the most important advances needed in global climate models is the development of atmospheric General Circulation Models (GCMs) that can reliably treat convection. Such GCMs require high resolution in local convectively active regions, both in the horizontal and vertical directions. During previous research we have developed an Adaptive Mesh Refinement (AMR) dynamical core that can adapt its grid resolution horizontally. Our approach utilizes a finite volume numerical representation of the partial differential equations with floating Lagrangian vertical coordinates and requires resolving dynamical processes on small spatial scales. For the latter it uses a newly developed general-purpose library, which facilitates 3D block-structured AMR on spherical grids. The library manages neighbor information as the blocks adapt, and handles the parallel communication and load balancing, freeing the user to concentrate on the scientific modeling aspects of their code. In particular, this library defines and manages adaptive blocks on the sphere, provides user interfaces for interpolation routines and supports the communication and load-balancing aspects for parallel applications. We have successfully tested the library in a 2-D (longitude-latitude) implementation. During the past year, we have extended the library to treat adaptive mesh refinement in the vertical direction. Preliminary results are discussed. This research project is characterized by an interdisciplinary approach involving atmospheric science, computer science and mathematical/numerical aspects. The work is done in close collaboration between the Atmospheric Science, Computer Science and Aerospace Engineering Departments at the University of Michigan and NOAA GFDL.

  4. [Measurement of intracranial hematoma volume by personal computer].

    PubMed

    DU, Wanping; Tan, Lihua; Zhai, Ning; Zhou, Shunke; Wang, Rui; Xue, Gongshi; Xiao, An

    2011-01-01

    To explore a method for intracranial hematoma volume measurement using a personal computer. Forty cases of various intracranial hematomas were measured by computed tomography with quantitative software and by a personal computer with Photoshop CS3 software, respectively. The data from the 2 methods were analyzed and compared. There was no difference between the data from the computed tomography and the personal computer (P>0.05). A personal computer with Photoshop CS3 software can measure the volume of various intracranial hematomas precisely, rapidly and simply. It should be recommended in clinical medicolegal identification.
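    The slice-summation principle behind such pixel-count measurements is simple: the hematoma area traced on each CT slice is converted to mm^2 via the pixel size and multiplied by the slice thickness. The sketch below is an illustrative reconstruction of that principle, not the software used in the study:

```python
# Slice-summation volumetry: traced pixel counts per slice are scaled by
# the pixel area and slice thickness, then summed and converted to ml.
def hematoma_volume_ml(pixel_counts, pixel_area_mm2, slice_thickness_mm):
    """pixel_counts: traced hematoma pixels on each consecutive CT slice."""
    total_mm3 = sum(n * pixel_area_mm2 * slice_thickness_mm
                    for n in pixel_counts)
    return total_mm3 / 1000.0  # 1 ml = 1000 mm^3
```

    For example, three 5-mm slices with 1000, 2000 and 1000 traced pixels at a 0.25 mm^2 pixel area give a 5.0 ml hematoma.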

  5. Physics through the 1990s: Scientific interfaces and technological applications

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.

  6. Light and dark matter in the universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This simulation follows the growth of density perturbations in both gas and dark matter components in a volume 1 billion light years on a side, beginning shortly after the Big Bang and evolved to half the present age of the universe. It calculates the gravitational clumping of intergalactic gas and dark matter modeled using a computational grid of 64 billion cells and 64 billion dark matter particles. The simulation uses a computational grid of 4096^3 cells and took over 4,000,000 CPU hours to complete. Read more: http://www.anl.gov/Media_Center/News/2010/news100104.html. Credits: Science: Michael L. Norman, Robert Harkness, Pascal Paschos and Rick Wagner. Visualization: Mark Herald, Joseph A. Insley, Eric C. Olson and Michael E. Papka

  7. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    DOT National Transportation Integrated Search

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  8. Computer Program (HEVSIM) for Heavy Duty Vehicle Fuel Economy and Performance Simulation. Volume III: Appendices A through F

    DOT National Transportation Integrated Search

    1981-09-01

    Volume III is the third and last volume of a three-volume document describing the computer program HEVSIM. This volume includes appendices which list the HEVSIM program, sample part data, some typical outputs and updated nomenclature.

  9. Computer Science | Classification | College of Engineering & Applied

    Science.gov Websites

    Adrian Dumitrescu, Ph.D., Professor, Computer Science, (414) 229-4265, Eng & Math Sciences 919; Hossein Hosseini, Ph.D., Professor, Computer Science, (414) 229-5184, hosseini@uwm.edu, Eng & Math Sciences 1091; Amol Mali, Ph.D., Associate Professor, Computer

  10. Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?

    ERIC Educational Resources Information Center

    Schrock, John Richard

    1984-01-01

    Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…

  11. Macmillan Encyclopedia of Chemistry (edited by Joseph J. Lagowski)

    NASA Astrophysics Data System (ADS)

    Kauffman, George B.

    1998-11-01

    Macmillan: New York, 1997. Four volumes. Figs., tables. lxxi + 1696 pp. 22.0 x 28.5 cm. $400. ISBN 0-02-897225-2. This latest addition to Macmillan's series of comprehensive core science encyclopedias (previous sets dealt with physics and earth sciences) will be of particular interest to readers of this Journal, for it is edited by longtime Journal of Chemical Education editor Joe Lagowski, assisted by a board of five distinguished associate editors. The attractively priced set offers clear explanations of the phenomena and concepts of chemistry and its materials, whether found in industry, the laboratory, or the natural world. It is intended for a broad spectrum of readers-professionals whose work draws on chemical concepts and knowledge (e.g., material scientists, engineers, health workers, biotechnologists, mathematicians, and computer programmers), science teachers at all levels from kindergarten to high school, high school and college students interested in medicine or the sciences, college and university professors, and laypersons desiring information on practical aspects of chemistry (e.g., household cleaning products, food and food additives, manufactured materials, herbicides, the human body, sweeteners, and animal communication).

  12. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and far exceed the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, software interconnected systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, system software stacks and libraries, and the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.

  13. Unraveling the Complexities of Life Sciences Data.

    PubMed

    Higdon, Roger; Haynes, Winston; Stanberry, Larissa; Stewart, Elizabeth; Yandl, Gregory; Howard, Chris; Broomall, William; Kolker, Natali; Kolker, Eugene

    2013-03-01

The life sciences have entered into the realm of big data and data-enabled science, where data can either empower or overwhelm. These data bring the challenges of the 5 Vs of big data: volume, veracity, velocity, variety, and value. Both independently and through our involvement with DELSA Global (Data-Enabled Life Sciences Alliance, DELSAglobal.org), the Kolker Lab (kolkerlab.org) is creating partnerships that identify data challenges and solve community needs. We specialize in solutions to complex biological data challenges, as exemplified by the community resource of MOPED (Model Organism Protein Expression Database, MOPED.proteinspire.org) and the analysis pipeline of SPIRE (Systematic Protein Investigative Research Environment, PROTEINSPIRE.org). Our collaborative work extends into the computationally intensive tasks of analysis and visualization of millions of protein sequences through innovative implementations of sequence alignment algorithms and creation of the Protein Sequence Universe tool (PSU). Pushing into the future together with our collaborators, our lab is pursuing integration of multi-omics data and exploration of biological pathways, as well as assigning function to proteins and porting solutions to the cloud. Big data have come to the life sciences; discovering the knowledge in the data will bring breakthroughs and benefits.

  14. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  15. 78 FR 10180 - Annual Computational Science Symposium; Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...

  16. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity

    PubMed Central

    Liang, Jie; Qian, Hong

    2010-01-01

Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady-state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297
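
The Gillespie algorithm mentioned above samples trajectories of the CME directly: it draws an exponentially distributed waiting time from the total propensity, then selects which reaction fires in proportion to its propensity. A minimal sketch of the direct method follows; the function name, the state representation, and the birth-death usage example are illustrative assumptions, not taken from the paper.

```python
import math
import random

def gillespie_ssa(x, propensities, stoich, t_end):
    """Simulate one stochastic trajectory of a CME model (Gillespie's
    direct method).

    x            -- initial species counts, e.g. [0]
    propensities -- function: state -> list of reaction propensities a_j
    stoich       -- one state-change vector per reaction, e.g. [[1], [-1]]
    t_end        -- stop once simulated time passes this value
    Returns a list of (time, state) pairs.
    """
    t, traj = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = propensities(x)
        a0 = sum(a)
        if a0 == 0.0:                 # no reaction can fire: absorbing state
            break
        # Exponential waiting time with rate a0 (1 - u avoids log(0)).
        t += -math.log(1.0 - random.random()) / a0
        # Pick reaction j with probability a_j / a0.
        r, j, acc = random.random() * a0, 0, a[0]
        while acc < r:
            j += 1
            acc += a[j]
        x = [xi + s for xi, s in zip(x, stoich[j])]
        traj.append((t, list(x)))
    return traj
```

For a simple birth-death process (production at rate 5.0, degradation at rate 0.1 per molecule), a call would look like `gillespie_ssa([0], lambda s: [5.0, 0.1 * s[0]], [[1], [-1]], 20.0)`.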

  17. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    PubMed

    Liang, Jie; Qian, Hong

    2010-01-01

Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady-state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  19. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  20. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  1. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  2. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  3. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  4. High school computer science education paves the way for higher education: the Israeli case

    NASA Astrophysics Data System (ADS)

    Armoni, Michal; Gal-Ezer, Judith

    2014-07-01

    The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.

  5. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  6. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects.
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design, and simulation of human-system interactions in a desktop-sized work volume.

  7. Views from the Field on Needs in Precollege Science, Mathematics, and Social Studies Education.

    ERIC Educational Resources Information Center

    Buccino, Alphonse; Evans, Paul L.

    1981-01-01

    Summarizes the findings reported in the final volume of an eight-volume series sponsored by the National Science Foundation that deal with the key elements responsible for influencing and shaping science, mathematics, and social studies education at the precollege level. (CS)

  8. Nicholas Brunhart-Lupo | NREL

    Science.gov Websites

    Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines. Nicholas Brunhart-Lupo, Computational Science, Nicholas.Brunhart-Lupo@nrel.gov

  9. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  10. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  11. Computer Program (HEVSIM) for Heavy Duty Vehicle Fuel Economy and Performance Simulation. Volume II: Users' Manual.

    DOT National Transportation Integrated Search

    1981-09-01

    Volume II is the second volume of a three-volume document describing the computer program HEVSIM for use with buses and heavy-duty trucks. This volume is a users' manual describing how to prepare data input and execute the program. A strong effort ha...

  12. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one in a master's degree program and two in doctoral degree programs).

  13. Curricular Influences on Female Afterschool Facilitators' Computer Science Interests and Career Choices

    NASA Astrophysics Data System (ADS)

    Koch, Melissa; Gorges, Torie

    2016-10-01

    Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.

  14. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, Space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  15. CSUNSat-1 CubeSat – ELaNa XVII

    NASA Image and Video Library

    2017-04-04

    The primary mission of CSUNSat1 is to space-test an innovative low-temperature-capable energy storage system developed by the Jet Propulsion Laboratory, raising its Technology Readiness Level (TRL) from 4-5 to 7. The success of this energy storage system will enable future missions, especially those in deep space, to do more science while requiring less energy, mass, and volume. This CubeSat was designed, built, programmed, and tested by a team of over 70 engineering and computer science students at CSUN. The primary source of funding for CSUNSat1 comes from NASA's Smallsat Technology Partnership program. Launched by NASA's CubeSat Launch Initiative on the NET April 18, 2017 ELaNa XVII mission on the seventh Orbital ATK Cygnus Commercial Resupply Services flight (OA-7) to the International Space Station and deployed on tbd.

  16. Geomorphological activity at a rock glacier front detected with a 3D density-based clustering algorithm

    NASA Astrophysics Data System (ADS)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2017-02-01

    Acquisition of high-density point clouds using terrestrial laser scanners (TLSs) has become commonplace in geomorphic science. The derived point clouds are often interpolated onto regular grids and the grids compared to detect change (i.e. erosion and deposition/advancement movements). This procedure is necessary for some applications (e.g. digital terrain analysis), but it inevitably leads to a certain loss of potentially valuable information contained within the point clouds. In the present study, an alternative methodology for geomorphological analysis and feature detection from point clouds is proposed. It rests on the use of the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm, applied to TLS data for a rock glacier front slope in the Swiss Alps. The proposed method allowed the detection and isolation of movements directly from the point clouds, yielding accuracies in the subsequent computation of volumes that depend only on the actual registered distance between points. We demonstrated that these values are more conservative than volumes computed with the traditional DEM comparison. The results are illustrated for the summer of 2015, a season of enhanced geomorphic activity associated with exceptionally high temperatures.
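
DBSCAN, as used above, groups points that have at least a minimum number of neighbors within a fixed radius and labels everything else as noise, which is what lets moving features be isolated directly in the point cloud rather than on an interpolated grid. A minimal pure-Python sketch follows; the function name and the toy 3D points are hypothetical, and a real TLS workflow would use an optimized implementation with a spatial index rather than this brute-force neighbor search.

```python
from collections import deque

def dbscan(points, eps, min_pts):
    """Label each 3D point with a cluster id (0, 1, ...) or -1 for noise.

    A point is a "core" point if it has at least min_pts neighbors
    (including itself) within radius eps; clusters grow outward from
    core points, and unreachable points remain noise.
    """
    def neighbors(i):
        px, py, pz = points[i]
        return [j for j, (x, y, z) in enumerate(points)
                if (x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nb = neighbors(i)
        if len(nb) < min_pts:
            labels[i] = -1            # noise (may later become a border point)
            continue
        labels[i] = cluster           # seed a new cluster from this core point
        queue = deque(nb)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cluster   # border point: joins cluster, not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors(j)) >= min_pts:   # j is a core point: keep expanding
                queue.extend(neighbors(j))
        cluster += 1
    return labels
```

Once points are clustered, per-cluster volumes can be derived from the member points themselves (e.g. via convex hulls), which is what makes the resulting accuracies depend only on the registered point spacing.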

  17. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  18. Computational Science News | Computational Science | NREL

    Science.gov Websites

    -Cooled High-Performance Computing Technology at the ESIF (February 28, 2018). NREL Launches New Website for High-Performance Computing System Users: The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) systems.

  19. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems, and Problems in Applied and Computational Matrix Theory

    DTIC Science & Technology

    1988-07-08

    Marcus and C. Baczynski), Computer Science Press, Rockville, Maryland, 1986. An Introduction to Pascal and Precalculus, Computer Science Press, Rockville, Maryland, 1986.

  20. Empirical Determination of Competence Areas to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia

    2014-01-01

    The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…

  1. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  2. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  3. Novel imaging analysis system to measure the spatial dimension of engineered tissue construct.

    PubMed

    Choi, Kyoung-Hwan; Yoo, Byung-Su; Park, So Ra; Choi, Byung Hyune; Min, Byoung-Hyun

    2010-02-01

    The measurement of the spatial dimensions of tissue-engineered constructs is very important for their clinical applications. In this study, a novel method to measure the volume of tissue-engineered constructs was developed using iterative mathematical computations. The method measures and analyzes three-dimensional (3D) parameters of a construct to estimate its actual volume using a sequence of software-based mathematical algorithms. The mathematical algorithm is composed of two stages: shape extraction and determination of volume. The shape extraction utilized 3D images of a construct (length, width, and thickness) captured by a high-quality charge-coupled device (CCD) camera. The surface of the 3D images was then divided into fine sections. The area of each section was measured and combined to obtain the total surface area. The 3D volume of the target construct was then mathematically obtained using its total surface area and thickness. The accuracy of the measurement method was verified by comparing the results with those obtained from the hydrostatic weighing method (Korea Research Institute of Standards and Science [KRISS], Korea). The mean difference in volume between the two methods was 0.0313 +/- 0.0003% (n = 5, P = 0.523), with no statistically significant difference. In conclusion, our image-based spatial measurement system is a reliable and easy method to obtain an accurate 3D volume of a tissue-engineered construct.
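
The two-stage idea above (shape extraction from calibrated images, then volume from occupied area and thickness) can be illustrated with a deliberately simplified sketch. The binary top-view mask, the function name, and the single-thickness assumption are hypothetical simplifications; the paper's actual algorithm sections the full 3D surface rather than a flat footprint.

```python
def volume_from_topview(mask, pixel_area_mm2, thickness_mm):
    """Estimate construct volume (mm^3) from a calibrated top-view image.

    mask           -- 2D binary image (rows of 0/1) segmented from a
                      camera's top view of the construct
    pixel_area_mm2 -- calibrated real-world area covered by one pixel
    thickness_mm   -- thickness measured from a side-view image
    """
    occupied_pixels = sum(sum(row) for row in mask)
    footprint_area = occupied_pixels * pixel_area_mm2   # shape extraction
    return footprint_area * thickness_mm                # volume determination
```

For example, a 2x2 mask with three occupied pixels at 0.25 mm^2 per pixel and 2 mm thickness gives a 0.75 mm^2 footprint and a 1.5 mm^3 volume.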

  4. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples of this. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. 
There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. 
We must not lose sight of our overarching goal: scientific discovery. Science does not stand still, and the landscape of scientific discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards goals in science and scientific computing. As a final comment, I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with a quote from Richard Hamming: 'The purpose of computing is insight, not numbers.'

  5. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
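    The kind of static check a tool like Hairball performs can be illustrated with a minimal sketch that walks a Scratch 2 project.json and counts occurrences of a block opcode. This is an illustrative sketch only, not Hairball's actual API; the field names ("children", "scripts") and opcodes follow the Scratch 2 file format.

```python
import json

def count_blocks(project_json, opcode):
    """Count occurrences of a given opcode across all scripts in a
    Scratch 2 project.json string (stage plus every sprite)."""
    data = json.loads(project_json)
    sprites = [data] + data.get("children", [])  # stage, then sprites
    total = 0
    for sprite in sprites:
        for script in sprite.get("scripts", []):
            # a script entry is [x, y, blocks]; blocks is a nested list
            total += _count_in(script[2], opcode)
    return total

def _count_in(blocks, opcode):
    """Recursively count opcode matches in a nested block list."""
    n = 0
    for block in blocks:
        if isinstance(block, list):
            if block and block[0] == opcode:
                n += 1
            n += _count_in(block, opcode)  # descend into nested stacks
    return n
```

A curriculum designer could use such a count to check, for instance, that every student submission contains at least one loop block before grading proceeds.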

  6. United States Air Force Summer Research Program -- 1993. Volume 6. Arnold Engineering Development Center, Frank J. Seiler Research Laboratory, Wilford Hall Medical Center

    DTIC Science & Technology

    1993-12-01

    [Garbled OCR excerpt: the passage discusses the negative charge state of Ge(I) and Ge(II) centers, whose local symmetries are C1 and C2 respectively (see Fig. 1), followed by fragments of a participant directory listing faculty in computer science (Embry-Riddle Aeronautical University, 3200 Willow Creek Road) and electrical engineering (University of Denver, 2390 S. York Street, Denver, CO 80209-0177; Vol-Page No: 3-35).]

  7. A cost and utility analysis of NIM/CAMAC standards and equipment for shuttle payload data acquisition and control systems. Volume 2: Tasks 1 and 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A representative set of payloads from both the science and applications disciplines was selected to ensure a realistic and statistically significant estimate of equipment utilization. The selected payloads were analyzed to determine the applicability of Nuclear Instrumentation Module (NIM)/Computer Automated Measurement and Control (CAMAC) equipment in satisfying their data acquisition and control requirements. The analysis results were combined with comparable results from related studies to arrive at an overall assessment of the applicability and commonality of NIM/CAMAC equipment usage across the spectrum of payloads.

  8. NSSDC activities with 12-inch optical disk drives

    NASA Technical Reports Server (NTRS)

    Lowrey, Barbara E.; Lopez-Swafford, Brian

    1986-01-01

    The development status of optical-disk data transfer and storage technology at the National Space Science Data Center (NSSDC) is surveyed. The aim of the R&D program is to facilitate the exchange of large volumes of data. Current efforts focus on a 12-inch 1-Gbyte write-once/read-many disk and a disk drive which interfaces with VAX/VMS computer systems. The history of disk development at NSSDC is traced; the results of integration and performance tests are summarized; the operating principles of the 12-inch system are explained and illustrated with diagrams; and the need for greater standardization is indicated.

  9. Salary-Trend Studies of Faculty for the Years 1985-86 and 1988-89 in the Following Disciplines/Major Fields: Accounting; Agricultural Production; Anthropology; Architecture and Environmental Design; Area and Ethnic Studies; Audiology and Speech Pathology; Business Administration and Management; Business and Management; Business Economics; Chemistry; Communication Technologies; Communications; Computer and Information Sciences; Curriculum and Instruction; and Dramatic Arts.

    ERIC Educational Resources Information Center

    Howe, Richard D.; And Others

    This volume provides comparative data for faculty salaries in public and private colleges, based on an annual survey of over 700 colleges and universities. Data cover the following 15 disciplines: accounting, agribusiness and agricultural production, anthropology, architecture and environmental design, area and ethnic studies, audiology and speech…

  10. What Works: Building Natural Science Communities. Resources for Reform. Strengthening Undergraduate Science and Mathematics. A Report of Project Kaleidoscope. Volume Two.

    ERIC Educational Resources Information Center

    Narum, Jeanne L., Ed.

    The purpose of Project Kaleidoscope is to be a catalyst for action to encourage a national environment for reform in undergraduate education in science and mathematics in the United States. This report, the second of two volumes, presents ideas from Project Kaleidoscope that involve changing undergraduate science and mathematics education through…

  11. Third International Mathematics and Science Study 1999 Video Study Technical Report: Volume 2--Science. Technical Report. NCES 2011-049

    ERIC Educational Resources Information Center

    Garnier, Helen E.; Lemmens, Meike; Druker, Stephen L.; Roth, Kathleen J.

    2011-01-01

    This second volume of the Third International Mathematics and Science Study (TIMSS) 1999 Video Study Technical Report focuses on every aspect of the planning, implementation, processing, analysis, and reporting of the science components of the TIMSS 1999 Video Study. The report is intended to serve as a record of the actions and documentation of…

  12. Research in Science Education, Volume 5. Proceedings of the Annual Conference of the Australian Science Education Research Association (6th, Flinders University, Bedford Park, South Australia, May 19-21, 1975).

    ERIC Educational Resources Information Center

    Lucas, A. M., Ed.; Power, Colin, N., Ed.

    This volume contains papers presented at the sixth Annual Conference of the Australian Science Education Research Association (ASERA) held at Flinders University in May, 1975. Paper topics include: pupil learning and classroom climate, teacher structuring behavior, the Australian Science Education Project (ASEP), cognitive preference and…

  13. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  14. Enabling the transition towards Earth Observation Science 2.0

    NASA Astrophysics Data System (ADS)

    Mathieu, Pierre-Philippe; Desnos, Yves-Louis

    2015-04-01

    Science 2.0 refers to the rapid and systematic changes in doing research and organising science, driven by rapid advances in ICT and digital technologies combined with a growing demand to do science for society (actionable research) and in society (co-design of knowledge). Nowadays, teams of researchers around the world can easily access a wide range of open data across disciplines and remotely process them on the Cloud, combining them with their own data to generate knowledge, develop information products for societal applications, and tackle complex integrative problems that could not be addressed a few years ago. Such rapid exchange of digital data is fostering a new world of data-intensive research, characterized by openness, transparency, scrutiny and traceability of results, access to large volumes of complex data, availability of community open tools, unprecedented levels of computing power, and new collaborations among researchers and new actors such as citizen scientists. The EO scientific community is now facing the challenge of responding to this new Science 2.0 paradigm in order to make the most of the large volume of complex and diverse data delivered by the new generation of EO missions, and in particular the Sentinels. In this context, ESA, in particular within the framework of the Scientific Exploitation of Operational Missions (SEOM) element, is supporting a variety of activities in partnership with research communities to ease the transition and make the most of the data. These include generating new open tools and exploitation platforms, exploring new ways to exploit data on cloud-based platforms, disseminating data, building new partnerships with citizen scientists, and training the new generation of data scientists. 
The paper will give a brief overview of some of ESA activities aiming to facilitate the exploitation of large amount of data from EO missions in a collaborative, cross-disciplinary, and open way, from science to applications and education.

  15. Evaluation of Big Data Containers for Popular Storage, Retrieval, and Computation Primitives in Earth Science Analysis

    NASA Astrophysics Data System (ADS)

    Das, K.; Clune, T.; Kuo, K. S.; Mattmann, C. A.; Huang, T.; Duffy, D.; Yang, C. P.; Habermann, T.

    2015-12-01

    Data containers are infrastructures that facilitate storage, retrieval, and analysis of data sets. Big data applications in Earth Science require a mix of processing techniques, data sources and storage formats that are supported by different data containers. Some of the most popular data containers used in Earth Science studies are Hadoop, Spark, SciDB, AsterixDB, and RasDaMan. These containers optimize different aspects of the data processing pipeline and are, therefore, suitable for different types of applications. These containers are expected to undergo rapid evolution and the ability to re-test, as they evolve, is very important to ensure the containers are up to date and ready to be deployed to handle large volumes of observational data and model output. Our goal is to develop an evaluation plan for these containers to assess their suitability for Earth Science data processing needs. We have identified a selection of test cases that are relevant to most data processing exercises in Earth Science applications and we aim to evaluate these systems for optimal performance against each of these test cases. The use cases identified as part of this study are (i) data fetching, (ii) data preparation for multivariate analysis, (iii) data normalization, (iv) distance (kernel) computation, and (v) optimization. In this study we develop a set of metrics for performance evaluation, define the specifics of governance, and test the plan on current versions of the data containers. The test plan and the design mechanism are expandable to allow repeated testing with both new containers and upgraded versions of the ones mentioned above, so that we can gauge their utility as they evolve.
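    A performance comparison of the kind described above reduces, at its core, to timing each primitive repeatedly against each container and comparing summary statistics. The harness below is a minimal sketch under assumed conditions: the container names and the stand-in callables are hypothetical placeholders, not part of the study's actual test plan.

```python
import statistics
import time

def benchmark(primitive, repetitions=5):
    """Time a zero-argument callable several times and report
    median/min/max wall-clock latency in seconds."""
    timings = []
    for _ in range(repetitions):
        start = time.perf_counter()
        primitive()
        timings.append(time.perf_counter() - start)
    return {"median_s": statistics.median(timings),
            "min_s": min(timings),
            "max_s": max(timings)}

# Hypothetical comparison: the same "data fetching" test case run
# against two containers, each wrapped as a callable stand-in.
cases = {"container_a": lambda: sum(range(10**5)),
         "container_b": lambda: sum(range(10**4))}
results = {name: benchmark(fn) for name, fn in cases.items()}
```

Reporting the median rather than the mean makes the comparison robust to one-off warm-up or caching effects, which matters when the same test plan is re-run as the containers evolve.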

  16. An Example Emphasizing Mass-Volume Relationships for Problem Solving in Soils

    ERIC Educational Resources Information Center

    Heitman, J. L.; Vepraskas, M. J.

    2009-01-01

    Mass-volume relationships are a useful tool emphasized for problem solving in many geo-science and engineering applications. These relationships also have useful applications in soil science. Developing soils students' ability to utilize mass-volume relationships through schematic diagrams of soil phases (i.e., air, water, and solid) can help to…
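    The phase relationships described above reduce to a few ratios among the masses and volumes of the solid, water, and air phases. A minimal sketch, assuming the conventional mineral particle density of 2.65 g/cm³ and a water density of 1.00 g/cm³:

```python
def soil_phase_relations(total_mass_g, dry_mass_g, total_volume_cm3,
                         particle_density=2.65):
    """Compute common mass-volume quantities for a soil core sample.
    particle_density = 2.65 g/cm^3 is the usual assumption for mineral
    soils; water density is taken as 1.00 g/cm^3."""
    water_mass = total_mass_g - dry_mass_g            # g of water
    bulk_density = dry_mass_g / total_volume_cm3      # g/cm^3
    porosity = 1.0 - bulk_density / particle_density  # pore fraction
    theta = (water_mass / 1.0) / total_volume_cm3     # volumetric water content
    return {"bulk_density": bulk_density,
            "porosity": porosity,
            "volumetric_water_content": theta}
```

For example, a 100 cm³ core weighing 180 g moist and 150 g oven-dry gives a bulk density of 1.5 g/cm³, a porosity of about 0.43, and a volumetric water content of 0.30.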

  17. The medical science DMZ: a network design pattern for data-intensive medical science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Dart, Eli; Barnett, William

    We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. 
We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements.

  18. The medical science DMZ: a network design pattern for data-intensive medical science.

    PubMed

    Peisert, Sean; Dart, Eli; Barnett, William; Balas, Edward; Cuff, James; Grossman, Robert L; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2017-10-06

    We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. 
By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender

    NASA Astrophysics Data System (ADS)

    Larsen, Elizabeth A.; Stubbs, Margaret L.

    Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.

  20. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  1. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  2. Computer Science and the Liberal Arts

    ERIC Educational Resources Information Center

    Shannon, Christine

    2010-01-01

    Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…

  3. Marrying Content and Process in Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  4. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    ERIC Educational Resources Information Center

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  5. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  6. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  7. African-American males in computer science---Examining the pipeline for clogs

    NASA Astrophysics Data System (ADS)

    Stone, Daryl Bryant

    The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists/technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to the 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. 
(4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six schools, including the predominantly African-American elementary, middle, and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle, and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between the five groups of students. ANOVA analysis by question and by total self-efficacy score provided further statistically significant results. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest a 'clog' may exist at the middle school level, and students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending a great deal of time on social networking websites and email, but are unaware of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.

  8. Data Integration and Analysis System (DIAS) Contributing to the Sustainable Development Goals (SDGs)

    NASA Astrophysics Data System (ADS)

    Koike, T.

    2014-12-01

    It has been said that scientists and experts typically spend 80% of their research time on data management (DOE, 2002), leaving only 20% for purely scientific activities. This ratio should be reversed by introducing computer science technology. To realize this goal, the Japanese government supported the development of a data system called the "Data Integration and Analysis System (DIAS)" as one of the national key projects promoted by the Council for Science and Technology Policy (CSTP) from 2006 to 2010. A follow-up 5-year project is also ongoing. The essential aim of DIAS was to create knowledge that would enable solutions to problems and generate socioeconomic benefits. DIAS consists of four main data components: data injection, management, integration, and interoperability. DIAS is now tackling a large increase in the diversity and volume of Earth observation data. An ontology system of dictionaries for technical and geographical terms has been developed, and a metadata design has been completed according to international standards. The volume of data stored has increased exponentially. Previously, almost all of the large-volume data came from satellites, but model outputs now occupy the largest share of our data storage. In collaboration with scientific and technological groups, DIAS can accelerate data archiving, including data loading, quality checking, and metadata registration, and its data-searching capability is being enriched. DIAS also enables us to perform integrated research and realize interdisciplinarity. We are now working in the fields of climate, water resources, food, fisheries, and biodiversity, collaborating between different disciplines and developing bases for contribution to the Sustainable Development Goals (SDGs).

  9. Girls in computer science: A female only introduction class in high school

    NASA Astrophysics Data System (ADS)

    Drobnis, Ann W.

    This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.

  10. Parsing partial molar volumes of small molecules: a molecular dynamics study.

    PubMed

    Patel, Nisha; Dubins, David N; Pomès, Régis; Chalikian, Tigran V

    2011-04-28

    We used molecular dynamics (MD) simulations in conjunction with the Kirkwood-Buff theory to compute the partial molar volumes of a number of small solutes of various chemical natures. We repeated our computations using modified pair potentials, first in the absence of the Coulombic term and, second, in the absence of both the Coulombic and the attractive Lennard-Jones terms. Comparison of our results with experimental data and with the volumetric results of Monte Carlo simulations with hard-sphere potentials and scaled-particle-theory-based computations led us to conclude that, for small solutes, the partial molar volume computed with the Lennard-Jones potential in the absence of the Coulombic term nearly coincides with the cavity volume. On the other hand, MD simulations carried out with pair interaction potentials containing only the repulsive Lennard-Jones term produce unrealistically large partial molar volumes of solutes, close to their excluded volumes. Our simulation results are in good agreement with the reported schemes for parsing partial molar volume data on small solutes. In particular, our determined interaction volumes and the thickness of the thermal volume for individual compounds are in good agreement with empirical estimates. This work is the first computational study that supports and lends credence to the practical algorithms of parsing partial molar volume data that are currently in use for molecular interpretations of volumetric data.
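    The Kirkwood-Buff route to the partial molar volume can be sketched numerically: at infinite dilution, V = kT*kappa_T - G, where G is the solute-solvent Kirkwood-Buff integral of the radial distribution function g(r). The compressibility value and the toy hard-core RDF below are illustrative assumptions, not values from the study.

```python
import numpy as np

def partial_molar_volume(r, g, kT=1.380649e-23 * 298.15, kappa_T=4.52e-10):
    """Infinite-dilution partial molar volume (m^3 per molecule) from a
    solute-solvent RDF via the Kirkwood-Buff relation
    V = kT*kappa_T - G, with G = integral (g(r)-1) 4*pi*r^2 dr.
    r in m, kappa_T in 1/Pa; the default kappa_T is an assumed
    illustrative value for water near room temperature."""
    G = np.trapz((g - 1.0) * 4.0 * np.pi * r**2, r)  # KB integral, m^3
    return kT * kappa_T - G

# Toy RDF: g(r) = 0 inside a hard core of radius a, 1 outside, so
# G = -(4/3)*pi*a^3 and V reduces to the hard-core excluded volume
# plus the small ideal kT*kappa_T term.
a = 3.0e-10  # hard-core radius, m
r = np.linspace(1e-12, 1.5e-9, 4000)
g = np.where(r < a, 0.0, 1.0)
V = partial_molar_volume(r, g)
```

In an actual analysis, g(r) would come from the MD trajectory rather than a step function, and the integral would be truncated at a radius where g(r) has converged to 1.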

  11. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference: 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft's three-dimensional precipitation radar data of Hurricane Bonnie. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume rendering techniques such as ray marching.
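Ray marching, as mentioned in the abstract, renders volumetric data by stepping along each viewing ray and compositing samples front to back. A minimal sketch of that accumulation loop follows; the emission model, absorption coefficient, and step size are illustrative assumptions, not values from the animation:

```python
import math

def ray_march(densities, step=1.0, absorption=0.1):
    """Front-to-back compositing along one ray through sampled densities."""
    color = 0.0           # accumulated (scalar) radiance
    transmittance = 1.0   # fraction of light still passing through
    for d in densities:
        alpha = 1.0 - math.exp(-absorption * d * step)  # opacity of this sample
        color += transmittance * alpha * d              # toy emission: proportional to density
        transmittance *= 1.0 - alpha                    # attenuate what lies behind
        if transmittance < 1e-4:                        # early ray termination
            break
    return color, transmittance
```

An empty ray leaves the transmittance at 1.0 (fully transparent); dense samples near the camera dominate the result because later samples are attenuated by the opacity already accumulated.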

  12. Bringing computational science to the public.

    PubMed

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands-on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback was predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  13. Computer Science and Telecommunications Board summary of activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, M.S.

    1992-03-27

    The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.

  14. Astronomy and astrophysics for the 1980's. Volume 1 - Report of the Astronomy Survey Committee. Volume 2 - Reports of the Panels

    NASA Astrophysics Data System (ADS)

    Recommended priorities for astronomy and astrophysics in the 1980s are considered along with the frontiers of astrophysics, taking into account large-scale structure in the universe, the evolution of galaxies, violent events, the formation of stars and planets, solar and stellar activity, astronomy and the forces of nature, and planets, life, and intelligence. Approved, continuing, and previously recommended programs are related to the Space Telescope and the associated Space Telescope Science Institute, second-generation instrumentation for the Space Telescope, the Gamma Ray Observatory, facilities for the detection of solar neutrinos, and the Shuttle Infrared Telescope Facility. Attention is given to the prerequisites for new research initiatives, new programs, programs for study and development, high-energy astrophysics, radio astronomy, theoretical and laboratory astrophysics, data processing and computational facilities, organization and education, and ultraviolet, optical, and infrared astronomy.

  15. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.

  16. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  17. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  18. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  19. The rise of information science: a changing landscape for soil science

    NASA Astrophysics Data System (ADS)

    Roudier, Pierre; Ritchie, Alistair; Hedley, Carolyn; Medyckyj-Scott, David

    2015-07-01

    The last 15 years have seen the rapid development of a wide range of information technologies. Those developments have been impacting all fields of science, at every step of the scientific method: data collection, data analysis, inference, science communication and outreach. The rate at which data is being generated is increasing exponentially, giving opportunities to improve our understanding of soils. Parallel developments in computing hardware and methods, such as machine learning, open ways to not only harness the "data deluge", but also offer a new way to generate knowledge. Finally, emerging data and information delivery protocols are leveraging the outreach power of the World Wide Web to disseminate scientific data and information, and increase their use and understanding outside the boundaries of a given scientific field. However, the nature of this data is mostly new to soil science, and requires adaptation to its diversity and volume. In particular, the integration of the significant amount of legacy soil data collected throughout decades of soil science can be problematic when all necessary metadata is not available. Likewise, knowledge accumulated by our scientific field needs to be acknowledged by - rather than opposed to - numerical methods. While the introduction of this set of emerging technologies is enabling soil science from different points of view, its successful implementation depends on the ability of soil scientists to act as knowledge brokers and support numerical methods.

  20. Scheduling science on television: A comparative analysis of the representations of science in 11 European countries.

    PubMed

    Lehmkuhl, Markus; Karamanidou, Christina; Mörä, Tuomo; Petkova, Kristina; Trench, Brian

    2012-11-01

    This article explores the factors that influence the volume and structure of science programming by European television broadcasters, focussing on differences among channel patterns. It proposes three factors as relevant to understanding differences in science programming: A) the segmentation/fragmentation of television markets; B) the presence of middle sized commercial channels; C) the dependency of public service TV channels on commercial income (trading/advertising). We identified countries whose channel patterns encourage a varied picture of science - namely Sweden, Finland and Germany. They are distinguished from those which show a less differentiated picture and present a smaller volume of science content on television - such as Great Britain and Ireland. Finally, we identified countries whose channel patterns don't encourage a varied picture of science - namely Spain, Greece, Bulgaria and Estonia - and these countries present their small volume of science content at off-peak hours, in contrast to patterns in Great Britain and Ireland.

  1. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  2. International Conference on Applied Sciences (ICAS2013)

    NASA Astrophysics Data System (ADS)

    Lemle, Ludovic Dan; Jiang, Yiwen

    2014-03-01

    The International Conference on Applied Sciences (ICAS2013) took place in Wuhan, P R China from 26-27 October 2013 at the Military Economics Academy. The conference is regularly organized, alternately in Romania and in P R China, by ''Politehnica'' University of Timişoara, Romania, and Military Economics Academy of Wuhan, P R China, with the aim to serve as a platform for the exchange of information between various areas of applied sciences, and to promote the communication between the scientists of different nations, countries and continents. The conference was organized for the first time on 15-16 June 2012 at the Engineering Faculty of Hunedoara, Romania. The topics of the conference covered a comprehensive spectrum of issues: economic sciences, engineering sciences, fundamental sciences, and medical sciences. The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge that has applicability potential in economics, defense, medicine, etc. The number of registered participants was nearly 90 from 5 countries. During the two days of the conference 4 invited and 36 oral talks were delivered. A few of the speakers deserve a special mention: Mircea Octavian Popoviciu, Academy of Romanian Scientists — Timişoara Branch, Correlations between mechanical properties and cavitation erosion resistance for stainless steels with 12% chromium and variable contents of nickel; Carmen Eleonora Hărău, ''Politehnica'' University of Timişoara, SWOT analysis of Romania's integration in EU; Ding Hui, Military Economics Academy of Wuhan, Design and engineering analysis of material procurement mobile operation platform; Serban Rosu, University of Medicine and Pharmacy ''Victor Babeş'' Timişoara, Cervical and facial infections — a real life threat, among others. Based on the work presented at the conference, 14 selected papers are included in this volume of IOP Conference Series: Materials Science and Engineering. 
These papers present new research in the various fields of materials engineering, mechanical engineering, computer engineering, mathematical engineering and clinical engineering. It is our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields. All papers published in this volume of IOP Conference Series: Materials Science and Engineering (MSE) have been peer reviewed through processes administered by the editors of the ICAS2013 proceedings, Ludovic Dan Lemle and Yiwen Jiang. Special thanks should be directed to the organizing committee for their tremendous efforts in organizing the conference: General Chair Zhou Laixin, Military Economics Academy of Wuhan; Co-chairs Du Qifa, Military Economics Academy of Wuhan, Serban Viorel-Aurel, ''Politehnica'' University of Timişoara, Fen Youmei, Wuhan University, Lin Pinghua, Huazhong University of Science and Technology; Members Lin Darong, Military Economics Academy of Wuhan, Guo Zhonghou, Military Economics Academy of Wuhan, Sun Honghong, Military Economics Academy of Wuhan, Liu Dong, Military Economics Academy of Wuhan. We thank the authors for their contributions and we would also like to express our gratitude to everyone who contributed to this conference, especially for the generous support of the sponsor: micromega S C Micro-Mega HD S A. Ludovic Dan Lemle and Yiwen Jiang, Coordinators of the Scientific Committee of ICAS2013. Details of the organizers and members of the scientific committee are available in the PDF

  3. The Voronoi volume and molecular representation of molar volume: equilibrium simple fluids.

    PubMed

    Hunjan, Jagtar Singh; Eu, Byung Chan

    2010-04-07

    The Voronoi volume of simple fluids was previously made use of in connection with volume transport phenomena in nonequilibrium simple fluids. To investigate volume transport phenomena, it is important to develop a method to compute the Voronoi volume of fluids in nonequilibrium. In this work, as a first step to this goal, we investigate the equilibrium limit of the nonequilibrium Voronoi volume together with its attendant related molar (molal) and specific volumes. It is proved that the equilibrium Voronoi volume is equivalent to the molar (molal) volume. The latter, in turn, is proved equivalent to the specific volume. This chain of equivalences provides an alternative procedure of computing the equilibrium Voronoi volume from the molar volume/specific volume. We also show approximate methods of computing the Voronoi and molar volumes from the information on the pair correlation function. These methods may be employed for quick estimation, but they also provide some insight into the fluid structure and its relation to the Voronoi volume. The Voronoi volume obtained from computer simulations is fitted to a function of temperature and pressure in the region above the triple point but below the critical point. Since the fitting function is given in terms of reduced variables for the Lennard-Jones (LJ) model and the kindred volumes (i.e., specific and molar volumes) are in essence equivalent to the equation of state, the formula obtained is a reduced equation of state for simple fluids obeying the LJ model potential in the range of temperature and pressure examined and hence can be used for other simple fluids.
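The chain of equivalences stated in the abstract can be summarized per molecule: the average equilibrium Voronoi volume equals the system volume per particle, which is the molar volume over Avogadro's number (the notation below is assumed for illustration, not taken from the paper):

```latex
\left\langle v_{\mathrm{Vor}} \right\rangle \;=\; \frac{V}{N}
\;=\; \frac{\bar{V}_m}{N_A}
\;=\; \frac{M}{\rho\, N_A}
```

with M the molar mass and \rho the mass density, so any of these quantities can serve as a cross-check on a simulated Voronoi volume.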

  4. Opportunities for Computational Discovery in Basic Energy Sciences

    NASA Astrophysics Data System (ADS)

    Pederson, Mark

    2011-03-01

    An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Science (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.

  5. Research | Computational Science | NREL

    Science.gov Websites

    Research NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research NREL's computational science capabilities enable high-impact research. Some recent examples

  6. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  7. Computer vision system for egg volume prediction using backpropagation neural network

    NASA Astrophysics Data System (ADS)

    Siswantoro, J.; Hilman, M. Y.; Widiasri, M.

    2017-11-01

    Volume is one of the aspects considered in the egg sorting process. A rapid and accurate volume measurement method is needed to develop an egg sorting system. A computer vision system (CVS) provides a promising solution to the volume measurement problem. Artificial neural networks (ANN) have been used to predict the volume of an egg in several CVSs. However, volume prediction from an ANN could be less accurate due to inappropriate input features or an inappropriate ANN structure. This paper proposes a CVS for predicting the volume of an egg using an ANN. The CVS acquired an image of an egg from the top view and then processed the image to extract its 1D and 2D size features. The features were used as input for the ANN in predicting the volume of the egg. The experiment results show that the proposed CVS can predict the volume of an egg with good accuracy and low computation time.
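The geometry behind such volume estimates can be illustrated with a solid-of-revolution (disc method) calculation from a measured width profile. This is a generic sketch, not the paper's ANN pipeline, and the sample profile values are invented:

```python
import math

def egg_volume_from_profile(radii, dx):
    """Disc-method volume of a solid of revolution: V ~ sum(pi * r_i^2 * dx),
    where r_i are half-widths sampled along the egg's long axis at spacing dx."""
    return sum(math.pi * r * r * dx for r in radii)

# invented half-widths (cm) sampled every 0.5 cm along the egg's long axis
profile = [0.5, 1.2, 1.7, 2.0, 2.1, 2.0, 1.8, 1.5, 1.1, 0.6]
volume_cm3 = egg_volume_from_profile(profile, dx=0.5)
```

In a CVS the half-widths would come from the segmented egg contour in the top-view image; an ANN can then correct the residual error of this idealized rotationally symmetric model.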

  8. Tree volume and biomass equations for the Lake States.

    Treesearch

    Jerold T. Hahn

    1984-01-01

    Presents species-specific equations and methods for computing tree height, cubic-foot and board-foot volume, and biomass for the Lake States (Michigan, Minnesota, and Wisconsin). Height equations compute either total or merchantable height to a variable top d.o.b. from d.b.h., site index, and basal area. Volumes and biomass are computed from d.b.h. and height.
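Equations of this general kind are often combined-variable models of the form V = b0 + b1 * D^2 * H. The sketch below is illustrative only: the coefficients are hypothetical placeholders, not the species-specific Lake States values from the report:

```python
def combined_variable_volume(dbh, height, b0=0.5, b1=0.002):
    """Combined-variable tree volume model: V = b0 + b1 * D^2 * H.
    dbh: diameter at breast height (in); height: merchantable height (ft).
    b0 and b1 are hypothetical placeholder coefficients, for illustration only."""
    return b0 + b1 * dbh ** 2 * height
```

For example, combined_variable_volume(10.0, 60.0) evaluates to 0.5 + 0.002 * 100 * 60 = 12.5 cubic feet under these placeholder coefficients.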

  9. Automation of Command and Data Entry in a Glovebox Work Volume: An Evaluation of Data Entry Devices

    NASA Technical Reports Server (NTRS)

    Steele, Marianne K.; Nakamura, Gail; Havens, Cindy; LeMay, Moira

    1996-01-01

    The present study was designed to examine the human-computer interface for data entry while performing experimental procedures within a glovebox work volume in order to make a recommendation to the Space Station Biological Research Project for a data entry system to be used within the Life Sciences Glovebox. Test subjects entered data using either a manual keypad, similar to a standard computer numerical keypad located within the glovebox work volume, or a voice input system using a speech recognition program with a microphone headset. Numerical input and commands were programmed in an identical manner between the two systems. With both electronic systems, a small trackball was available within the work volume for cursor control. Data, such as sample vial identification numbers, sample tissue weights, and health check parameters of the specimen, were entered directly into procedures that were electronically displayed on a video monitor within the glovebox. A pen and paper system with a 'flip-chart' format for procedure display, similar to that currently in use on the Space Shuttle, was used as a baseline data entry condition. Procedures were performed by a single operator; eight test subjects were used in the study. The electronic systems were tested under both a 'nominal' or 'anomalous' condition. The anomalous condition was introduced into the experimental procedure to increase the probability of finding limitations or problems with human interactions with the electronic systems. Each subject performed five test runs during a test day: two procedures each with voice and keypad, one with and one without anomalies, and one pen and paper procedure. The data collected were both quantitative (times, errors) and qualitative (subjective ratings of the subjects).

  10. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  11. Responsible science: Ensuring the integrity of the research process. Volume 2. Final report, 1989--1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    In 1989, the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine initiated a major study to examine issues related to the responsible conduct of research. The findings and recommendations were published in March 1992 as: Responsible Science: Ensuring the Integrity of the Research Process, Vol. 1. Volume II of the report, this volume, includes the six commissioned background papers as well as selected institutional guidelines, reports, policies, and procedures. The institutional statements reprinted in Volume II have been selected to convey the diverse approaches for addressing different aspects of misconduct or integrity in science within research institutions.

  12. Girls Save the World through Computer Science

    ERIC Educational Resources Information Center

    Murakami, Christine

    2011-01-01

    It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

  13. The Assessment of Taiwanese College Students' Conceptions of and Approaches to Learning Computer Science and Their Relationships

    ERIC Educational Resources Information Center

    Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2015-01-01

    The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…

  14. Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study

    ERIC Educational Resources Information Center

    Herling, Lourdes

    2011-01-01

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…

  15. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  16. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  17. An Investigation of Primary School Science Teachers' Use of Computer Applications

    ERIC Educational Resources Information Center

    Ocak, Mehmet Akif; Akdemir, Omur

    2008-01-01

    This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…

  18. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The need to study computer modeling arises because current trends toward strengthening the general-education and worldview functions of computer science call for additional research of the…

  19. Science Library of Test Items. Volume Two.

    ERIC Educational Resources Information Center

    New South Wales Dept. of Education, Sydney (Australia).

    The second volume of test items in the Science Library of Test Items is intended as a resource to assist teachers in implementing and evaluating science courses in the first 4 years of Australian secondary school. The items were selected from questions submitted to the School Certificate Development Unit by teachers in New South Wales. Only the…

  20. Animal Science. Instructor Guide [and] Student Reference. Volume 28, Number 3 [and] Volume 28, Number 4.

    ERIC Educational Resources Information Center

    Baker, Andy; And Others

    This instructor guide and the corresponding student reference contain 4 units that include 30 lessons for a course in animal science for 11th- and 12th-grade agriculture science students. The units cover nutrition, genetics, reproduction, and animal health. The instructor's guide contains the following: objectives, competencies, motivational…

  1. Proportional Reasoning Ability and Concepts of Scale: Surface Area to Volume Relationships in Science

    ERIC Educational Resources Information Center

    Taylor, Amy; Jones, Gail

    2009-01-01

    The "National Science Education Standards" emphasise teaching unifying concepts and processes such as basic functions of living organisms, the living environment, and scale. Scale influences science processes and phenomena across the domains. One of the big ideas of scale is that of surface area to volume. This study explored whether or not there…

  2. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    ERIC Educational Resources Information Center

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  3. What's Happening in the Mathematical Sciences, 1993-1994.

    ERIC Educational Resources Information Center

    Cipra, Barry

    1993-01-01

    This document consists of the first two volumes of a new annual serial devoted to surveying some of the important developments in the mathematical sciences in the previous year or so. Mathematics is constantly growing and changing, reaching out to other areas of science and helping to solve some of the major problems facing society. Volumes 1 and…

  4. ASDC Collaborations and Processes to Ensure Quality Metadata and Consistent Data Availability

    NASA Astrophysics Data System (ADS)

    Trapasso, T. J.

    2017-12-01

    With the introduction of new tools, faster computing, and less expensive storage, increased volumes of data are expected to be managed with existing or fewer resources. Metadata management is becoming a heightened challenge from the increase in data volume, resulting in more metadata records needing to be curated for each product. To address metadata availability and completeness, NASA ESDIS has taken significant strides with the creation of the United Metadata Model (UMM) and Common Metadata Repository (CMR). The UMM helps address hurdles posed by the increasing number of metadata dialects, and the CMR provides a primary repository for metadata so that required metadata fields can be served through a growing number of tools and services. However, metadata quality remains an issue, as metadata is not always intuitive to the end-user. In response to these challenges, the NASA Atmospheric Science Data Center (ASDC) created the Collaboratory for quAlity Metadata Preservation (CAMP) and defined the Product Lifecycle Process (PLP) to work congruently. CAMP is unique in that it provides science team members a UI to directly supply metadata that is complete, compliant, and accurate for their data products. This replaces back-and-forth communication that often results in misinterpreted metadata. Upon review by ASDC staff, metadata is submitted to CMR for broader distribution through Earthdata. Further, approval of science team metadata in CAMP automatically triggers the ASDC PLP workflow to ensure appropriate services are applied throughout the product lifecycle. This presentation will review the design elements of CAMP and PLP as well as demonstrate interfaces to each. It will show the benefits that CAMP and PLP provide to the ASDC and could potentially extend to other NASA Earth Science Data and Information System (ESDIS) Distributed Active Archive Centers (DAACs).

  5. Automatic lithofacies segmentation from well-logs data. A comparative study between the Self-Organizing Map (SOM) and Walsh transform

    NASA Astrophysics Data System (ADS)

    Aliouane, Leila; Ouadfeul, Sid-Ali; Rabhi, Abdessalem; Rouina, Fouzi; Benaissa, Zahia; Boudella, Amar

    2013-04-01

    The main goal of this work is to compare two lithofacies segmentation techniques for a reservoir interval. The first is based on Kohonen's Self-Organizing Map (SOM) neural network; the second is based on the Walsh transform decomposition. Application to real well-log data from two boreholes located in the Algerian Sahara shows that the Self-Organizing Map provides more lithological detail than the lithofacies model given by the Walsh decomposition.

    Keywords: Comparison, Lithofacies, SOM, Walsh

    References: 1) Aliouane, L., Ouadfeul, S., Boudella, A., 2011, Fractal analysis based on the continuous wavelet transform and lithofacies classification from well-logs data using the self-organizing map neural network, Arabian Journal of Geosciences, doi:10.1007/s12517-011-0459-4. 2) Aliouane, L., Ouadfeul, S., Djarfour, N., Boudella, A., 2012, Petrophysical Parameters Estimation from Well-Logs Data Using Multilayer Perceptron and Radial Basis Function Neural Networks, Lecture Notes in Computer Science, Volume 7667, pp. 730-736, doi:10.1007/978-3-642-34500-5_86. 3) Ouadfeul, S. and Aliouane, L., 2011, Multifractal analysis revisited by the continuous wavelet transform applied in lithofacies segmentation from well-logs data, International Journal of Applied Physics and Mathematics, Vol. 01, No. 01. 4) Ouadfeul, S., Aliouane, L., 2012, Lithofacies Classification Using the Multilayer Perceptron and the Self-organizing Neural Networks, Lecture Notes in Computer Science, Volume 7667, pp. 737-744, doi:10.1007/978-3-642-34500-5_87. 5) Weisstein, Eric W., "Fast Walsh Transform," from MathWorld--A Wolfram Web Resource, http://mathworld.wolfram.com/FastWalshTransform.html
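    For readers unfamiliar with the Walsh decomposition used in this record, the core operation is the fast Walsh-Hadamard transform. The sketch below is a generic textbook version in natural (Hadamard) ordering, not the authors' code; it requires an input length that is a power of two:

    ```python
    def fwht(signal):
        """In-place fast Walsh-Hadamard transform, natural/Hadamard ordering.

        Runs in O(n log n) via butterfly passes analogous to the FFT,
        decomposing the signal into square-wave (Walsh) components.
        """
        a = list(signal)
        h = 1
        while h < len(a):
            for start in range(0, len(a), 2 * h):
                for j in range(start, start + h):
                    x, y = a[j], a[j + h]
                    a[j], a[j + h] = x + y, x - y
            h *= 2
        return a

    # A square-wave-like segment concentrates its energy in few coefficients,
    # which is what makes the transform useful for blocky lithofacies boundaries.
    print(fwht([1, 0, 1, 0]))  # → [2, 2, 0, 0]
    ```
    
    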

  6. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game-changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: development of use case studies for science workflows; creation of a taxonomy and structure for describing science computing requirements; and characterization of agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  7. Kenny Gruchalla | NREL

    Science.gov Websites

    feature extraction, human-computer interaction, and physics-based modeling. Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology

  8. Accurate Characterization of the Pore Volume in Microporous Crystalline Materials

    PubMed Central

    2017-01-01

    Pore volume is one of the main properties for the characterization of microporous crystals. It is experimentally measurable, and it can also be obtained from the refined unit cell by a number of computational techniques. In this work, we assess the accuracy and the discrepancies between the different computational methods which are commonly used for this purpose, i.e., geometric, helium, and probe center pore volumes, by studying a database of more than 5000 frameworks. We developed a new technique to fully characterize the internal void of a microporous material and to compute the probe-accessible and -occupiable pore volume. We show that, unlike the other definitions of pore volume, the occupiable pore volume can be directly related to the experimentally measured pore volumes from nitrogen isotherms. PMID:28636815
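    The probe-center pore volume compared in this record can be illustrated with a toy Monte Carlo estimate: sample random points in a unit cell and keep those far enough from every atom to host the center of a spherical probe. This is a simplified sketch with a single made-up atom and arbitrary radii, not the authors' method or database:

    ```python
    import random

    def probe_pore_fraction(atoms, atom_radius, probe_radius,
                            samples=100_000, seed=42):
        """Monte Carlo estimate of the fraction of a 1x1x1 cell whose points
        can host the *center* of a spherical probe (probe-center pore volume)."""
        rng = random.Random(seed)
        # The probe center must stay at least atom_radius + probe_radius
        # from every atom center.
        blocked = atom_radius + probe_radius
        hits = 0
        for _ in range(samples):
            p = (rng.random(), rng.random(), rng.random())
            if all(sum((p[i] - a[i]) ** 2 for i in range(3)) >= blocked ** 2
                   for a in atoms):
                hits += 1
        return hits / samples

    # One atom at the cell center; a sphere of radius 0.3 excludes ~11% of
    # the cell, so the estimate comes out near 0.89.
    atoms = [(0.5, 0.5, 0.5)]
    print(probe_pore_fraction(atoms, atom_radius=0.2, probe_radius=0.1))
    ```

    Shrinking the probe radius toward zero recovers the purely geometric void fraction, which is one way to see why the different pore-volume definitions diverge.
    
    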

  9. Accurate Characterization of the Pore Volume in Microporous Crystalline Materials

    DOE PAGES

    Ongari, Daniele; Boyd, Peter G.; Barthel, Senja; ...

    2017-06-21

    Pore volume is one of the main properties for the characterization of microporous crystals. It is experimentally measurable, and it can also be obtained from the refined unit cell by a number of computational techniques. In this work, we assess the accuracy and the discrepancies between the different computational methods which are commonly used for this purpose, i.e., geometric, helium, and probe center pore volumes, by studying a database of more than 5000 frameworks. We developed a new technique to fully characterize the internal void of a microporous material and to compute the probe-accessible and -occupiable pore volume. Lastly, we show that, unlike the other definitions of pore volume, the occupiable pore volume can be directly related to the experimentally measured pore volumes from nitrogen isotherms.

  10. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  11. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in Computer Science Education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening-participation-in-computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  12. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  13. White House announces “big data” initiative

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2012-04-01

    The world is now generating zettabytes (10 to the 21st power, or a billion trillion, bytes) of information every year, according to John Holdren, director of the White House Office of Science and Technology Policy. With data volumes growing exponentially from a variety of sources such as computers running large-scale models, scientific instruments including telescopes and particle accelerators, and even online retail transactions, a key challenge is to better manage and utilize the data. The Big Data Research and Development Initiative, launched by the White House at a 29 March briefing, initially includes six federal departments and agencies providing more than $200 million in new commitments to improve tools and techniques for better accessing, organizing, and using data for scientific advances. The agencies and departments include the National Science Foundation (NSF), Department of Energy, U.S. Geological Survey (USGS), National Institutes of Health (NIH), Department of Defense, and Defense Advanced Research Projects Agency.

  14. Bringing history to life: simulating landmark experiments in psychology.

    PubMed

    Boynton, David M; Smith, Laurence D

    2006-05-01

    The course in history of psychology can be challenging for students, many of whom enter it with little background in history and faced with unfamiliar names and concepts. The sheer volume of material can encourage passive memorization unless efforts are made to increase student involvement. As part of a trend toward experiential history, historians of science have begun to supplement their lectures with demonstrations of classic physics experiments as a way to bring the history of science to life. Here, the authors report on computer simulations of five landmark experiments from early experimental psychology in the areas of reaction time, span of attention, and apparent motion. The simulations are designed not only to permit hands-on replication of historically important results but also to reproduce the experimental procedures closely enough that students can gain a feel for the nature of early research and the psychological processes being studied.

  15. Data systems trade studies for a next generation sensor

    NASA Astrophysics Data System (ADS)

    Masuoka, Edward J.; Fleig, Albert J.

    1997-01-01

    Processing system designers must make substantial changes to accommodate current and anticipated improvements in remote sensing instruments. Increases in spectral, radiometric, and geometric resolution lead to data rates, processing loads, and storage volumes that far exceed the ability of most current computer systems. To meet user expectations, the data must be processed and made available quickly in a convenient and easy-to-use form. This paper describes design trade-offs made in developing the processing system for the Moderate Resolution Imaging Spectroradiometer (MODIS), which will fly on the Earth Observing System's AM-1 spacecraft, to be launched in 1998. MODIS will have an average continuous data rate of 6.2 Mbps and require processing at 6.5 GFLOPS to produce 600 GB of output products per day. Specific trade-offs occur in the areas of science software portability and usability of science products versus overall system performance and throughput.
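    The quoted figures are easy to sanity-check: a 6.2 Mbps continuous instrument stream accumulates roughly 67 GB of raw data per day, so 600 GB of daily output products implies roughly a ninefold expansion during processing. A back-of-the-envelope check (decimal units assumed):

    ```python
    MBPS = 6.2                    # average continuous instrument data rate
    SECONDS_PER_DAY = 86_400

    raw_bits_per_day = MBPS * 1e6 * SECONDS_PER_DAY
    raw_gb_per_day = raw_bits_per_day / 8 / 1e9

    print(round(raw_gb_per_day, 1))        # → 67.0
    print(round(600 / raw_gb_per_day, 1))  # → 9.0, output vs. raw volume
    ```
    
    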

  16. The American and His Environment--A Social Sciences Course. Project Reports, Volume 2, The Rachel Carson Project.

    ERIC Educational Resources Information Center

    Tanner, R. Thomas

    This document is the second of seven volumes included in the Rachel Carson Project. The project attempts to introduce environmental lessons and units into existing courses of study within a high school rather than to implement environmental education through the introduction of new courses. This volume focuses on the social science area by…

  17. The Data Science Landscape

    NASA Astrophysics Data System (ADS)

    Mentzel, C.

    2017-12-01

    Modern scientific data continue to increase in volume, variety, and velocity, and though the hype of big data has subsided, its usefulness for scientific discovery has only just begun. Harnessing these data for new insights, more efficient decision making, and other mission critical uses requires a combination of skills and expertise, often labeled data science. Data science can be thought of as a combination of statistics, computation, and the domain to which the data relate, and so is a true interdisciplinary pursuit. Though it has reaped large benefits in companies able to afford the high cost of the severely limited talent pool, it suffers from lack of support in mission-driven organizations. Because it fits squarely within no one historical field, data science has proven difficult to house in traditional university academic departments and other research organizations. The landscape of data science efforts, from academia, industry, and government, can be characterized as nascent, enthusiastic, uneven, and highly competitive. Part of the challenge in documenting these trends is the lack of agreement about what data science is, and who is a data scientist. Defining these terms too closely and too early runs the risk of cutting off a tremendous amount of productive creativity, but waiting too long leaves many people without a sustainable career, and many organizations without the necessary skills to gain value from their data. This talk will explore the landscape of data science efforts in the US, including how organizations are building and sustaining data science teams.

  18. MSL-RAD Cruise Operations Concept

    NASA Technical Reports Server (NTRS)

    Brinza, David E.; Zeitlin, Cary; Hassler, Donald; Weigle, Gerald E.; Boettcher, Stephan; Martin, Cesar; Wimmer-Schweingrubber, Robert

    2012-01-01

    The Mars Science Laboratory (MSL) payload includes the Radiation Assessment Detector (RAD) instrument, intended to fully characterize the radiation environment for the MSL mission. The RAD instrument operations concept is intended to reduce impact to spacecraft resources and effort for the MSL operations team. By design, RAD autonomously performs regular science observations without the need for frequent commanding from the Rover Compute Element (RCE). RAD operates with pre-defined "sleep" and "observe" periods, with an adjustable duty cycle for meeting power and data volume constraints during the mission. At the start of a new science observation, RAD performs a pre-observation activity to assess count rates for selected RAD detector elements. Based on this assessment, RAD can enter "solar event" mode, in which instrument parameters (including observation duration) are selected to more effectively characterize the environment. At the end of each observation period, RAD stores a time-tagged, fixed length science data packet in its non-volatile mass memory storage. The operating cadence is defined by adjustable parameters, also stored in non-volatile memory within the instrument. Periodically, the RCE executes an on-board sequence to transfer RAD science data packets from the instrument mass storage to the MSL downlink buffer. Infrequently, the RAD instrument operating configuration is modified by updating internal parameter tables and configuration entries.

  19. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  20. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
