Sample records for institutions computer science

  1. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1986 through September 30, 1986 is summarized.

  2. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  3. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  4. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  5. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  6. Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  7. Activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.

  8. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1984 through March 31, 1985 is summarized.

9. Research Conducted at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.

  10. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1988-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 2, 1987 through March 31, 1988.

11. Activities of Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1999-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1999 through September 30, 1999.

  12. Research in progress at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  13. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

14. Research Conducted at the Institute for Computer Applications in Science and Engineering for the Period October 1, 1999 through March 31, 2000

    NASA Technical Reports Server (NTRS)

    Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, computer science, fluid mechanics, and structures and materials during the period October 1, 1999 through March 31, 2000.

  15. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space.

  16. Institutional computing (IC) information session

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.

  17. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  18. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1993-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  19. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  20. Ignoring a Revolution in Military Affairs: The Need to Create a Separate Branch of the Armed Forces for Cyber Warfare

    DTIC Science & Technology

    2017-06-09

those with talent in the computer sciences. Upon graduation from high school, computer-proficient teenagers are selected for an elite cyber force and... Arguably, the Massachusetts Institute of Technology (M.I.T.) is the premier institution for computer science. M.I.T. graduates make, on average, $83,455... study specific to computer science and provide certification in programs like ethical hacking, cyber security, and programming. As with the other

  1. Institute for Computer Sciences and Technology. Annual Report FY 1986.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    Activities of the Institute for Computer Sciences and Technology (ICST) within the U.S. Department of Commerce during fiscal year 1986 are described in this annual report, which summarizes research and publications by ICST in the following areas: (1) standards and guidelines for computer security, including encryption and message authentication…

  2. Introduction to USRA

    NASA Technical Reports Server (NTRS)

    Davis, M. H. (Editor); Singy, A. (Editor)

    1994-01-01

    The Universities Space Research Association (USRA) was incorporated 25 years ago in the District of Columbia as a private nonprofit corporation under the auspices of the National Academy of Sciences. Institutional membership in the association has grown from 49 colleges and universities, when it was founded, to 76 in 1993. USRA provides a mechanism through which universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology and to promote education in these areas. Its mission is carried out through the institutes, centers, divisions, and programs that are described in detail in this booklet. These include the Lunar and Planetary Institute, the Institute for Computer Applications in Science and Engineering (ICASE), the Research Institute for Advanced Computer Science (RIACS), and the Center of Excellence in Space Data and Information Sciences (CESDIS).

  3. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

4. Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  5. Kenny Gruchalla | NREL

    Science.gov Websites

feature extraction, human-computer interaction, and physics-based modeling. Professional experience since 2009. Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology

  6. Annotated Computer Output for Illustrative Examples of Clustering Using the Mixture Method and Two Comparable Methods from SAS.

    DTIC Science & Technology

    1987-06-26

...introduction to the use of mixture models in clustering. Cornell University Biometrics Unit Technical Report BU-920-M and Mathematical Sciences Institute... mixture method and two comparable methods from SAS. Cornell University Biometrics Unit Technical Report BU-921-M and Mathematical Sciences Institute

  7. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  8. Brief History of Computer-Assisted Instruction at the Institute for Mathematical Studies in the Social Sciences.

    ERIC Educational Resources Information Center

    Stanford Univ., CA. Inst. for Mathematical Studies in Social Science.

    In 1963, the Institute began a program of research and development in computer-assisted instruction (CAI). Their efforts have been funded at various times by the Carnegie Corporation of New York, The National Science Foundation and the United States Office of Education. Starting with a medium-sized computer and six student stations, the Institute…

  9. An Assessment of Computer Science Degree Programs in Virginia. A Report to the Council of Higher Education and Virginia's State-Supported Institutions of Higher Education.

    ERIC Educational Resources Information Center

    Virginia State Council of Higher Education, Richmond.

    This report presents the results of a review of all significant instructional efforts in the computer science discipline in Virginia institutions of higher education, with emphasis on those whose instructional activities constitute complete degree programs. The report is based largely on information provided by the institutions in self-studies. A…

  10. Unclassified Computing Capability: User Responses to a Multiprogrammatic and Institutional Computing Questionnaire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M; Kissel, L

    2002-01-29

We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications that have been developed over the past five plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open source software and because it will use PC (IA32-P4) hardware. This incurs significant computer science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions are listed and the responses received are in this report.

  11. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  12. Center for Computing Research Summer Research Proceedings 2015.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  13. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems was initiated. Research projects for 1984 and 1985 are summarized.

14. Activities of Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    This report summarizes research conducted at ICASE in applied mathematics, fluid mechanics, computer science, and structures and material sciences during the period April 1, 2000 through September 30, 2000.

  15. ICASE semiannual report, April 1 - September 30, 1989

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Institute conducts unclassified basic research in applied mathematics, numerical analysis, and computer science in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space. The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers. ICASE reports are considered to be primarily preprints of manuscripts that have been submitted to appropriate research journals or that are to appear in conference proceedings.

  16. 75 FR 15675 - Professional Research Experience Program in Chemical Science and Technology Laboratory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... in physics, chemistry, mathematics, computer science, or engineering. Institutions should have a 4..., mathematics, computer science, or engineering with work experiences in laboratories or other settings...-0141-01] Professional Research Experience Program in Chemical Science and Technology Laboratory...

  17. Case Studies of Liberal Arts Computer Science Programs

    ERIC Educational Resources Information Center

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  18. Entrepreneurial Health Informatics for Computer Science and Information Systems Students

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Narula, Stuti

    2014-01-01

    Corporate entrepreneurship is a critical area of curricula for computer science and information systems students. Few institutions of computer science and information systems have entrepreneurship in the curricula however. This paper presents entrepreneurial health informatics as a course in a concentration of Technology Entrepreneurship at a…

  19. Assessment of Examinations in Computer Science Doctoral Education

    ERIC Educational Resources Information Center

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types…

  20. Current research activities: Applied and numerical mathematics, fluid mechanics, experiments in transition and turbulence and aerodynamics, and computer science

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.

  1. Mathematics and Computer Science | Argonne National Laboratory

    Science.gov Websites

Genomics and Systems Biology; LCRC (Laboratory Computing Resource Center); MCSG (Midwest Center for Structural Genomics); NAISE (Northwestern-Argonne Institute of Science & Engineering); SBC (Structural Biology Center)

  2. Mobile Learning in a Large Blended Computer Science Classroom: System Function, Pedagogies, and Their Impact on Learning

    ERIC Educational Resources Information Center

    Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin

    2009-01-01

    The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded…

  3. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  4. Semiannual report

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1994 - 31 Mar. 1995.

  5. Snatching Defeat from the Jaws of Victory: When Good Projects Go Bad. Girls and Computer Science.

    ERIC Educational Resources Information Center

    Sanders, Jo

In week-long sessions in the summers of 1997, 1998, and 1999, the 6APT (Summer Institute in Computer Science for Advanced Placement Teachers) project taught 240 high school teachers of Advanced Placement Computer Science (APCS) about gender equity in computers. Teachers were then followed through 2000. Results indicated that while teachers did…

  6. Summer Institute to Train Data Processing Teachers for the New Oklahoma State-Wide Computer Science System, Phase II. Final Report.

    ERIC Educational Resources Information Center

    Tuttle, Francis

    Twenty-three instructors participated in an 8-week summer institute to develop their technical competency to teach the second year of a 2-year Technical Education Computer Science Program. Instructional material covered the following areas: (1) compiler languages and systems design, (2) cost studies, (3) business organization, (4) advanced…

  7. Computer Science and Technology Publications. NBS Publications List 84.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  8. Multilinear Computing and Multilinear Algebraic Geometry

    DTIC Science & Technology

    2016-08-10

landmark paper titled "Most tensor problems are NP-hard" (see [14] in Section 3) in the Journal of the ACM, the premier journal in Computer Science... "Higher-order cone programming," Machine Learning Thematic Trimester, International Centre for Mathematics and Computer Science, Toulouse, France... geometry-and-data-analysis • 2014 SIMONS INSTITUTE WORKSHOP: Workshop on Tensors in Computer Science and Geometry, University of California, Berkeley, CA

  9. Summary of research in progress at ICASE

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1992 through March 31, 1993.

  10. Computers, Electronic Networking and Education: Some American Experiences.

    ERIC Educational Resources Information Center

    McConnell, David

    1991-01-01

Describes new developments in distributed educational computing at Massachusetts Institute of Technology (MIT, "Athena"), Carnegie Mellon University ("Andrew"), Brown University ("Intermedia"), Electronic University Network (California), Western Behavioral Sciences Institute (California), and University of California,…

  11. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general form recursive equations, the methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include: a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. Among scientific problems that require high-performance computing resources, completed projects include physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure, uniting computing clusters of scientific and educational institutions of the country, and provides the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.

  12. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  13. Determining the Effectiveness of the 3D Alice Programming Environment at the Computer Science I Level

    ERIC Educational Resources Information Center

    Sykes, Edward R.

    2007-01-01

    Student retention in Computer Science is becoming a serious concern among Educators in many colleges and universities. Most institutions currently face a significant drop in enrollment in Computer Science. A number of different tools and strategies have emerged to address this problem (e.g., BlueJ, Karel Robot, etc.). Although these tools help to…

  14. Semiannual final report, 1 October 1991 - 31 March 1992

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period 1 Oct. 1991 through 31 Mar. 1992 is presented.

  15. Institutional Computing Executive Group Review of Multi-programmatic & Institutional Computing, Fiscal Year 2005 and 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, S; Rotman, D; Schwegler, E

    The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflect the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high-performance processors within a shared-memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL.
    We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.

  16. Science | Argonne National Laboratory

    Science.gov Websites

    Argonne National Laboratory website sections: Publications, Researchers, Postdocs, Exascale Computing, Institute for Molecular Engineering at Argonne. Argonne research works to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels.

  17. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  18. Integrating Mobile Robotics and Vision with Undergraduate Computer Science

    ERIC Educational Resources Information Center

    Cielniak, G.; Bellotto, N.; Duckett, T.

    2013-01-01

    This paper describes the integration of robotics education into an undergraduate Computer Science curriculum. The proposed approach delivers mobile robotics as well as covering the closely related field of Computer Vision and is directly linked to the research conducted at the authors' institution. The paper describes the most relevant details of…

  19. Stuck in the Shallow End: Education, Race, and Computing. Updated Edition

    ERIC Educational Resources Information Center

    Margolis, Jane

    2017-01-01

    The number of African Americans and Latino/as receiving undergraduate and advanced degrees in computer science is disproportionately low. And relatively few African American and Latino/a high school students receive the kind of institutional encouragement, educational opportunities, and preparation needed for them to choose computer science as a…

  20. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management.
    The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  1. ICASE

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.

  2. Footstep Planning on Uneven Terrain with Mixed-Integer Convex Optimization

    DTIC Science & Technology

    2014-08-01

    Massachusetts Institute of Technology, Computer Science and Artificial Intelligence Laboratory, Cambridge, MA 02139… the MIT Energy Initiative, MIT CSAIL, and the DARPA Robotics Challenge. Robin Deits is with the Computer Science and Artificial Intelligence Laboratory.

  3. UMIST, IDN, NTUA, TUM, ULB: A Successful European Exchange Programme.

    ERIC Educational Resources Information Center

    Borne, Pierre; Singh, Madan G.

    1989-01-01

    Describes the exchange programs that existed for a decade in the fields of automatic control and computer science including the University of Manchester Institute of Science and Technology, the "Institut Industriel du Nord," the National Technical University of Athens, the Technical University of Munich, and the Free University of…

  4. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to deliver advanced visualization solutions to SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists' ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunity to work on some of the most challenging science applications and gained access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute's outreach efforts, through publications, workshops, and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  5. The Student/Library Computer Science Collaborative

    ERIC Educational Resources Information Center

    Hahn, Jim

    2015-01-01

    With funding from an Institute of Museum and Library Services demonstration grant, librarians of the Undergraduate Library at the University of Illinois at Urbana-Champaign partnered with students in computer science courses to design and build student-centered mobile apps. The grant work called for demonstration of student collaboration…

  6. Provision of Information to the Research Staff.

    ERIC Educational Resources Information Center

    Williams, Martha E.

    The Information Sciences section at Illinois Institute of Technology Research Institute (IITRI) is now operating a Computer Search Center (CSC) for handling numerous machine-readable data bases. The computer programs are generalized in the sense that they will handle any incoming data base. This is accomplished by means of a preprocessor system…

  7. Building a Science Software Institute: Synthesizing the Lessons Learned from the ISEES and WSSI Software Institute Conceptualization Efforts

    NASA Astrophysics Data System (ADS)

    Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.

    2014-12-01

    The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures that support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts: the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by the National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating due to the complementarity of their approaches and the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software, such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.

  8. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute of Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers.
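    The abstract above describes a Python middleware layer that distributes image-processing jobs across a PBS grid. As a minimal sketch of that pattern (hypothetical code, not the actual DAX API; the job name, walltime, and `run_pipeline` command are invented for illustration), such a layer can assemble a PBS batch script as plain text before handing it to the scheduler:

    ```python
    # Hypothetical sketch of PBS batch-script generation, as a middleware
    # layer might do before submitting a job with `qsub`. Names are invented.

    def build_pbs_script(job_name, walltime, command):
        """Return the text of a portable batch scripting (PBS) job file."""
        return "\n".join([
            "#!/bin/bash",
            f"#PBS -N {job_name}",           # job name shown in the queue
            f"#PBS -l walltime={walltime}",  # wall-clock limit for the scheduler
            "#PBS -l nodes=1:ppn=1",         # one core on one node
            command,                         # the actual processing command
        ])

    script = build_pbs_script("fmri_preproc", "02:00:00",
                              "run_pipeline --scan /data/scan001")
    print(script.splitlines()[1])  # → #PBS -N fmri_preproc
    ```

    Generating the script as plain text keeps the middleware decoupled from any particular scheduler client library; submission then reduces to invoking `qsub` on the generated file.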

  9. Understanding Initial Undergraduate Expectations and Identity in Computing Studies

    ERIC Educational Resources Information Center

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-01-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering in which industry needs are high but so are…

  10. Improving the Human Hazard Characterization of Chemicals: A Tox21 Update

    EPA Science Inventory

    Background: In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency’s National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health ...

  11. Semiannual report, 1 April - 30 September 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software for parallel computers. Research in these areas is discussed.

  12. Unlocking the Barriers to Women and Minorities in Computer Science and Information Systems Studies: Results from a Multi-Methodological Study Conducted at Two Minority Serving Institutions

    ERIC Educational Resources Information Center

    Buzzetto-More, Nicole; Ukoha, Ojiabo; Rustagi, Narendra

    2010-01-01

    The underrepresentation of women and minorities in undergraduate computer science and information systems programs is a pervasive and persistent problem in the United States. Needed is a better understanding of the background and psychosocial factors that attract, or repel, minority students from computing disciplines. An examination of these…

  13. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  14. Translations on USSR Science and Technology Physical Sciences and Technology No. 18

    DTIC Science & Technology

    1977-09-19

    …and Avetik Gukasyan discuss component arrangement alternatives. (COPYRIGHT: Notice not available. 8545 CSO: 1870.) Under "Cybernetics, Computers and Automation Technology," the "Proyekt" computer-assisted design system addresses problems with which designers throughout the world are struggling. The "Proyekt" system, produced in the Institute of Cybernetics, assists in automating the design and manufacture of…

  15. Undergraduate Research in Physics as a course for Engineering and Computer Science Majors

    NASA Astrophysics Data System (ADS)

    O'Brien, James; Rueckert, Franz; Sirokman, Greg

    2017-01-01

    Undergraduate research has become increasingly integral to the functioning of higher educational institutions. At many institutions undergraduate research is conducted as capstone projects in the pure sciences; however, science faculty at some schools (including the authors') face the challenge of not having science majors. Even at these institutions, a select population of high-achieving engineering students will often express a keen interest in conducting pure science research. Since a foray into science research gives the student full exposure to the scientific method and scientific collaboration, the experience can be quite rewarding and beneficial to the development of the student as a professional. To this end, the authors have been working to find new contexts in which to offer research experiences to non-science majors, including a new undergraduate research class conducted by physics and chemistry faculty. An added benefit is that these courses are inherently interdisciplinary. Students in the engineering and computer science fields step into physics and chemistry labs to solve science problems, often invoking their own relevant expertise. In this paper we start by discussing the common themes and outcomes of the course. We then discuss three particular projects that were conducted with engineering students and focus on how the undergraduate research experience enhanced their already rigorous engineering curriculum.

  16. Experiences of Computer Science Curriculum Design: A Phenomenological Study

    ERIC Educational Resources Information Center

    Sloan, Arthur; Bowe, Brian

    2015-01-01

    This paper presents a qualitative study of 12 computer science lecturers' experiences of curriculum design of several degree programmes during a time of transition from year-long to semesterised courses, due to institutional policy change. The background to the study is outlined, as are the reasons for choosing the research methodology. The main…

  17. Summer Institute for Physical Science Teachers

    NASA Astrophysics Data System (ADS)

    Maheswaranathan, Ponn; Calloway, Cliff

    2007-04-01

    A summer institute for physical science teachers was conducted at Winthrop University, June 19-29, 2006. Ninth grade physical science teachers at schools within a 50-mile radius from Winthrop were targeted. We developed a graduate level physics professional development course covering selected topics from both the physics and chemistry content areas of the South Carolina Science Standards. Delivery of the material included traditional lectures and the following new approaches in science teaching: hands-on experiments, group activities, computer based data collection, computer modeling, with group discussions & presentations. Two experienced master teachers assisted us during the delivery of the course. The institute was funded by the South Carolina Department of Education. The requested funds were used for the following: faculty salaries, the University contract course fee, some of the participants' room and board, startup equipment for each teacher, and indirect costs to Winthrop University. Startup equipment included a Pasco stand-alone, portable Xplorer GLX interface with sensors (temperature, voltage, pH, pressure, motion, and sound), and modeling software (Wavefunction's Spartan Student and Odyssey). What we learned and ideas for future K-12 teacher preparation initiatives will be presented.

  18. United States Air Force Summer Faculty Research Program (1987). Program Technical Report. Volume 1.

    DTIC Science & Technology

    1987-12-01

    Participating faculty from Rose-Hulman Institute of Technology, 5500 Wabash Avenue, Terre Haute, IN 47803, (812) 877-1511: a professor of Mechanical Engineering (specialty: Engineering Science), assigned to APL; a professor/director (1973) in the Dept. of Humanities (specialty: Literature/Language), assigned to HRL/LR; and an assistant professor (1976) in the Dept. of Computer Science (specialty: Computer Science), assigned to AL.

  19. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing, and scientific collaboration stimulated us to think over these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for the enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  20. 15 CFR Supplement No. 7 to Part 748 - Authorization Validated End-User (VEU): List of Validated End-Users, Respective Items Eligible...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., RaycomInfotech, Park Tower C, No. 2 Science Institute South Rd., Zhong Guan Cun, Haidian District... C, No. 2 Science Institute South Rd., Zhong Guan Cun, Haidian District, Beijing, China 100190 75 FR...(limited to technology for computer products or components not exceeding an adjusted peak performance (APP...

  1. A Study to Determine the Basic Science and Mathematics Topics Most Needed by Engineering Technology Graduates of Wake Technical Institute in Performing Job Duties.

    ERIC Educational Resources Information Center

    Edwards, Timothy I.; Roberson, Clarence E., Jr.

    A survey of 470 graduates of the six engineering technology programs at Wake Technical Institute--Architectural, Chemical, Civil Engineering, Computer, Electronic Engineering, and Industrial Engineering Technologies--and 227 of their employers was conducted in October, 1979, to determine the science and mathematics topics most needed by…

  2. Fundamental Computer Science Conceptual Understandings for High School Students Using Original Computer Game Design

    ERIC Educational Resources Information Center

    Ernst, Jeremy V.; Clark, Aaron C.

    2012-01-01

    In 2009, the North Carolina Virtual Public Schools worked with researchers at the William and Ida Friday Institute to produce and evaluate the use of game creation by secondary students as a means for learning content related to career awareness in Science, Technology, Engineering and Mathematics (STEM) disciplines, with particular emphasis in…

  3. Institutional Research Productivity in Science Education for the 1990s: Top 30 Rankings

    NASA Astrophysics Data System (ADS)

    Barrow, Lloyd H.; Settlage, John; Germann, Paul J.

    2008-08-01

    The purpose of this study was to identify the major science education programs in the United States and where science education researchers published their research. This research is the first study of the scholarly productivity of science education programs at domestic institutions of higher education. Each issue of the eight research journals (Journal of Research in Science Teaching, Science Education, International Journal of Science Education, Journal of Science Teacher Education, School Science and Mathematics, Journal of Computers in Math and Science Teaching, Journal of Science Education and Technology, and Journal of Elementary Science Education) published in the 1990s provided the author(s) and their institutional affiliation. The resultant rankings of raw and weighted counts for the top 30 science education programs show variation across the journals where research was published. Overall, whether using total publication counts (raw) or weighted ratings, there was 90% agreement among the top 10 programs and 70% agreement among the bottom 10. Potential explanations for the variations and uses for the rankings are discussed.

  4. Collective Properties of Neural Systems and Their Relation to Other Physical Models

    DTIC Science & Technology

    1988-08-05

    …been computed explicitly. This has been achieved algorithmically by utilizing methods introduced earlier. It should be emphasized that in addition to… Research Institute for Mathematical Sciences, Kyoto University, Kyoto 606, Japan, and E. Barouch, Department of Mathematics and Computer Science, Clarkson University, where this work was collaborated on. References: 1. Babu, S. V. and Barouch, E., An exact solution for the…

  5. Research 1970/1971: Annual Progress Report.

    ERIC Educational Resources Information Center

    Georgia Inst. of Tech., Atlanta. Science Information Research Center.

    The report presents a summary of science information research activities of the School of Information and Computer Science, Georgia Institute of Technology. Included are project reports on interrelated studies in science information, information processing and systems design, automata and systems theories, and semiotics and linguistics. Also…

  6. University of Washington's eScience Institute Promotes New Training and Career Pathways in Data Science

    NASA Astrophysics Data System (ADS)

    Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.

    2015-12-01

    Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.

  7. A survey of students' ethical attitudes using computer-related scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanchey, C.M.; Kingsbury, J.

    Many studies exist that examine the ethical beliefs and attitudes of university students attending medium or large institutions. There are also many studies which examine the ethical attitudes and beliefs of computer science and computer information systems majors. None, however, examines the ethical attitudes of university students (regardless of undergraduate major) at a small, Christian, liberal arts institution regarding computer-related situations. This paper will present data accumulated by an on-going study in which students are presented seven scenarios--all of which involve some aspect of computing technology. These students were randomly selected from a small, Christian, liberal arts university.

  8. Cumulative reports and publications through December 31, 1991

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A reports and publications list is given from the Institute for Computer Applications in Science and Engineering (ICASE) through December 31, 1991. The major categories of the current ICASE research program are; numerical methods, control and parameter identification problems, computational problems in engineering and the physical sciences, and computer systems and software. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when available.

  9. Designing a Versatile Dedicated Computing Lab to Support Computer Network Courses: Insights from a Case Study

    ERIC Educational Resources Information Center

    Gercek, Gokhan; Saleem, Naveed

    2006-01-01

    Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…

  10. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  11. Computational Intelligence and Its Impact on Future High-Performance Engineering Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1996-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computations. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the Computational intelligence area and to provide guidelines for future research.

  12. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    PubMed Central

    Caudill, Lester; Hill, April; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics. PMID:20810953

  13. Impact of Interdisciplinary Undergraduate Research in mathematics and biology on the development of a new course integrating five STEM disciplines.

    PubMed

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics.

  14. Computer ethics education: Impact from societal norms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G.B.

    1994-12-31

    Discussions have occurred on the best way to implement the horizontal and vertical integration of education on the social, ethical and professional issues relating to computer science. These discussions have not only included debates on the subject matter and what manner to approach it (i.e. integrated among all computer science courses taught, as a separate required course, or a combination of both), but have also involved debates over who is best qualified to address the subject. What has seldom been addressed, however, is how societal impressions of what is ethical have impacted both those who develop software and those who use it. In light of the experience of such institutions as the U.S. Air Force Academy, which recently instituted a program called the Center for Character Development (due to a perceived erosion of the core values of its recruits), should academia and industry expect more from computer scientists than from the population as a whole? It is the integration of ethics courses in the computer science curriculum in light of a general erosion of ethical values in society as a whole that is addressed in this paper.

  15. Can a tablet device alter undergraduate science students' study behavior and use of technology?

    PubMed

    Morris, Neil P; Ramsay, Luke; Chauhan, Vikesh

    2012-06-01

    This article reports findings from a study investigating undergraduate biological sciences students' use of technology and computer devices for learning and the effect of providing students with a tablet device. A controlled study was conducted to collect quantitative and qualitative data on the impact of a tablet device on students' use of devices and technology for learning. Overall, we found that students made extensive use of the tablet device for learning, using it in preference to laptop computers to retrieve information, record lectures, and access learning resources. In line with other studies, we found that undergraduate students only use familiar Web 2.0 technologies and that the tablet device did not alter this behavior for the majority of tools. We conclude that undergraduate science students can make extensive use of a tablet device to enhance their learning opportunities without institutions changing their teaching methods or computer systems, but that institutional intervention may be needed to drive changes in student behavior toward the use of novel Web 2.0 technologies.

  16. Mentoring the Next Generation of Science Gateway Developers and Users

    NASA Astrophysics Data System (ADS)

    Hayden, L. B.; Jackson-Ward, F.

    2016-12-01

    The Science Gateway Institute (SGW-I) for the Democratization and Acceleration of Science was an SI2-SSE Collaborative Research conceptualization award funded by NSF in 2012. From 2012 through 2015, we engaged interested members of the science and engineering community in a planning process for a Science Gateway Community Institute (SGCI). Science Gateways provide Web interfaces to some of the most sophisticated cyberinfrastructure resources. They interact with remotely executing science applications on supercomputers, connect to remote scientific data collections, instruments, and sensor streams, and support large collaborations. Gateways allow scientists to concentrate on the most challenging science problems while underlying components such as computing architectures and interfaces to data collections change. The goal of our institute was to provide coordinating activities across the National Science Foundation, eventually providing services more broadly to projects funded by other agencies. SGW-I has succeeded in identifying two underrepresented communities of future gateway designers and users. The Association of Computer and Information Science/Engineering Departments at Minority Institutions (ADMI) was identified as a source of future gateway designers. The National Organization for the Professional Advancement of Black Chemists and Chemical Engineers (NOBCChE) was identified as a community of future science gateway users. SGW-I efforts to engage NOBCChE and ADMI faculty and students are now woven into the workforce development component of SGCI. SGCI (ScienceGateways.org) is a collaboration of six universities, led by the San Diego Supercomputer Center. The workforce development component is led by Elizabeth City State University (ECSU). ECSU's efforts focus on producing a model of engagement, integrating research into education, and mentoring students while aggressively addressing diversity.
This paper documents the outcome of the SGW-I conceptualization project and describes the extensive Workforce Development effort going forward into the 5-year SGCI project recently funded by NSF.

  17. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  18. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  19. Association of Small Computer Users in Education (ASCUE) Summer Conference. Proceedings (25th, North Myrtle Beach, South Carolina, June 21-25, 1992).

    ERIC Educational Resources Information Center

    Association of Small Computer Users in Education, Greencastle, IN.

    Forty-three papers from a conference on microcomputers are presented under the following headings: Computing in the Curriculum; Information and Computer Science; Institutional and Administrative Computing; and Management, Services, and Training. Topics of the papers include the following: telecommunications projects that work in…

  20. The Computing Alliance of Hispanic-Serving Institutions: Supporting Hispanics at Critical Transition Points

    ERIC Educational Resources Information Center

    Gates, Ann Quiroz; Hug, Sarah; Thiry, Heather; Alo, Richard; Beheshti, Mohsen; Fernandez, John; Rodriguez, Nestor; Adjouadi, Malek

    2011-01-01

    Hispanics have the highest growth rates among all groups in the U.S., yet they remain considerably underrepresented in computing careers and in the numbers who obtain advanced degrees. Hispanics constituted about 7% of undergraduate computer science and computer engineering graduates and 1% of doctoral graduates in 2007-2008. The small number of…

  1. Architecture of a Message-Driven Processor,

    DTIC Science & Technology

    1987-11-01

    Jon Kaplan, Paul Song, Brian Totty, and Scott Wills Artificial Intelligence Laboratory -4 Laboratory for Computer Science Massachusetts Institute of...Information Dally, Chao, Chien, Hassoun, Horwat, Kaplan, Song, Totty & Wills: Artificial Intelligence i Laboratory and Laboratory for Computer Science, MIT...applied to a problem if we could are 36 bits long (32 data bits + 4 tag bits) and are used to hold efficiently run programs with a granularity of 5s

  2. Expanding capacity and promoting inclusion in introductory computer science: a focus on near-peer mentor preparation and code review

    NASA Astrophysics Data System (ADS)

    Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey

    2017-01-01

    A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on regular, consistent feedback via peer code review and inclusive pedagogy. Introductory computer science students provided consistently high ratings of the peer mentors' knowledge, approachability, and flexibility, and credited peer mentor meetings for their strengthened self-efficacy and understanding. Peer mentors noted the value of videotaped simulations with reflection, discussions of inclusion, and the cohort's weekly practicum for improving practice. Adaptations of peer mentoring for different types of institutions are discussed. Computer science educators, with hopes of improving the recruitment and retention of underrepresented groups, can benefit from expanding their peer support infrastructure and improving the quality of peer mentor preparation.

  3. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series.
The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  4. An analysis of United States K-12 stem education versus STEM workforce at the dawn of the digital revolution

    NASA Astrophysics Data System (ADS)

    Cataldo, Franca

    The world is at the dawn of a third industrial revolution, the digital revolution, that brings great changes the world over. Today, computing devices, the Internet, and the World Wide Web are vital technology tools that affect every aspect of everyday life and success. While computing technologies offer enormous benefits, there are equally enormous safety and security risks that have been growing exponentially since they became widely available to the public in 1994. Cybercriminals are increasingly implementing sophisticated and serious hack attacks and breaches upon our nation's government, financial institutions, organizations, communities, and private citizens. There is a great need for computer scientists to carry America's innovation and economic growth forward and for cybersecurity professionals to keep our nation safe from criminal hacking. In this digital age, computer science and cybersecurity are essential foundational ingredients of technological innovation, economic growth, and cybersecurity that span all industries. Yet, America's K-12 education institutions are not teaching the computer science and cybersecurity skills required to produce a technologically-savvy 21st century workforce. Education is the key to preparing students to enter the workforce and, therefore, American K-12 STEM education must be reformed to accommodate the teachings required in the digital age. Keywords: Cybersecurity Education, Cybersecurity Education Initiatives, Computer Science Education, Computer Science Education Initiatives, 21st Century K-12 STEM Education Reform, 21st Century Digital Literacies, High-Tech Innovative Problem-Solving Skills, 21st Century Digital Workforce, Standardized Testing, Foreign Language and Culture Studies, Utica College, Professor Chris Riddell.

  5. Designing a Curriculum for Computer Students in the Community College.

    ERIC Educational Resources Information Center

    Kolatis, Maria

    An overview is provided of the institutional and technological factors to be considered in designing or updating a computer science curriculum at the community college level. After underscoring the importance of the computer in today's society, the paper identifies and discusses the following considerations in curriculum design: (1) the mission of…

  6. Enhancing Student Writing and Computer Programming with LATEX and MATLAB in Multivariable Calculus

    ERIC Educational Resources Information Center

    Sullivan, Eric; Melvin, Timothy

    2016-01-01

    Written communication and computer programming are foundational components of an undergraduate degree in the mathematical sciences. All lower-division mathematics courses at our institution are paired with computer-based writing, coding, and problem-solving activities. In multivariable calculus we utilize MATLAB and LATEX to have students explore…

  7. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines.
There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. 
We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  8. Comments from the Science Education Directorate, National Science Foundation: CAUSE, ISEP, and LOCI: Three-Program Approach to College-Level Science Improvement. II. Patterns and Problems.

    ERIC Educational Resources Information Center

    Erickson, Judith B.; And Others

    1980-01-01

    Discusses patterns resulting from the monitor of science education proposals which may reflect problems or differing perceptions of NSF. Discusses these areas: proposal submissions from two-year institutions and social and behavioral scientists, trends in project content at the academic-industrial interface and in computer technology, and…

  9. The Computer-Job Salary Picture.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1987-01-01

    Discusses starting salaries for graduates with various degrees in computer science and electrical engineering. Summarizes the results of a recent study by the Institute of Electrical and Electronics Engineers (IEEE) which provides salary estimates for graduates in different specialties and in different geographical locations. (TW)

  10. Cumulative reports and publications

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports is given. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. The major categories of the current ICASE research program are: applied and numerical mathematics, including numerical analysis and algorithm development; theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and computer science.

  11. Operation of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.

  12. 24 CFR 570.416 - Hispanic-serving institutions work study program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to pre-professional careers in these fields. (b) Definitions. The following definitions apply to HSI... such as natural sciences, computer sciences, mathematics, accounting, electronics, engineering, and the... pursuing careers in community building, and make them aware of the availability of assistance opportunities...

  13. Access to Supercomputers. Higher Education Panel Report 69.

    ERIC Educational Resources Information Center

    Holmstrom, Engin Inel

    This survey was conducted to provide the National Science Foundation with baseline information on current computer use in the nation's major research universities, including the actual and potential use of supercomputers. Questionnaires were sent to 207 doctorate-granting institutions; after follow-ups, 167 institutions (91% of the institutions…

  14. Centre for Research Infrastructure of Polish GNSS Data - response and possible contribution to EPOS

    NASA Astrophysics Data System (ADS)

    Araszkiewicz, Andrzej; Rohm, Witold; Bosy, Jaroslaw; Szolucha, Marcin; Kaplon, Jan; Kroszczynski, Krzysztof

    2017-04-01

    In the frame of the first call under Action 4.2: Development of modern research infrastructure of the science sector in the Smart Growth Operational Programme 2014-2020, the "EPOS-PL" project was launched in late 2016. The following institutes are responsible for the implementation of this project: Institute of Geophysics, Polish Academy of Sciences (Project Leader); Academic Computer Centre Cyfronet, AGH University of Science and Technology; Central Mining Institute; the Institute of Geodesy and Cartography; Wrocław University of Environmental and Life Sciences; and the Military University of Technology. In addition, resources constituting the entrepreneur's own contribution will come from the Polish Mining Group. Research Infrastructure EPOS-PL will integrate both existing and newly built National Research Infrastructures (Theme Centre for Research Infrastructures), which, under the premise of the program EPOS, are financed exclusively by national funds. In addition, the e-science platform will be developed. The Centre for Research Infrastructure of GNSS Data (CIBDG - Task 5) will be built based on the experience and facilities of two institutions: the Military University of Technology and Wrocław University of Environmental and Life Sciences. The project includes the construction of the National GNSS Repository with data QC procedures and the adaptation of two Regional GNSS Analysis Centres for rapid and long-term geodynamical monitoring.

  15. Analogical Processes in Learning

    DTIC Science & Technology

    1980-09-15

Stillwater, MN 55082 1200 19th Street NW 1 Dr. Genevieve Haddad Washington, DC 20208 1 Mr. Avron Barr Program Manager Department of Computer Science Life ...Jack A. Thorp, Maj., USAF 1 Dr. Kenneth Bowles Life Sciences Directorate 1 Dr. Andrew R. Molnar Institute for Information Sciences AFOSR Science... University OGTI 31 1 Dr. Frank Withrow Stanford University Arlington Annex U.S. Office of Education Stanford, CA 94305 Columbia Pike at Arlington Ridge Rd

  16. Cumulative reports and publications through December 31, 1989

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A complete list of reports from the Institute for Computer Applications in Science and Engineering (ICASE) is presented. The major categories of the current ICASE research program are: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effectual numerical methods; computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, structural analysis, and chemistry; computer systems and software, especially vector and parallel computers, microcomputers, and data management. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available.

  17. 78 FR 42976 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Heterogeneous...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... Computer Science and Engineering, Seoul, REPUBLIC OF KOREA; Missouri University of Science and Technology, Rolla, MO; Industrial Technology Research Institute of Taiwan, Chutung, Hsinchu, TAIWAN, Northeastern... activity of the group research project. Membership in this group research project remains open, and HSA...

  18. Foreign Science and Engineering Presence in U.S. Institutions and the Labor Force

    DTIC Science & Technology

    2007-01-12

    physical therapists . The application for H-1B status must be filed by an employer; an individual cannot obtain an H-1B visa on his or her own...scientist or engineer for permanent residence, if they meet terms established by the Immigration and Nationality Act. 3Foreign students planning to remain...56%; for physical sciences, 64%; life sciences, 63%; mathematics, 57%; computer sciences, 63%; and agricultural sciences, 38%. Stay rates are not

  19. Behavioural science at work for Canada: National Research Council laboratories.

    PubMed

    Veitch, Jennifer A

    2007-03-01

    The National Research Council is Canada's principal research and development agency. Its 20 institutes are structured to address interdisciplinary problems for industrial sectors, and to provide the necessary scientific infrastructure, such as the national science library. Behavioural scientists are active in five institutes: Biological Sciences, Biodiagnostics, Aerospace, Information Technology, and Construction. Research topics include basic cellular neuroscience, brain function, human factors in the cockpit, human-computer interaction, emergency evacuation, and indoor environment effects on occupants. Working in collaboration with NRC colleagues and with researchers from universities and industry, NRC behavioural scientists develop knowledge, designs, and applications that put technology to work for people, designed with people in mind.

  20. Linking of the BENSON graph-plotter with the Elektronika-1001 computer

    NASA Technical Reports Server (NTRS)

Valtts, I. Y.; Nikolaev, N. Y.; Popov, M. V.; Soglasnov, V. A.

    1980-01-01

A device, developed by the Institute of Space Research of the Academy of Sciences of the USSR, for linking the Elektronika-1001 computer with the BENSON graph-plotter is described. Programs are compiled which provide display of graphic and alphanumeric information. Instructions for their utilization are given.

  1. Use of PL/1 in a Bibliographic Information Retrieval System.

    ERIC Educational Resources Information Center

    Schipma, Peter B.; And Others

The Information Sciences section of IIT Research Institute (IITRI) has developed a Computer Search Center and is currently conducting a research project to explore computer searching of a variety of machine-readable data bases. The Center provides Selective Dissemination of Information services to academic, industrial and research organizations…

  2. CIS and Information Technology Certifications: Education Program Trends and Implications

    ERIC Educational Resources Information Center

    Andersson, David; Reimers, Karl

    2009-01-01

    The fields of Computer Information Systems (CIS) and Information Technology (IT) are experiencing rapid change. In 2003, an analysis of IT degree programs and those of competing disciplines at 10 post-secondary institutions concluded that an information technology program is perceived differently from information systems and computer science. In…

  3. 75 FR 18849 - Food and Drug Administration/National Heart Lung and Blood Institute/National Science Foundation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... cardiovascular diseases and therapies; Patient-specific modeling, including virtual surgical planning and... Workshop on Computer Methods for Cardiovascular Devices: The Integration of Nonclinical and Clinical Models...

  4. Designing a Network and Systems Computing Curriculum: The Stakeholders and the Issues

    ERIC Educational Resources Information Center

    Tan, Grace; Venables, Anne

    2010-01-01

    Since 2001, there has been a dramatic decline in Information Technology and Computer Science student enrolments worldwide. As a consequence, many institutions have evaluated their offerings and revamped their programs to include units designed to capture students' interests and increase subsequent enrolment. Likewise, at Victoria University the…

  5. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  6. Effectiveness of Various Computer-Based Instructional Strategies in Language Teaching. Final Report, November 1, 1969-August 31, 1970.

    ERIC Educational Resources Information Center

    Van Campen, Joseph A.

    Computer software for programed language instruction, developed in the second quarter of 1970 at Stanford's Institute for Mathematical Studies in the Social Sciences is described in this report. The software includes: (1) a PDP-10 computer assembly language for generating drill sentences; (2) a coding system allowing a large number of sentences to…

  7. The ACI-REF Program: Empowering Prospective Computational Researchers

    NASA Astrophysics Data System (ADS)

    Cuma, M.; Cardoen, W.; Collier, G.; Freeman, R. M., Jr.; Kitzmiller, A.; Michael, L.; Nomura, K. I.; Orendt, A.; Tanner, L.

    2014-12-01

    The ACI-REF program, Advanced Cyberinfrastructure - Research and Education Facilitation, represents a consortium of academic institutions seeking to further advance the capabilities of their respective campus research communities through an extension of the personal connections and educational activities that underlie the unique and often specialized cyberinfrastructure at each institution. This consortium currently includes Clemson University, Harvard University, University of Hawai'i, University of Southern California, University of Utah, and University of Wisconsin. Working together in a coordinated effort, the consortium is dedicated to the adoption of models and strategies which leverage the expertise and experience of its members with a goal of maximizing the impact of each institution's investment in research computing. The ACI-REFs (facilitators) are tasked with making connections and building bridges between the local campus researchers and the many different providers of campus, commercial, and national computing resources. Through these bridges, ACI-REFs assist researchers from all disciplines in understanding their computing and data needs and in mapping these needs to existing capabilities or providing assistance with development of these capabilities. From the Earth sciences perspective, we will give examples of how this assistance improved methods and workflows in geophysics, geography and atmospheric sciences. We anticipate that this effort will expand the number of researchers who become self-sufficient users of advanced computing resources, allowing them to focus on making research discoveries in a more timely and efficient manner.

  8. Critical Interactives: Improving Public Understanding of Institutional Policy

    ERIC Educational Resources Information Center

    Buell, Duncan A.; Cooley, Heidi Rae

    2012-01-01

    Over the past 3 years, the authors have pursued unique cross-college collaboration. They have hosted a National Endowment for the Humanities (NEH)-funded Humanities Gaming Institute and team-taught a cross-listed course that brought together students from the humanities and computer science. Currently, they are overseeing the development of an…

  9. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  10. A System Architecture to Support a Verifiably Secure Multilevel Security System.

    DTIC Science & Technology

    1980-06-01

[4] Neumann, P.G., R. Fabry, K. Levitt, L. Robinson, J. Wensley, "On the Design of a Provably Secure... provide a tradeoff between cost and system security... ICS-80/05 School of Information and Computer Science, GEORGIA INSTITUTE OF TECHNOLOGY... Multilevel Security System (Extended Abstract), George I. Davida, Department of Electrical Engineering and Computer Science, University of Wisconsin

  11. Non-parallel processing: Gendered attrition in academic computer science

    NASA Astrophysics Data System (ADS)

    Cohoon, Joanne Louise Mcgrath

    2000-10-01

    This dissertation addresses the issue of disproportionate female attrition from computer science as an instance of gender segregation in higher education. By adopting a theoretical framework from organizational sociology, it demonstrates that the characteristics and processes of computer science departments strongly influence female retention. The empirical data identifies conditions under which women are retained in the computer science major at comparable rates to men. The research for this dissertation began with interviews of students, faculty, and chairpersons from five computer science departments. These exploratory interviews led to a survey of faculty and chairpersons at computer science and biology departments in Virginia. The data from these surveys are used in comparisons of the computer science and biology disciplines, and for statistical analyses that identify which departmental characteristics promote equal attrition for male and female undergraduates in computer science. This three-pronged methodological approach of interviews, discipline comparisons, and statistical analyses shows that departmental variation in gendered attrition rates can be explained largely by access to opportunity, relative numbers, and other characteristics of the learning environment. Using these concepts, this research identifies nine factors that affect the differential attrition of women from CS departments. These factors are: (1) The gender composition of enrolled students and faculty; (2) Faculty turnover; (3) Institutional support for the department; (4) Preferential attitudes toward female students; (5) Mentoring and supervising by faculty; (6) The local job market, starting salaries, and competitiveness of graduates; (7) Emphasis on teaching; and (8) Joint efforts for student success. This work contributes to our understanding of the gender segregation process in higher education. 
In addition, it contributes information that can lead to effective solutions for an economically significant issue in modern American society---gender equality in computer science.

  12. The 1984 NASA/ASEE summer faculty fellowship program

    NASA Technical Reports Server (NTRS)

    Mcinnis, B. C.; Duke, M. B.; Crow, B.

    1984-01-01

An overview is given of the program management and activities. Participants and research advisors are listed. Abstracts describe and present the results of research assignments performed by 31 fellows at the Johnson Space Center, the White Sands Test Facility, or the California Space Institute in La Jolla. Disciplines studied include engineering; biology/life sciences; Earth sciences; chemistry; mathematics/statistics/computer sciences; and physics/astronomy.

  13. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    NASA Astrophysics Data System (ADS)

    Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.

    2012-12-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM&MG), and a Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  14. Meeting Report: Incorporating Genomics Research into Undergraduate Curricula

    ERIC Educational Resources Information Center

    Dyer, Betsey Dexter; LeBlanc, Mark D.

    2002-01-01

    In the first of two National Science Foundation (NSF)-funded workshops, 30 professors of biology and computer science from 18 institutions met at Wheaton College in Norton, Massachusetts, on June 6-7, 2002, to share ideas on how to incorporate genomics research into undergraduate curricula. The participants included nine pairs or trios of…

  15. Increasing Access for Economically Disadvantaged Students: The NSF/CSEM & S-STEM Programs at Louisiana State University

    ERIC Educational Resources Information Center

    Wilson, Zakiya S.; Iyengar, Sitharama S.; Pang, Su-Seng; Warner, Isiah M.; Luces, Candace A.

    2012-01-01

    Increasing college degree attainment for students from disadvantaged backgrounds is a prominent component of numerous state and federal legislation focused on higher education. In 1999, the National Science Foundation (NSF) instituted the "Computer Science, Engineering, and Mathematics Scholarships" (CSEMS) program; this initiative was designed to…

  16. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Tyupikova, T. V.; Samoilov, V. N.

    2003-04-01

Modern information technologies drive the natural sciences toward further development. This development, however, goes hand in hand with the evolution of infrastructures that create favorable conditions for the growth of science and its financial base, and with the need to legally substantiate and protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization, and control of shared databases, using as an example the electronic document-handling system that operates in several departments of the Joint Institute for Nuclear Research.

  17. Trends in life science grid: from computing grid to knowledge grid.

    PubMed

    Konagaya, Akihiko

    2006-12-18

Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and data handling that exceed the capacity of a single institution. This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid, and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation that allows tacit knowledge to be shared within a community. By extending the concept of the grid from computing grid to knowledge grid, a grid can serve not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  18. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and data handling that exceed the capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid, and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation that allows tacit knowledge to be shared within a community. Conclusion By extending the concept of the grid from computing grid to knowledge grid, a grid can serve not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  19. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence.
In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  20. A Distributed User Information System

    DTIC Science & Technology

    1990-03-01

Department of Computer Science, University of Maryland, College Park, MD 20742. Abstract: Current user information database technology ...Transactions on Computer Systems, May 1988. [Sol89] K. Sollins. A plan for internet directory services. Technical report, DDN Network Information Center... A Distributed User Information System. Steven D. Miller, Scott Carson, and Leo Mark, Institute for Advanced Computer Studies and

  1. Effects of Computer-Based Instruction on Student Learning of Psychophysiological Detection of Deception Test Question Formulation.

    ERIC Educational Resources Information Center

    Janniro, Michael J.

    1993-01-01

    Describes a study conducted by the Department of Defense Polygraph Institute for their forensic science curriculum that investigated the effects of computer-based instruction on student learning of psychophysiological detection of deception test question formulation. Treatment of the experimental and control group is explained and posttest scores…

  2. New & Special Grad School Programs.

    ERIC Educational Resources Information Center

    Ross, Steven S.

    1988-01-01

    Discusses some special Master of Science in engineering (MS) programs including manufacturing and quality control, safety engineering, transportation engineering, and computer related areas. Gives a table showing MS degrees, institutions, and faculty. (YP)

  3. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  4. Computational biology and bioinformatics in Nigeria.

    PubMed

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  5. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  6. Air, Ocean and Climate Monitoring Enhancing Undergraduate Training in the Physical, Environmental and Computer Sciences

    NASA Technical Reports Server (NTRS)

    Hope, W. W.; Johnson, L. P.; Obl, W.; Stewart, A.; Harris, W. C.; Craig, R. D.

    2000-01-01

Faculty in the Department of Physical, Environmental and Computer Sciences strongly believe that undergraduate research and research-related activities must be integrated into the fabric of our undergraduate science and technology curricula. High-level skills, such as problem solving, reasoning, collaboration, and the ability to engage in research, prepare students for advanced study in graduate school or for competing for well-paying positions in the scientific community. One goal of our academic programs is to maintain a pipeline of research activities from high school, to four-year college, to graduate school, based on the GISS Institute on Climate and Planets model.

  7. Referees Often Miss Obvious Errors in Computer and Electronic Publications

    NASA Astrophysics Data System (ADS)

    de Gloucester, Paul Colin

    2013-05-01

Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles are found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.

  8. Referees often miss obvious errors in computer and electronic publications.

    PubMed

    de Gloucester, Paul Colin

    2013-01-01

    Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles are found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.

  9. GET21: Geoinformatics Training and Education for the 21st Century Geoscience Workforce

    NASA Astrophysics Data System (ADS)

    Baru, C.; Allison, L.; Fox, P.; Keane, C.; Keller, R.; Richard, S.

    2012-04-01

    The integration of advanced information technologies (referred to as cyberinfrastructure) into scientific research and education creates a synergistic situation. On the one hand, science begins to move at the speed of information technology, with science applications having to move rapidly to keep pace with the latest innovations in hardware and software. On the other hand, information technology moves at the pace of science, requiring rapid prototyping and rapid development of software and systems to serve the immediate needs of the application. The 21st century geoscience workforce must be adept at both sides of this equation to be able to make the best use of the available cyber-tools for their science and education endeavors. To reach different segments of the broad geosciences community, an education program in geoinformatics must be multi-faceted, ranging from areas dealing with modeling, computational science, and high performance computing, to those dealing with data collection, data science, and data-intensive computing. Based on our experience in geoinformatics and data science education, we propose a multi-pronged approach with a number of different components, including summer institutes typically aimed at graduate students, postdocs and researchers; graduate and undergraduate curriculum development in geoinformatics; development of online course materials to facilitate asynchronous learning, especially for geoscience professionals in the field; provision of internships at geoinformatics-related facilities for graduate students, so that they can observe and participate in geoinformatics "in action"; and creation of online communities and networks to facilitate planned as well as serendipitous collaborations and to link users with experts in the different areas of geoscience and geoinformatics.
We will describe some of our experiences and the lessons learned over the years from the Cyberinfrastructure Summer Institute for Geoscientists (CSIG), which is a 1-week institute that has been held each summer (August) at the San Diego Supercomputer Center, University of California, San Diego, since 2005. We will also discuss these opportunities for GET21 and geoinformatics education in the context of the newly launched EarthCube initiative at the US National Science Foundation.

  10. Dropping Out of Computer Science: A Phenomenological Study of Student Lived Experiences in Community College Computer Science

    NASA Astrophysics Data System (ADS)

    Gilbert-Valencia, Daniel H.

    California community colleges produce alarmingly few computer science degree or certificate earners. While the literature shows clear K-12 impediments to CS matriculation in higher education, very little is known about the experiences of those who overcome initial impediments to CS yet do not persist through to program completion. This phenomenological study explores that specific experience by interviewing underrepresented, low-income, first-generation college students who began community college intending to transfer to 4-year institutions majoring in CS but switched to another field and remain enrolled or graduated. This study explores the lived experiences of students facing barriers, their avenues for developing interest in CS, and the persistence support systems they encountered, specifically looking at how students constructed their academic choice from these experiences. The growing diversity within California's population necessitates that experiences specific to underrepresented students be considered as part of this exploration. Ten semi-structured interviews and observations were conducted, transcribed, and coded. Artifacts supporting student experiences were also collected. Data were analyzed through a social-constructivist lens to provide insight into experiences and how they can be navigated to create actionable strategies for community college computer science departments wishing to increase student success. Three major themes emerged from this research: (1) students shared pre-college characteristics; (2) students faced similar challenges in college CS courses; and (3) students shared similar reactions to the "work" of computer science. Results of the study included (1) CS interest development hinged on computer ownership in the home; (2) participants shared characteristics that were ideal for college success but not CS success; and (3) encounters in CS departments produced unique challenges for participants.
Though CS interest was and remains abundant, opportunities for learning programming skills before college were non-existent, and there were few opportunities in college to build skills or establish peer support networks. Recommendations for institutional leaders and further research are also provided.

  11. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

    This case study considered the syllabus of the first- and second-year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to the applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for the advanced studies and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking was also needed often, whereas theoretical computer science was needed quite rarely.

  12. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  13. ISEES: an institute for sustainable software to accelerate environmental science

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Schildhauer, M.; Fox, P. A.

    2013-12-01

    Software is essential to the full science lifecycle, spanning data acquisition, processing, quality assessment, data integration, analysis, modeling, and visualization. Software runs our meteorological sensor systems, our data loggers, and our ocean gliders. Every aspect of science is impacted by, and improved by, software. Scientific advances ranging from modeling climate change to the sequencing of the human genome have been rendered possible in the last few decades due to the massive improvements in the capabilities of computers to process data through software. This pivotal role of software in science is broadly acknowledged, while simultaneously being systematically undervalued through minimal investments in maintenance and innovation. As a community, we need to embrace the creation, use, and maintenance of software within science, and address problems such as code complexity, openness, reproducibility, and accessibility. We also need to fully develop new skills and practices in software engineering as a core competency in our earth science disciplines, starting with undergraduate and graduate education and extending into university and agency professional positions. The Institute for Sustainable Earth and Environmental Software (ISEES) is being envisioned as a community-driven activity that can facilitate and galvanize activities around scientific software in an analogous way to synthesis centers such as NCEAS and NESCent that have stimulated massive advances in ecology and evolution. We will describe the results of six workshops (Science Drivers, Software Lifecycles, Software Components, Workforce Development and Training, Sustainability and Governance, and Community Engagement) that have been held in 2013 to envision such an institute.
We will present community recommendations from these workshops and our strategic vision for how ISEES will address the technical issues in the software lifecycle, the sustainability of the whole software ecosystem, and the critical issue of computational training for the scientific community. (Figure: Process for envisioning ISEES.)

  14. e-Science platform for translational biomedical imaging research: running, statistics, and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Tusheng; Yang, Yuanyuan; Zhang, Kai; Wang, Mingqing; Zhao, Jun; Xu, Lisa; Zhang, Jianguo

    2015-03-01

    In order to enable multiple disciplines of medical researchers, clinical physicians, and biomedical engineers to work together in a secure, efficient, and transparent cooperative environment, we designed an e-Science platform for biomedical imaging research and application across multiple academic institutions and hospitals in Shanghai, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. In the past two years, we implemented a biomedical image chain, including communication, storage, cooperation, and computing, based on this e-Science platform. In this presentation, we report the operating status of this system in supporting biomedical imaging research, and analyze and discuss its results in supporting multidisciplinary collaboration across multiple institutions.

  15. 41 CFR 61-250.2 - What definitions apply to this part?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... institutes and junior colleges, or through equivalent on-the-job training. Includes: Computer programmers and... (medical, dental, electronic, physical science), and kindred workers. (iv) Sales means occupations engaging...

  16. 41 CFR 61-250.2 - What definitions apply to this part?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... institutes and junior colleges, or through equivalent on-the-job training. Includes: Computer programmers and... (medical, dental, electronic, physical science), and kindred workers. (iv) Sales means occupations engaging...

  17. 41 CFR 61-250.2 - What definitions apply to this part?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... institutes and junior colleges, or through equivalent on-the-job training. Includes: Computer programmers and... (medical, dental, electronic, physical science), and kindred workers. (iv) Sales means occupations engaging...

  18. 41 CFR 61-250.2 - What definitions apply to this part?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... institutes and junior colleges, or through equivalent on-the-job training. Includes: Computer programmers and... (medical, dental, electronic, physical science), and kindred workers. (iv) Sales means occupations engaging...

  19. 41 CFR 61-250.2 - What definitions apply to this part?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... institutes and junior colleges, or through equivalent on-the-job training. Includes: Computer programmers and... (medical, dental, electronic, physical science), and kindred workers. (iv) Sales means occupations engaging...

  20. Optimization of computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikhalevich, V.S.; Sergienko, I.V.; Zadiraka, V.K.

    1994-11-01

    This article examines some topics of optimization of computations, which have been discussed at 25 seminar-schools and symposia organized by the V.M. Glushkov Institute of Cybernetics of the Ukrainian Academy of Sciences since 1969. We describe the main directions in the development of computational mathematics and present some of our own results that reflect a certain design conception of speed-optimal and accuracy-optimal (or nearly optimal) algorithms for various classes of problems, as well as a certain approach to optimization of computer computations.

  1. Colloquium on Selected Topics in Behavioral Science Basic Research. (Alexandria, Virginia, April 23-25, 1980).

    ERIC Educational Resources Information Center

    Nogami, Glenda Y., Ed.; And Others

    The 21 summaries presented here cover research programs funded by the United States Army Research Institute for the Behavioral and Social Sciences (ARI) and are grouped in five broad topic areas: computer-based systems; information processing; learning, memory and transfer; human relations; and related issues and trends. Papers presented include:…

  2. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge, and in the design of computers themselves, as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized design posed an even greater challenge to potential biologist users than did its need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond: the personal computer.

  3. Report of the Defense Science Board Task Force on University Responsiveness to National Security Requirements.

    DTIC Science & Technology

    1982-01-01

    OFFICE OF THE SECRETARY OF DEFENSE, WASHINGTON, D.C. 20301, 27 January 1982. DEFENSE SCIENCE BOARD. Mr. Norman R. Augustine, Chairman... Institute of Technology; Dr. Norman Hackerman, President, Rice University; Dr. Richard L. Haley, Assistant Deputy, Science and Technology, USA Material... Biological and Medical Sciences 51.8 67.8 22%; Materials 53.2 65.1 13%; Chemistry 47.8 60.1 17%; Math and Computer Sciences 44.2 53.6 12%; Oceanography 43.2

  4. Leon Cooper, Cooper Pairs, and the BCS Theory

    Science.gov Websites

    ... psychology, mathematics, engineering, physics, linguistics and computer science. An Institute objective is to pave the way for the next generation of cognitive pharmaceuticals and intelligent systems for use in

  5. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  6. GaAs Computer Technology

    DTIC Science & Technology

    1992-01-07

    AD-A259 259. FASTC-ID(RS)T-0310-92. FOREIGN AEROSPACE SCIENCE AND TECHNOLOGY CENTER. GaAs COMPUTER TECHNOLOGY (1), by Wang Qiao-yu (Li-Shan Microelectronics Institute). Human translation, 7 January 1993; English pages: 6. Reproduced from the best quality copy available. Abstract: The paper

  7. Workshop in computational molecular biology, April 15, 1991--April 14, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavare, S.

    Funds from this award were used to support the Workshop in Computational Molecular Biology; the '91 symposium entitled Interface: Computing Science and Statistics, Seattle, Washington, April 21, 1991; the Workshop in Statistical Issues in Molecular Biology held at Stanford, California, August 8, 1993; and the Session on Population Genetics, part of the 56th Annual Meeting of the Institute of Mathematical Statistics, San Francisco, California, August 9, 1993.

  8. Celebrating 50 years of the laser (Scientific session of the general meeting of the Physical Sciences Division of the Russian Academy of Sciences, 13 December 2010)

    NASA Astrophysics Data System (ADS)

    2011-08-01

    A scientific session of the general meeting of the Physical Sciences Division of the Russian Academy of Sciences (RAS) dedicated to the 50th anniversary of the creation of lasers was held in the Conference Hall of the Lebedev Physical Institute, RAS, on 13 December 2010. The agenda of the session announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Matveev V A, Bagaev S N Opening speech; (2) Bratman V L, Litvak A G, Suvorov E V (Institute of Applied Physics, RAS, Nizhny Novgorod) "Mastering the terahertz domain: sources and applications"; (3) Balykin V I (Institute of Spectroscopy, RAS, Troitsk, Moscow region) "Ultracold atoms and atom optics"; (4) Ledentsov N N (Ioffe Physical Technical Institute, RAS, St. Petersburg) "New-generation surface-emitting lasers as the key element of the computer communication era"; (5) Krasil'nik Z F (Institute for the Physics of Microstructures, RAS, Nizhny Novgorod) "Lasers for silicon optoelectronics"; (6) Shalagin A M (Institute of Automation and Electrometry, Siberian Branch, RAS, Novosibirsk) "High-power diode-pumped alkali metal vapor lasers"; (7) Kul'chin Yu N (Institute for Automation and Control Processes, Far Eastern Branch, RAS, Vladivostok) "Photonics of self-organizing biomineral nanostructures"; (8) Kolachevsky N N (Lebedev Physical Institute, RAS, Moscow) "Laser cooling of rare-earth atoms and precision measurements". The papers written on the basis of reports 2-4, 7, and 8 are published below.Because the paper based on report 6 was received by the Editors late, it will be published in the October issue of Physics-Uspekhi together with the material related to the Scientific Session of the Physical Sciences Division, RAS, of 22 December 2010. 
• Mastering the terahertz domain: sources and applications, V L Bratman, A G Litvak, E V Suvorov Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 837-844 • Ultracold atoms and atomic optics, V I Balykin Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 844-852 • New-generation vertically emitting lasers as a key factor in the computer communication era, N N Ledentsov, J A Lott Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 853-858 • The photonics of self-organizing biomineral nanostructures, Yu N Kulchin Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 858-863 • Laser cooling of rare-earth atoms and precision measurements, N N Kolachevsky Physics-Uspekhi, 2011, Volume 54, Number 8, Pages 863-870

  9. Cumulative reports and publications through December 31, 1984

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A complete list of the Institute for Computer Applications in Science and Engineering (ICASE) Reports are given. Since ICASE Reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. Topics include numerical methods, parameter identification, fluid dynamics, acoustics, structural analysis, and computers.

  10. Models of Individual Trajectories in Computer-Assisted Instruction for Deaf Students. Technical Report No. 214.

    ERIC Educational Resources Information Center

    Suppes, P.; And Others

    From some simple and schematic assumptions about information processing, a stochastic differential equation is derived for the motion of a student through a computer-assisted elementary mathematics curriculum. The mathematics strands curriculum of the Institute for Mathematical Studies in the Social Sciences is used to test: (1) the theory and (2)…

  11. ONR Europe Reports. Computer Science/Computer Engineering in Central Europe: A Report on Czechoslovakia, Hungary, and Poland

    DTIC Science & Technology

    1992-08-01

    Rychlik J.: Simulation of distributed control systems. Research report of Institute of Technology in Pilsen no. 209-07-85, Jun. 1985. Kocur P.: Sensitivity analysis of reliability parameters. Proceedings of conf. FTSD, Brno, Jun. 1986, pp. 97-101. Smrha P., Kocur P., Racek S.: A

  12. Hi-Tech Unrevealed.

    ERIC Educational Resources Information Center

    Vernooy, D. Andrew; Alter, Kevin

    2001-01-01

    Presents design features of the University of Texas' Applied Computational Engineering and Sciences Building and discusses how institutions can guide the character of their architecture without subverting the architects' responsibility to confront their contemporary culture in a critical manner. (GR)

  13. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts.
In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.

  14. XXV IUPAP Conference on Computational Physics (CCP2013): Preface

    NASA Astrophysics Data System (ADS)

    2014-05-01

    XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by bringing together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by the European Physical Society and the American Physical Society. This year the Conference host was the Landau Institute for Theoretical Physics. The Conference contained 142 presentations, including 11 plenary talks with comprehensive reviews on topics ranging from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of the American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of the Russian Academy of Sciences, and the RSC Group company. Further conference information and images are available in the PDF.

  15. Military Families In Transition: Stress, Resilience, And Well-Being

    DTIC Science & Technology

    2014-01-01

    Smith College of Engineering and Computer Science; Professor, Department of Physics, College of Arts and Sciences; Fellow: AIAA, ASME, APS, Institute of Physics (UK); Syracuse University. Richard E. Heyman, PhD, Professor, Family Translational Research Group, Department of Cariology and Comprehensive Care... Pasquina, MD, COL(R), USA, Residency Director and Chair, Physical Medicine & Rehabilitation, Uniformed Services University, Walter Reed National

  16. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    ERIC Educational Resources Information Center

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was…

  17. Unmet needs for analyzing biological big data: A survey of 704 NSF principal investigators

    PubMed Central

    2017-01-01

    In a 2016 survey of 704 National Science Foundation (NSF) Biological Sciences Directorate principal investigators (BIO PIs), nearly 90% indicated they are currently or will soon be analyzing large data sets. BIO PIs considered a range of computational needs important to their work, including high performance computing (HPC), bioinformatics support, multistep workflows, updated analysis software, and the ability to store, share, and publish data. Previous studies in the United States and Canada emphasized infrastructure needs. However, BIO PIs said the most pressing unmet needs are training in data integration, data management, and scaling analyses for HPC—acknowledging that data science skills will be required to build a deeper understanding of life. This portends a growing data knowledge gap in biology and challenges institutions and funding agencies to redouble their support for computational training in biology. PMID:29049281

  18. Unmet needs for analyzing biological big data: A survey of 704 NSF principal investigators.

    PubMed

    Barone, Lindsay; Williams, Jason; Micklos, David

    2017-10-01

    In a 2016 survey of 704 National Science Foundation (NSF) Biological Sciences Directorate principal investigators (BIO PIs), nearly 90% indicated they are currently or will soon be analyzing large data sets. BIO PIs considered a range of computational needs important to their work, including high performance computing (HPC), bioinformatics support, multistep workflows, updated analysis software, and the ability to store, share, and publish data. Previous studies in the United States and Canada emphasized infrastructure needs. However, BIO PIs said the most pressing unmet needs are training in data integration, data management, and scaling analyses for HPC-acknowledging that data science skills will be required to build a deeper understanding of life. This portends a growing data knowledge gap in biology and challenges institutions and funding agencies to redouble their support for computational training in biology.

  19. Japan signs Ocean Agreement

    NASA Astrophysics Data System (ADS)

The Ocean Research Institute of the University of Tokyo and the National Science Foundation (NSF) have signed a Memorandum of Understanding for cooperation in the Ocean Drilling Program (ODP). The agreement calls for Japanese participation in ODP and an annual contribution of $2.5 million in U.S. currency for the project's 9 remaining years, according to NSF. ODP is an international project whose mission is to learn more about the formation and development of the earth through the collection and examination of core samples from beneath the ocean. The program uses the drillship JOIDES Resolution, which is equipped with laboratories and computer facilities. The Joint Oceanographic Institutions for Deep Earth Sampling (JOIDES), an international group of scientists, provides overall science planning and program advice regarding ODP's science goals and objectives.

  20. ORA User’s Guide 2007

    DTIC Science & Technology

    2007-07-01

July 2007 CMU-ISRI-07-115 Institute for Software Research School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213...ORA uses a Java interface for ease of use and a C++ computational backend. The current version, ORA 1.2, is available on the CASOS website...06-1-0104, N00014-06-1-0921, the AFOSR for “Computational Modeling of Cultural Dimensions in Adversary Organization (MURI)”, the ARL for Assessing C2

  1. Launch Pad Physics: Accelerate Interest With Model Rocketry.

    ERIC Educational Resources Information Center

    Key, LeRoy F.

    1982-01-01

    Student activities in an interdisciplinary, model rocket science program are described, including the construction of an Ohio Scientific computer system with graphic capabilities for use in the program and cooperative efforts with the Rocket Research Institute. (JN)

  2. 78 FR 20666 - Food and Drug Administration/National Institutes of Health/National Science Foundation Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ... (CDRH) believes that computer modeling and simulation (M&S) has the potential to substantially augment... simulate multiple use conditions and to visualize and display complex processes and data can revolutionize...

  3. IAIMS development at Harvard Medical School.

    PubMed Central

    Barnett, G O; Greenes, R A; Zielstorff, R D

    1988-01-01

    The long-range goal of this IAIMS development project is to achieve an Integrated Academic Information Management System for the Harvard Medical School, the Francis A. Countway Library of Medicine, and Harvard's affiliated institutions and their respective libraries. An "opportunistic, incremental" approach to planning has been devised. The projects selected for the initial phase are to implement an increasingly powerful electronic communications network, to encourage the use of a variety of bibliographic and information access techniques, and to begin an ambitious program of faculty and student education in computer science and its applications to medical education, medical care, and research. In addition, we will explore means to promote better collaboration among the separate computer science units in the various schools and hospitals. We believe that our planning approach will have relevance to other educational institutions where lack of strong central organizational control prevents a "top-down" approach to planning. PMID:3416098

  4. ASCR Cybersecurity for Scientific Computing Integrity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Peisert, Sean

The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.

  5. Report to the Institutional Computing Executive Group (ICEG) August 14, 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B

We have delayed this report from its normal distribution schedule for two reasons. First, given the coverage provided in the White Paper on Institutional Capability Computing Requirements distributed in August 2005, we felt a separate 2005 ICEG report would not add value. Second, we wished to provide some specific information about the Peloton procurement, and we have just now reached a point in the process where we can make some definitive statements. The Peloton procurement will result in an almost complete replacement of current M&IC systems. We have plans to retire MCR, iLX, and GPS. We will replace them with new parallel and serial capacity systems based on the same node architecture as the new Peloton capability system, named ATLAS. We are currently adding the first users to the Green Data Oasis, a large file system on the open network that will provide the institution with external collaboration data sharing. Only Thunder will remain from the current M&IC system list, and it will be converted from capability to capacity use. We are confident that we are entering a challenging yet rewarding new phase for the M&IC program. Institutional computing has been an essential component of our S&T investment strategy and has helped us achieve recognition in many scientific and technical forums. Through consistent institutional investments, M&IC has grown into a powerful unclassified computing resource that is being used across the Lab to push the limits of computing and its application to simulation science. With the addition of Peloton, the Laboratory will significantly increase the broad-based computing resources available to meet the ever-increasing demand for the large-scale simulations indispensable to advancing all scientific disciplines. All Lab research efforts are bolstered through the long-term development of mission-driven scalable applications and platforms. The new systems will soon be fully utilized and will position Livermore to extend the outstanding science and technology breakthroughs the M&IC program has enabled to date.

  6. Impact of Multi-Media Tutorials in a Computer Science Laboratory Course--An Empirical Study

    ERIC Educational Resources Information Center

    Dalal, Medha

    2014-01-01

    Higher education institutes of North America, Europe and far-east Asia have been leveraging the advances in ICT for quite some time. However, research based knowledge on the use of ICT in the higher education institutes of central and south-east Asia is still not readily available. The study presented in this paper explores a variant of teaching…

  7. Activities at the Lunar and Planetary Institute

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The activities of the Lunar and Planetary Institute for the period July to December 1984 are discussed. Functions of its departments and projects are summarized. These include: planetary image center; library information center; computer center; production services; scientific staff; visitors program; scientific projects; conferences; workshops; seminars; publications and communications; panels, teams, committees and working groups; NASA-AMES vertical gun range (AVGR); and lunar and planetary science council.

  8. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

A year-long research program (Aug 2016-May 2017) on ‘Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)’ is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. The program has brought together astronomers, computer scientists, applied mathematicians, and statisticians. Its main aims are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can spend only limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way in each of these groups will be given, and overlaps among the working groups will be highlighted. How the wider astronomy community can both participate in and benefit from the activities will also be briefly mentioned.

  9. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE labs (Sandia, Berkeley) as staff scientists or postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
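The coloring-for-derivatives idea mentioned above can be sketched compactly. The following is a minimal illustration (not ColPack code; all function names are hypothetical): a greedy distance-1 coloring of the column-intersection graph of a sparse Jacobian, so that structurally orthogonal columns share a color and can be recovered together from a single finite-difference evaluation per color.

```python
# Minimal sketch (not ColPack itself): greedy distance-1 coloring of the
# column-intersection graph of a sparse Jacobian.  Columns that never share
# a nonzero row are structurally orthogonal and may receive the same color.

def column_conflict_graph(pattern):
    """pattern: list of rows, each a set of column indices holding nonzeros."""
    n = max(max(row) for row in pattern if row) + 1
    adj = {j: set() for j in range(n)}
    for row in pattern:
        for a in row:
            for b in row:
                if a != b:
                    adj[a].add(b)  # columns a and b collide in this row
    return adj

def greedy_coloring(adj):
    """Give each vertex the smallest color not used by an already-colored neighbor."""
    color = {}
    for v in sorted(adj):
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:
            c += 1
        color[v] = c
    return color

# Arrow-shaped sparsity: column 0 collides with every other column,
# while columns 1..4 are mutually structurally orthogonal.
pattern = [{0, 1}, {0, 2}, {0, 3}, {0, 4}]
colors = greedy_coloring(column_conflict_graph(pattern))
print(colors)  # column 0 gets its own color; columns 1-4 share one
```

With two colors, this 5-column Jacobian needs only two perturbed function evaluations instead of five, which is the practical payoff of the coloring formulation.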

  10. Questioning Mechanisms During Tutoring, Conversation, and Human-Computer Interaction

    DTIC Science & Technology

    1993-06-01

of Psychology Los Angeles, CA 90024 Pittsburgh, PA 15213 Dr. Eduardo Cascallar Dr. Ruth Chabay Dr. Paul G. Chapin Educational Testing Service CDEC...Sharon Derry Educational Testing Service Applied Science Associates Florida State University Mail Stop 22-T P.O. Box 1072 Dept. of Psychology ...Department of Psychology, Department of Mathematical Sciences, and the Institute for Intelligent Systems Mailing address: Arthur C. Graesser Department of

  11. Cumulative reports and publications through December 31, 1987

    NASA Technical Reports Server (NTRS)

    1988-01-01

This document contains a complete list of Institute for Computer Applications in Science and Engineering (ICASE) Reports. Since ICASE Reports are preprints of articles to be published in journals or conference proceedings, the published reference is included when available.

  12. Research Networks and Technology Migration (RESNETSII)

    DTIC Science & Technology

    2004-07-01

    Laboratory (LBNL), The International Computer Science Institute (ICSI) Center for Internet Research (ICIR) DARWIN Developing protocols and...degradation in network loss, delay and throughput AT&T Center for Internet Research at ICSI (ACIRI), AT&T Labs-Research, University Of Massachusetts

  13. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  14. Semiannual Report, April 1, 1989 through September 30, 1989 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-02-01

noise. Tobias B. Orloff Work began on developing a high-quality rendering algorithm based on the radiosity method. The algorithm is similar to...previous progressive radiosity algorithms except for the following improvements: 1. At each iteration, vertex radiosities are computed using a modified scan...line approach, thus eliminating the quadratic cost associated with a ray-tracing computation of vertex radiosities. 2. At each iteration the scene is
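For background on the radiosity method this excerpt refers to, the underlying balance is B = E + ρFB, where B holds patch radiosities, E emissions, ρ reflectivities, and F the form factors. A minimal gathering-iteration sketch follows (generic textbook radiosity, not the report's progressive algorithm; form factors are assumed given and all names are hypothetical):

```python
# Hedged sketch of the classic radiosity balance B = E + rho * F @ B,
# solved by simple fixed-point (gathering) iterations.

def solve_radiosity(E, rho, F, iters=100):
    """E: emission per patch, rho: reflectivity per patch, F: form-factor matrix."""
    n = len(E)
    B = E[:]                      # start from pure emission
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]   # gather light reflected from every other patch
    return B

# Two facing patches: each sees the other with form factor 0.5.
E = [1.0, 0.0]                    # patch 0 is a light source
rho = [0.5, 0.8]
F = [[0.0, 0.5], [0.5, 0.0]]
B = solve_radiosity(E, rho, F)
```

Because the reflectivities are below 1, the iteration contracts and converges to the solution of the linear system; progressive radiosity variants like the one excerpted reorder this work to "shoot" the brightest unshot energy first.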

  15. The Medical Science DMZ.

    PubMed

    Peisert, Sean; Barnett, William; Dart, Eli; Cuff, James; Grossman, Robert L; Balas, Edward; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2016-11-01

    We describe use cases and an institutional reference architecture for maintaining high-capacity, data-intensive network flows (e.g., 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet filter firewalls, network intrusion detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive data sets between research institutions over national research networks. The exponentially increasing amounts of "omics" data, the rapid increase of high-quality imaging, and other rapidly growing clinical data sets have resulted in the rise of biomedical research "big data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large data sets. Maintaining data-intensive flows that comply with HIPAA and other regulations presents a new challenge for biomedical research. Recognizing this, we describe a strategy that marries performance and security by borrowing from and redefining the concept of a "Science DMZ"-a framework that is used in physical sciences and engineering research to manage high-capacity data flows. 
By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  16. The Medical Science DMZ

    PubMed Central

    Barnett, William; Dart, Eli; Cuff, James; Grossman, Robert L; Balas, Edward; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2016-01-01

    Objective We describe use cases and an institutional reference architecture for maintaining high-capacity, data-intensive network flows (e.g., 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. Materials and Methods High-end networking, packet filter firewalls, network intrusion detection systems. Results We describe a “Medical Science DMZ” concept as an option for secure, high-volume transport of large, sensitive data sets between research institutions over national research networks. Discussion The exponentially increasing amounts of “omics” data, the rapid increase of high-quality imaging, and other rapidly growing clinical data sets have resulted in the rise of biomedical research “big data.” The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large data sets. Maintaining data-intensive flows that comply with HIPAA and other regulations presents a new challenge for biomedical research. Recognizing this, we describe a strategy that marries performance and security by borrowing from and redefining the concept of a “Science DMZ”—a framework that is used in physical sciences and engineering research to manage high-capacity data flows. 
Conclusion By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. PMID:27136944

  17. Academia Sinica, TW E-science to Assistant Seismic Observations for Earthquake Research, Monitor and Hazard Reduction Surrounding the South China Sea

    NASA Astrophysics Data System (ADS)

    Huang, Bor-Shouh; Liu, Chun-Chi; Yen, Eric; Liang, Wen-Tzong; Lin, Simon C.; Huang, Win-Gee; Lee, Shiann-Jong; Chen, Hsin-Yen

Experience from the 2004 giant Sumatra earthquake has made seismic and tsunami hazards important issues in the South China Sea and its surrounding region, attracting the interest of many seismologists. Currently, more than 25 broadband seismic instruments operated by the Institute of Earth Sciences, Academia Sinica are deployed in northern Vietnam to study the geodynamic evolution of the Red River fracture zone; they were recently redistributed to southern Vietnam to study the geodynamic evolution and deep structures of the South China Sea. Similar stations are planned for deployment in the Philippines in the near future. In the current planning, some high-quality stations may become permanent stations with added continuous GPS observations, with instruments maintained and operated by several cooperating institutes, for instance, the Institute of Geophysics, Vietnamese Academy of Sciences and Technology in Vietnam and the Philippine Institute of Volcanology and Seismology in the Philippines. Finally, those stations are planned to be upgraded to real-time transmission stations for earthquake monitoring and tsunami warning. However, high-speed data transfer among different agencies is always a critical issue for successful network operation. By taking advantage of both the EGEE and EUAsiaGrid e-Infrastructures, the Academia Sinica Grid Computing Centre coordinates researchers from various Asian countries to construct a platform for high-performance data transfer and large parallel computations. This data service, together with a newly built earthquake data centre for data management, may greatly improve seismic network performance. Implementation of Grid infrastructure and e-science in this region may assist the development of earthquake research, monitoring, and natural hazard reduction. 
In the near future, we will continue to seek new cooperation from the countries surrounding the South China Sea to install new seismic stations, construct a complete seismic network of the South China Sea, and encourage studies in earthquake science and natural hazard reduction.

  18. Modeling hazardous mass flows Geoflows09: Mathematical and computational aspects of modeling hazardous geophysical mass flows; Seattle, Washington, 9–11 March 2009

    USGS Publications Warehouse

    Iverson, Richard M.; LeVeque, Randall J.

    2009-01-01

    A recent workshop at the University of Washington focused on mathematical and computational aspects of modeling the dynamics of dense, gravity-driven mass movements such as rock avalanches and debris flows. About 30 participants came from seven countries and brought diverse backgrounds in geophysics; geology; physics; applied and computational mathematics; and civil, mechanical, and geotechnical engineering. The workshop was cosponsored by the U.S. Geological Survey Volcano Hazards Program, by the U.S. National Science Foundation through a Vertical Integration of Research and Education (VIGRE) in the Mathematical Sciences grant to the University of Washington, and by the Pacific Institute for the Mathematical Sciences. It began with a day of lectures open to the academic community at large and concluded with 2 days of focused discussions and collaborative work among the participants.

  19. Engineering and physical sciences in oncology: challenges and opportunities.

    PubMed

    Mitchell, Michael J; Jain, Rakesh K; Langer, Robert

    2017-11-01

The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering, physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.

  20. Demographics of undergraduates studying games in the United States: a comparison of computer science students and the general population

    NASA Astrophysics Data System (ADS)

    McGill, Monica M.; Settle, Amber; Decker, Adrienne

    2013-06-01

    Our study gathered data to serve as a benchmark of demographics of undergraduate students in game degree programs. Due to the high number of programs that are cross-disciplinary with computer science programs or that are housed in computer science departments, the data is presented in comparison to data from computing students (where available) and the US population. Participants included students studying games at four nationally recognized postsecondary institutions. The results of the study indicate that there is no significant difference between the ratio of men to women studying in computing programs or in game degree programs, with women being severely underrepresented in both. Women, blacks, Hispanics/Latinos, and heterosexuals are underrepresented compared to the US population. Those with moderate and conservative political views and with religious affiliations are underrepresented in the game student population. Participants agree that workforce diversity is important and that their programs are adequately diverse, but only one-half of the participants indicated that diversity has been discussed in any of their courses.

  1. JPRS Report, Science & Technology, USSR: Computers

    DTIC Science & Technology

    1987-09-23

pages of Literary Gazette, it would be appropriate to proceed with a literary example. Not just elegance of handwriting (made absolutely unnecessary... adult population of the industrially developed nations would have been absorbed by scientific organizations. For this reason, the phenomenon of so...The Institute’s festivities are over. The young specialists in the computer department are in an elated mood. Thanks to their enthusiasm, clearness

  2. 47 CFR Appendix I to Subpart E of... - A Procedure for Calculating PCS Signal Levels at Microwave Receivers (Appendix E of the...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Tropospheric Radio Transmission Loss Over Irregular Terrain, A Computer Method-1968”, ESSA Technical Report ERL 79-ITS 67, Institute for Telecommunications Sciences, July 1968. 2. Rice, P.L. Longley, A.G., Norton... January 30, 1985, from G.A. Hufford, identifying modifications to the computer program. 4. Hufford, G.A...

  3. 47 CFR Appendix I to Subpart E of... - A Procedure for Calculating PCS Signal Levels at Microwave Receivers (Appendix E of the...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Tropospheric Radio Transmission Loss Over Irregular Terrain, A Computer Method-1968”, ESSA Technical Report ERL 79-ITS 67, Institute for Telecommunications Sciences, July 1968. 2. Rice, P.L. Longley, A.G., Norton... January 30, 1985, from G.A. Hufford, identifying modifications to the computer program. 4. Hufford, G.A...

  4. 47 CFR Appendix I to Subpart E of... - A Procedure for Calculating PCS Signal Levels at Microwave Receivers (Appendix E of the...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Tropospheric Radio Transmission Loss Over Irregular Terrain, A Computer Method-1968”, ESSA Technical Report ERL 79-ITS 67, Institute for Telecommunications Sciences, July 1968. 2. Rice, P.L. Longley, A.G., Norton... January 30, 1985, from G.A. Hufford, identifying modifications to the computer program. 4. Hufford, G.A...

  5. 47 CFR Appendix I to Subpart E of... - A Procedure for Calculating PCS Signal Levels at Microwave Receivers (Appendix E of the...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Tropospheric Radio Transmission Loss Over Irregular Terrain, A Computer Method-1968”, ESSA Technical Report ERL 79-ITS 67, Institute for Telecommunications Sciences, July 1968. 2. Rice, P.L. Longley, A.G., Norton... January 30, 1985, from G.A. Hufford, identifying modifications to the computer program. 4. Hufford, G.A...

  6. 47 CFR Appendix I to Subpart E of... - A Procedure for Calculating PCS Signal Levels at Microwave Receivers (Appendix E of the...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Tropospheric Radio Transmission Loss Over Irregular Terrain, A Computer Method-1968”, ESSA Technical Report ERL 79-ITS 67, Institute for Telecommunications Sciences, July 1968. 2. Rice, P.L. Longley, A.G., Norton... January 30, 1985, from G.A. Hufford, identifying modifications to the computer program. 4. Hufford, G.A...
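The appendix excerpted in the records above relies on the ITS irregular-terrain (Longley-Rice) computer method, which is far too involved to reproduce here; the free-space baseline that terrain models correct, however, is the standard path-loss formula. A small illustrative sketch (not the CFR procedure; the function name is hypothetical):

```python
import math

# Illustrative free-space path loss, the baseline on which terrain-loss
# models such as the Longley-Rice method cited above build corrections.
# This is NOT the appendix's full irregular-terrain procedure.
def free_space_path_loss_db(distance_m, freq_hz):
    """FSPL = (4*pi*d*f/c)^2 as a power ratio, expressed here in dB."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# A 1.9 GHz PCS link over 10 km loses roughly 118 dB in free space.
loss = free_space_path_loss_db(10_000, 1.9e9)
```

The same result is often written as 32.45 + 20 log10(f_MHz) + 20 log10(d_km); the version above computes it directly from the physical constants to avoid the rounded constant.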

  7. Image Understanding Research and Its Application to Cartography and Computer-Based Analysis of Aerial Imagery

    DTIC Science & Technology

    1983-09-01

Report AI-TR-346, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts, June 19 [A. Guzman-Arenas...Testbed Coordinator, 415/859-4395 Artificial Intelligence Center Computer Science and Technology Division Prepared for: Defense Advanced Research...to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. A key

  8. Copyright Law Constraints on the Transfer of Certain Federal Computer Software with Commercial Applications. Testimony before the U.S. Senate Committee on Commerce, Science and Transportation.

    ERIC Educational Resources Information Center

    Ols, John M., Jr.

    Under current federal copyright law (17 U.S.C. 105), federal agencies cannot copyright and license their computer software. Officials at the Departments of Agriculture, Commerce, and Defense, the Environmental Protection Agency, the National Aeronautics and Space Administration, and the National Institutes of Health state that a significant…

  9. The U.S. "Tox21 Community" and the Future of Toxicology

    EPA Science Inventory

    In early 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the NIH Chemical Genomics Center, and the Environmental Protection Agency’s National Center for Computational Toxicology entered into a Memorandum of Understanding to collaborate o...

  10. The medical science DMZ: a network design pattern for data-intensive medical science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Dart, Eli; Barnett, William

We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. 
We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements.

  11. The medical science DMZ: a network design pattern for data-intensive medical science.

    PubMed

    Peisert, Sean; Dart, Eli; Barnett, William; Balas, Edward; Cuff, James; Grossman, Robert L; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2017-10-06

    We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. 
By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
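    The core design idea described above is to keep deep-inspection firewalls out of the high-speed data path and instead restrict data-transfer nodes to a narrow, explicitly enumerated set of peers. A minimal sketch of that allow-list idea, with purely illustrative host names and ports (none taken from the paper):

```python
# Hypothetical sketch of the access-control idea behind a Science DMZ:
# a data-transfer node (DTN) accepts high-volume flows only from an
# explicitly registered peer list, rather than routing them through a
# stateful deep-inspection firewall. All names/ports are illustrative.

ALLOWED_PEERS = {
    ("dtn.example-university.edu", 443),
    ("dtn.partner-lab.gov", 2811),   # e.g. a GridFTP control channel
}

def flow_permitted(src_host: str, dst_port: int) -> bool:
    """Return True only if the flow matches the DTN's narrow allow-list."""
    return (src_host, dst_port) in ALLOWED_PEERS

print(flow_permitted("dtn.partner-lab.gov", 2811))   # True
print(flow_permitted("laptop.cafe-wifi.net", 443))   # False
```

In a real deployment this role is played by router ACLs and host firewalls on the DTN, combined with passive intrusion detection off the data path; the sketch only shows the policy shape.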

  12. Wave refraction diagrams for the Baltimore Canyon region of the mid-Atlantic continental shelf computed by using three bottom topography approximation techniques

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1976-01-01

    The Langley Research Center and Virginia Institute of Marine Science wave refraction computer model was applied to the Baltimore Canyon region of the mid-Atlantic continental shelf. Wave refraction diagrams for a wide range of normally expected wave periods and directions were computed by using three bottom topography approximation techniques: quadratic least squares, cubic least squares, and constrained bicubic interpolation. Mathematical or physical interpretation of certain features appearing in the computed diagrams is discussed.
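    The first of the three techniques named above, quadratic least squares, fits local bottom depth with a second-degree polynomial surface. A minimal pure-Python sketch of that fit, using synthetic sample depths rather than Baltimore Canyon bathymetry:

```python
# Hedged sketch of a quadratic least-squares topography fit: depth z is
# approximated by z ~ a + b*x + c*y + d*x^2 + e*x*y + f*y^2, with the
# coefficients chosen by solving the normal equations. Sample "depths"
# below are synthetic, not the Baltimore Canyon data.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic_surface(pts):
    """Least-squares coefficients via the normal equations (B^T B) w = B^T z."""
    B = [[1.0, x, y, x * x, x * y, y * y] for x, y, _ in pts]
    z = [p[2] for p in pts]
    BtB = [[sum(B[k][i] * B[k][j] for k in range(len(B))) for j in range(6)]
           for i in range(6)]
    Btz = [sum(B[k][i] * z[k] for k in range(len(B))) for i in range(6)]
    return solve(BtB, Btz)

# Synthetic depths sampled on a grid from a known quadratic:
true = [100.0, 3.0, -2.0, 0.5, 0.1, -0.4]
pts = []
for ix in range(-3, 4):
    for iy in range(-3, 4):
        x, y = ix / 3.0, iy / 3.0
        pts.append((x, y, true[0] + true[1]*x + true[2]*y
                    + true[3]*x*x + true[4]*x*y + true[5]*y*y))
coeffs = fit_quadratic_surface(pts)
print([round(c, 6) for c in coeffs])
```

On noise-free synthetic data the fit recovers the generating coefficients; in the refraction model the fitted surface supplies the smooth depth gradients that the ray-tracing step needs.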

  13. Using RxNorm for cross-institutional formulary data normalization within a distributed grid-computing environment.

    PubMed

    Wynden, Rob; Anderson, Nick; Casale, Marco; Lakshminarayanan, Prakash; Anderson, Kent; Prosser, Justin; Errecart, Larry; Livshits, Alice; Thimman, Tim; Weiner, Mark

    2011-01-01

    Within the CTSA (Clinical Translational Sciences Awards) program, academic medical centers are tasked with the storage of clinical formulary data within an Integrated Data Repository (IDR) and the subsequent exposure of that data over grid computing environments for hypothesis generation and cohort selection. Formulary data collected over long periods of time across multiple institutions requires normalization of terms before those data sets can be aggregated and compared. This paper sets forth a solution to the challenge of generating derived aggregated normalized views from large, distributed data sets of clinical formulary data intended for re-use within clinical translational research.
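    The normalization step described above maps each institution's local formulary strings onto a shared RxNorm concept identifier (RxCUI) so that counts can be aggregated across sites. A toy sketch of that idea; the mapping table and identifiers here are illustrative stand-ins for what the real RxNorm files or API would supply:

```python
# Illustrative sketch of cross-institutional formulary normalization:
# heterogeneous local drug strings are mapped to one RxNorm concept
# identifier before aggregation. The mini-mapping below is hypothetical;
# production mappings come from the NLM RxNorm distribution or API.
from collections import Counter

LOCAL_TO_RXCUI = {
    "acetaminophen 325 mg tab": "RXCUI:313782",   # illustrative RxCUI
    "tylenol 325mg tablet": "RXCUI:313782",
    "apap 325 mg oral tablet": "RXCUI:313782",
    "lisinopril 10 mg tab": "RXCUI:314076",
}

def normalize(term: str):
    """Map a local formulary string to an RxCUI (None if unmapped)."""
    return LOCAL_TO_RXCUI.get(term.strip().lower())

# Two sites spell the same drug differently; after normalization the
# records aggregate under a single concept.
site_a = ["Tylenol 325mg tablet", "Lisinopril 10 mg tab"]
site_b = ["APAP 325 MG ORAL TABLET"]
counts = Counter(normalize(t) for t in site_a + site_b)
print(counts["RXCUI:313782"])  # 2
```

The derived, aggregated view (the `Counter` here) is the kind of normalized artifact the paper proposes exposing over the grid for cohort selection.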

  14. Longitudinal effects of college type and selectivity on degrees conferred upon undergraduate females in physical science, life science, math and computer science, and social science

    NASA Astrophysics Data System (ADS)

    Stevens, Stacy Mckimm

    There has been much research to suggest that a single-sex college experience for female undergraduate students can increase self-confidence and leadership ability during the college years and beyond. The results of previous studies also suggest that these students achieve in the workforce and enter graduate school at higher rates than their female peers graduating from coeducational institutions. However, some researchers have questioned these findings, suggesting that it is the selectivity level of the colleges rather than the gender composition of the student body that causes these differences. The purpose of this study was to justify the continuation of single-sex educational opportunities for females at the post-secondary level by examining the effects that college selectivity, college type, and time have on the rate of undergraduate females pursuing majors in non-traditional fields. The study examined the percentage of physical science, life science, math and computer science, and social science degrees conferred upon females graduating from women's colleges from 1985-2001, as compared to those at comparable coeducational colleges. Sampling for this study consisted of 42 liberal arts women's (n = 21) and coeducational (n = 21) colleges. Variables included the type of college, the selectivity level of the college, and the effect of time on the percentage of female graduates. Doubly multivariate repeated measures analysis of variance testing revealed significant main effects for college selectivity on social science graduates, and time on both life science and math and computer science graduates. Significant interaction was also found between college type and time on social science graduates, as well as among college type, selectivity level, and time on math and computer science graduates. Implications of the results and suggestions for further research are discussed.

  15. Parallel computing of a climate model on the dawn 1000 by domain decomposition method

    NASA Astrophysics Data System (ADS)

    Bi, Xunqiang

    1997-12-01

    In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. Potential ways to increase the speed-up ratio and to exploit the resources of future massively parallel supercomputers are also discussed.
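    In a two-dimensional domain decomposition, the model's latitude-longitude grid is split into rectangular sub-domains, one per processor, so that both horizontal dimensions are distributed. A minimal sketch of such a block partition (illustrative grid and process counts, not the paper's configuration):

```python
# Sketch of 2-D block domain decomposition: an nx-by-ny grid is divided
# among a px-by-py grid of processes, each owning a contiguous rectangle.
# Remainder points are spread over the low-ranked blocks so sizes differ
# by at most one. Dimensions below are illustrative.

def block_range(n, p, rank):
    """Half-open index range [start, stop) owned by `rank` of p blocks."""
    base, extra = divmod(n, p)
    start = rank * base + min(rank, extra)
    size = base + (1 if rank < extra else 0)
    return start, start + size

def decompose_2d(nx, ny, px, py):
    """Map each process coordinate (i, j) to its (x-range, y-range)."""
    return {
        (i, j): (block_range(nx, px, i), block_range(ny, py, j))
        for i in range(px) for j in range(py)
    }

parts = decompose_2d(64, 32, 4, 2)   # 8 processes, each a 16x16 tile
print(parts[(0, 0)])   # ((0, 16), (0, 16))
print(parts[(3, 1)])   # ((48, 64), (16, 32))
```

Each process then time-steps only its own rectangle, exchanging halo rows and columns with its four neighbors; the decomposition above determines exactly which indices each neighbor pair must exchange.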

  16. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  17. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  18. Questioning Mechanisms during Tutoring, Conversation, and Human-Computer Interaction

    DTIC Science & Technology

    1993-06-01

    Principal Investigator: Department of Psychology, Department of Mathematical Sciences, and the Institute for Intelligent Systems.

  19. Solving the "Hidden Line" Problem

    NASA Technical Reports Server (NTRS)

    1984-01-01

    David Hedgley Jr., a mathematician at Dryden Flight Research Center, has developed an accurate computer program that determines whether a line in a graphic model of a three-dimensional object should or should not be visible. The Hidden Line Computer Code program automatically removes superfluous lines and permits the computer to display an object from specific viewpoints, just as the human eye would see it. Users include the Rowland Institute for Science in Cambridge, MA, several departments of Lockheed Georgia Co., and the Nebraska Public Power District (NPPD).
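    One building block of hidden-line algorithms is deciding whether a face of a solid can be seen at all: for a convex object, a face is invisible when its outward normal points away from the viewer (back-face culling). A hedged sketch of that single test, a deliberate simplification of the general problem the Hidden Line Computer Code solves:

```python
# Sketch of back-face culling, one visibility test used in hidden-line
# removal: a triangle of a convex solid is visible only if its normal
# faces the viewer. This illustrates the idea, not Hedgley's algorithm.

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def front_facing(tri, view_dir=(0.0, 0.0, -1.0)):
    """True if the triangle (counter-clockwise winding) faces a viewer
    looking along view_dir."""
    a, b, c = tri
    normal = cross(sub(b, a), sub(c, a))
    return dot(normal, view_dir) < 0.0

# CCW triangle in the z=0 plane, normal +z, viewer looking down -z:
print(front_facing([(0, 0, 0), (1, 0, 0), (0, 1, 0)]))  # True
# Reversed winding flips the normal, so the same face is culled:
print(front_facing([(0, 0, 0), (0, 1, 0), (1, 0, 0)]))  # False
```

For non-convex scenes a full hidden-line solution must additionally test each edge against occluding faces; culling merely prunes faces that can never contribute visible lines.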

  20. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    DTIC Science & Technology

    2007-04-01

    U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders.

  1. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    PubMed

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.
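    The analysis type named above, logistic regression of persistence on engagement measures, can be sketched compactly. The data below are synthetic (with the positive challenge/skill effects built into the simulation), purely to illustrate the model form, not to reproduce the study's results:

```python
# Illustrative sketch: logistic regression of course persistence on felt
# challenge and skill, the kind of model the study uses. Data are
# simulated; the positive coefficients are an assumption of the demo.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=300):
    """Plain stochastic gradient ascent on the log-likelihood."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = yi - p
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

random.seed(1)
X, y = [], []
for _ in range(400):
    challenge, skill = random.random(), random.random()
    p = sigmoid(3.0 * challenge + 2.0 * skill - 2.5)   # assumed true effects
    X.append([challenge, skill])
    y.append(1 if random.random() < p else 0)

w, b = fit_logistic(X, y)
print(w[0] > 0 and w[1] > 0)  # fitted model recovers the positive effects
```

In the study itself the predictors are ESM-derived engagement measures paired with transcript outcomes; the sketch only shows the estimation machinery.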

  2. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. 
This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support the quantitative and computational biology program at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals is to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program was established under CCB to train researchers in the highly mathematical and computationally intensive biological sciences. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. 
The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  3. NASA Langley Research Center outreach in astronautical education

    NASA Technical Reports Server (NTRS)

    Duberg, J. E.

    1976-01-01

    The Langley Research Center has traditionally maintained an active relationship with the academic community, especially at the graduate level, to promote the Center's research program and to make graduate education available to its staff. Two new institutes at the Center - the Joint Institute for Acoustics and Flight Sciences, and the Institute for Computer Applications - are discussed. Both provide for research activity at the Center by university faculties. The American Society of Engineering Education Summer Faculty Fellowship Program and the NASA-NRC Postdoctoral Resident Research Associateship Program are also discussed.

  4. The 1987 RIACS annual report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. 
Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.

  5. Stress Computations for Nearly Incompressible Materials

    DTIC Science & Technology

    1988-04-01

    Ivo Babuška, Research Professor, Institute for Physical Science and Technology, University of Maryland, College Park.

  6. Galaxy Makers Exhibition: Re-engagement, Evaluation and Content Legacy through an Online Component

    NASA Astrophysics Data System (ADS)

    Borrow, J.; Harrison, C.

    2017-09-01

    For the Royal Society Summer Science Exhibition 2016, Durham University's Institute of Computational Cosmology created the Galaxy Makers exhibit to communicate our computational cosmology and astronomy research. In addition to the physical exhibit we created an online component to foster re-engagement, create a permanent home for our content and allow us to collect important information about participation and impact. Here we summarise the details of the exhibit and the degree of success attached to the online component. We also share suggestions for further uses and improvements that could be implemented for the online components of other science exhibitions.

  7. [Evaluation of the lifestyle of students of physiotherapy and technical & computer science basing on their diet and physical activity].

    PubMed

    Medrela-Kuder, Ewa

    2011-01-01

    The aim of the study was to evaluate the dietary habits and physical activity of Physiotherapy and Technical & Computer Science students. The research involved a group of 174 non-full-time students of higher education institutions in Krakow aged between 22 and 27: 81 studied Physiotherapy at the University of Physical Education, whereas 93 followed a course in Technical & Computer Science at the Pedagogical University. A diagnostic survey method was used. The study revealed that the lifestyle of university youth left much to be desired. Dietary errors included irregular meal intake, low consumption of fish, milk, and dairy products, and snacking between meals on high-calorie products with poor nutrient content. With regard to physical activity, Physiotherapy students showed more positive attitudes than those from Technical & Computer Science. The forms of physical activity most frequently declared by respondents were swimming, team sports, cycling, and strolling. Health-oriented education should be introduced in a way that improves knowledge of a health-promoting lifestyle as a means of preventing numerous diseases.

  8. [Intranarcotic infusion therapy -- a computer interpretation using the program package SPSS (Statistical Package for the Social Sciences)].

    PubMed

    Link, J; Pachaly, J

    1975-08-01

    In a retrospective 18-month study, the infusion therapy applied at a large anesthesia institute was examined. Anesthesia course data, routinely recorded on magnetic tape, were analysed for this purpose by a computer with the statistical program SPSS. It could be shown that the practice of the individual anesthetists varies considerably. Various correlations are discussed.

  9. Computer Security: Governmentwide Planning Process Had Limited Impact. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    As required by the Computer Security Act of 1987, federal agencies have to identify systems that contain sensitive information and develop plans to safeguard them. The planning process was assessed in 10 civilian agencies as well as the extent to which they had implemented planning controls described in 22 selected plans. The National Institute of…

  10. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
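    One common way to make "amount of superposition" concrete in a fixed basis is an l1-type measure: the total off-diagonal magnitude of the pure-state density matrix, which vanishes for basis states and is maximal for balanced superpositions. The sketch below illustrates that general quantification idea, not the specific figure of merit this paper introduces:

```python
# Hedged illustration of superposition quantification in a fixed basis:
# for a pure state with amplitudes a_i, sum |a_i||a_j| over i != j is the
# off-diagonal mass of |psi><psi| (an l1-coherence-style measure). This is
# a generic example, not the paper's proposed figure of merit.
import math

def l1_superposition(amplitudes):
    """Sum of off-diagonal magnitudes of the pure-state density matrix."""
    return sum(abs(a) * abs(b)
               for i, a in enumerate(amplitudes)
               for j, b in enumerate(amplitudes)
               if i != j)

basis_state = [1.0, 0.0]                        # a basis state: no superposition
plus_state = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # balanced two-level superposition
print(l1_superposition(basis_state))  # 0.0
print(l1_superposition(plus_state))   # ~1.0 (maximal for two levels)
```

Measures of this family are basis-dependent by design, which is exactly what makes them useful for analyzing wave-particle duality: the basis encodes the "which-path" alternatives being superposed.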

  11. Cumulative reports and publications through December 31, 1994

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This document contains a complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available.

  12. Implementation and Student Assessment of Intranet-Based Learning Resources.

    ERIC Educational Resources Information Center

    Sosabowski, Michael H.; Herson, Katie; Lloyd, Andrew W.

    1998-01-01

    The University of Brighton (England) pharmacy and biomedical sciences school developed an institutional intranet providing course information, Internet links, lecture notes, links to computer-assisted instructional packages, and worksheets. Electronic monitoring of usage and subsequent questionnaire-based evaluation showed the intranet to be a…

  13. Running a Research Marathon

    ERIC Educational Resources Information Center

    Maaravi, Yossi

    2018-01-01

    In the current article, I describe a case of experiential learning that can be used to enhance learning, students' research skills and motivation in academic institutions. We used the already existing process of hackathons--intense computer programming events--and conducted a social science research marathon. Fifty-two graduate students…

  14. Innovation in robotic surgery: the Indian scenario.

    PubMed

    Deshpande, Suresh V

    2015-01-01

    Robotics is a science: in scientific terms, a "robot" is an electromechanical arm device with a computer interface, a combination of electrical, mechanical, and computer engineering. It is a mechanical arm that performs tasks in industry, space exploration, and science. One such idea was to make an automated arm, a robot, for laparoscopy: to control the telescope-camera unit electromechanically, and then with a computer interface using voice control. It took us 5 long years from 2004 to bring it to the level of obtaining a patent. That was the birth of the Swarup Robotic Arm (SWARM), the first and only Indian contribution in the field of robotics in laparoscopy: a fully voice-controlled camera-holding robotic arm developed without any support from industry or research institutes.

  15. Data base development and research and editorial support

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Life Sciences Bibliographic Data Base was created in 1981 and subsequently expanded. A systematic, professional system was developed to collect, organize, and disseminate information about scientific publications resulting from research. The data base consists of bibliographic information and hard copies of all research papers published by Life Sciences-supported investigators. Technical improvements were instituted in the database. To minimize costs, take advantage of advances in personal computer technology, and achieve maximum flexibility and control, the data base was transferred from the JSC computer to personal computers at George Washington University (GWU). GWU also performed a range of related activities such as conducting in-depth searches on a variety of subjects, retrieving scientific literature, preparing presentations, summarizing research progress, answering correspondence requiring reference support, and providing writing and editorial support.

  16. Cumulative Reports and Publications through December 31, 1989 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-05-01

    Research is conducted primarily by visiting scientists from universities and industry who have resident appointments for limited periods of time, and...Elsevier Science Publishers B. V. (North-Holland), IFIP, 1989. Crowley, Kay, Joel Saltz, Ravi Mirchandaney, and Harry Berryman: Run-time scheduling...Inverse problem techniques for beams with tip body and time hysteresis damping. ICASE Report No. 89-22, April 18, 1989. 24 pages. To appear in

  17. Multilevel Structural Equation Models for Investigating the Effects of Computer-Based Learning in Math Classrooms on Science Technology Engineering and Math (STEM) Major Selection in 4-Year Postsecondary Institutions

    ERIC Educational Resources Information Center

    Lee, Ahlam

    2017-01-01

    Background/Context: Because of the growing concern over the decline of bachelor degree recipients in the disciplines of science, technology, engineering, and math (STEM) in the U.S., several studies have been devoted to identifying the factors that affect students' STEM major choices. A majority of these studies have focused on factors relevant to…

  18. CES_EHP_Figure_2

    EPA Pesticide Factsheets

    The increasing number of chemicals for which SHEDS probabilistic exposure assessment has been performed over the years. This dataset is associated with the following publication: Egeghy, P., L. Sheldon, K. Isaacs, H. Ozkaynak, M. Goldsmith, J. Wambaugh, R. Judson, and T. Buckley. Computational Exposure Science: An Emerging Discipline to Support 21st-Century Risk Assessment. ENVIRONMENTAL HEALTH PERSPECTIVES. National Institute of Environmental Health Sciences (NIEHS), Research Triangle Park, NC, USA, 124(6): 697–702, (2016).

  19. Cyber Science, Biometrics and Digital Forensics: Workshop on Emerging Cyber Techniques and Technologies

    DTIC Science & Technology

    2016-09-07

    and the University of Southern California have been collaborating on a proposal led by Florida International University’s School of Computing...security. We will develop an action plan to identify needs, assess vulnerabilities and address disruptive technologies that could clearly provide a ...Institute of Technology and his Bachelor of Science degree in Aerospace Engineering, Polytechnic University of New York. Mr. Hurtado is a member of the

  20. Cheyney University Curriculum and Infrastructure Enhancement in STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eva, Sakkar Ara

    Cheyney University is the oldest historically Black educational institution in America. Initially established as a “normal” school emphasizing the matriculation of educators, Cheyney has become a comprehensive university, one of 14 state universities comprising the Pennsylvania State System of Higher Education (PASSHE). Cheyney University graduates still become teachers, but they also enter such fields as journalism, medicine, science, mathematics, law, communication, and government. Cheyney University is a small state-owned HBCU with very limited resources. At present the university has about a thousand students, with 15% in STEM. The CUCIES II grant made a significant contribution in saving the computer science program from being discontinued. The grant enabled the university to hire a temporary faculty member to teach in and update the computer science program. The program was enhanced with three tracks: cybersecurity, human-computer interaction, and general. The updated and enhanced computer science program will prepare professionals in the area of computer science with the knowledge, skills, and professional ethics needed for the current market. The new curriculum was developed for a professional profile that would focus on the technologies and techniques currently used in industry. With faculty on board, the university worked with the department to bring the computer science program back from moratorium. Once on the path of being discontinued and losing students, the program is now growing: enrollment has increased from 12 to 30 students. The university is currently in the process of hiring a tenure-track faculty member in the computer science program. Another product of the grant is a proposal for an introductory course in nanotechnology, intended to generate interest in the field.
The Natural and Applied Science department, which houses all of the STEM programs at Cheyney University, is currently working to bring the environmental science program back from moratorium. The university has been working to improve minority participation in STEM and has made significant strides in progressing students toward graduate programs and into the professoriate track. This success is due to faculty mentors who work closely with students, guiding them through the application processes for research internships and graduate programs; it is also due to the university forming collaborative agreements with research-intensive institutions, federal and state agencies, and industry. The grant assisted in recruiting and retaining students in STEM by offering tuition scholarships, research scholarships, and travel awards. Faculty professional development was supported by the grant through funding for travel to conferences, meetings, and webinars. Like many HBCUs, Cheyney University is trying to do more with less. Because STEM programs are inherently expensive, they are the ones that suffer most when resources are scarce. One goal of the Cheyney University strategic plan is to strengthen STEM programs consistent with the critical skill needs of the Department of Energy. All of the Cheyney University STEM programs are now located in the new science building funded by the Commonwealth of Pennsylvania.

  1. Integration of Russian Tier-1 Grid Center with High Performance Computers at NRC-KI for LHC experiments and beyond HENP

    NASA Astrophysics Data System (ADS)

    Belyaev, A.; Berezhnaya, A.; Betev, L.; Buncic, P.; De, K.; Drizhuk, D.; Klimentov, A.; Lazin, Y.; Lyalin, I.; Mashinistov, R.; Novikov, A.; Oleynik, D.; Polyakov, A.; Poyda, A.; Ryabinkin, E.; Teslyuk, A.; Tkachenko, I.; Yasnopolskiy, L.

    2015-12-01

    The LHC experiments are preparing for the precision measurements and further discoveries that will be made possible by higher LHC energies from April 2015 (LHC Run 2). The need for simulation, data processing, and analysis would overwhelm the expected capacity of the grid infrastructure computing facilities deployed by the Worldwide LHC Computing Grid (WLCG). To meet this challenge, integrating opportunistic resources into the LHC computing model is highly important. The Tier-1 facility at the Kurchatov Institute (NRC-KI) in Moscow is part of the WLCG, and it will process, simulate, and store up to 10% of the total data obtained from the ALICE, ATLAS, and LHCb experiments. In addition, the Kurchatov Institute has supercomputers with a peak performance of 0.12 PFLOPS. Delegating even a fraction of these supercomputing resources to LHC computing will notably increase total capacity. In 2014, development began on a portal combining the Tier-1 facility and a supercomputer at the Kurchatov Institute to provide common interfaces and storage. The portal will be used not only for HENP experiments, but also by other data- and compute-intensive sciences, such as biology (genome sequencing analysis) and astrophysics (cosmic ray analysis and antimatter and dark matter searches).

  2. Engineering and physical sciences in oncology: challenges and opportunities

    PubMed Central

    Mitchell, Michael J.; Jain, Rakesh K.; Langer, Robert

    2017-01-01

    The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas. PMID:29026204

  3. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the common efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a former independent university that has recently become part of the National Research University Higher School of Economics. The Conference Computer Simulations in Physics and beyond (CSP) is planned to be held every two years. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics, to computing with the novel quantum computers D-Wave and D-Wave2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference please follow the link http://csp2015.ac.ru/files/book5x.pdf

  4. A National Virtual Specimen Database for Early Cancer Detection

    NASA Technical Reports Server (NTRS)

    Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy

    2003-01-01

    Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) comprises and integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need for establishing cross-disciplinary teams focused on integrating expertise in biomedical research, computation and biostatistics, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic and morphological differences of the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate databases needed to be done in a manner that did not require redesign or re-implementation of existing systems.

  5. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  6. Online Bioinformatics Tutorials | Office of Cancer Genomics

    Cancer.gov

    Bioinformatics is a scientific discipline that applies computer science and information technology to help understand biological processes. The NIH provides a list of free online bioinformatics tutorials, either generated by the NIH Library or other institutes, which includes introductory lectures and "how to" videos on using various tools.

  7. AIA Honors Imaginative Solutions to Common Campus Problems.

    ERIC Educational Resources Information Center

    Chronicle of Higher Education, 1987

    1987-01-01

    The American Institute of Architects honored five recently completed university buildings whose architects solved the difficulties of site and scale: Columbia University's Computer Science Building, Dartmouth's Hood Museum of Art, Emory's Museum of Art, Princeton's Lewis Thomas Laboratory, and the University of California at Irvine's Computer…

  8. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  9. A synergistic effort among geoscience, physics, computer science and mathematics at Hunter College of CUNY as a Catalyst for educating Earth scientists.

    NASA Astrophysics Data System (ADS)

    Salmun, H.; Buonaiuto, F. S.

    2016-12-01

    The Catalyst Scholarship Program at Hunter College of The City University of New York (CUNY) was established with a four-year award from the National Science Foundation (NSF) to fund scholarships for academically talented but financially disadvantaged students majoring in four disciplines of science, technology, engineering and mathematics (STEM). Led by Earth scientists, the Program awarded scholarships to students in their junior or senior years majoring in computer science, geosciences, mathematics and physics to create two cohorts of students that spent a total of four semesters in an interdisciplinary community. The program included mentoring of undergraduate students by faculty and graduate students (peer-mentoring), a sequence of three semesters of a one-credit seminar course and opportunities to engage in research activities, research seminars and other enriching academic experiences. Faculty and peer-mentoring were integrated into all parts of the scholarship activities. The one-credit seminar course, although designed to expose scholars to the diversity of STEM disciplines and to highlight research options and careers in these disciplines, was thematically focused on geoscience, specifically on ocean and atmospheric science. The program resulted in increased retention rates relative to institutional averages. In this presentation we will discuss the process of establishing the program, from the original plans to its implementation, as well as the impact of this multidisciplinary approach to geoscience education at our institution and beyond. An overview of accomplishments, lessons learned and potential for best practices will be presented.

  10. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    PubMed Central

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men's and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative data and over two years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664
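    The modeling step described above (logistic regression linking engagement to persistence) can be sketched on synthetic data. The variable names, coefficients, and generated data below are hypothetical illustrations only, not the study's actual measures or results:

    ```python
    import numpy as np

    # Synthetic illustration: relate in-the-moment "challenge" and "skill"
    # ratings to persistence (1 = enrolled in another CS course).
    rng = np.random.default_rng(42)
    n = 500
    challenge = rng.uniform(0, 10, n)
    skill = rng.uniform(0, 10, n)
    true_logit = -3.0 + 0.25 * challenge + 0.30 * skill
    y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

    # Fit logistic regression by Newton-Raphson (IRLS) iterations.
    X = np.column_stack([np.ones(n), challenge, skill])
    beta = np.zeros(3)
    for _ in range(25):
        p = 1 / (1 + np.exp(-X @ beta))          # predicted probabilities
        W = p * (1 - p)                          # IRLS weights
        grad = X.T @ (y - p)                     # score vector
        H = X.T @ (X * W[:, None])               # observed information
        beta += np.linalg.solve(H, grad)

    print(np.round(beta, 2))  # estimated (intercept, challenge, skill) coefficients
    ```

    With enough data the fitted coefficients recover the positive association between feeling challenged/skilled and persisting, which is the shape of the result the authors report.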

  11. Quantum lattice model solver HΦ

    NASA Astrophysics Data System (ADS)

    Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki

    2017-08-01

    HΦ [aitch-phi] is a program package based on the Lanczos-type eigenvalue solution applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model and the Kondo-lattice model. While it works well on PCs and PC-clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of the system size. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain the theoretical background and user interface of HΦ. We also show the benchmark results of HΦ on supercomputers such as the K computer at the RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for Solid State Physics (ISSP).
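    As a rough illustration of the Lanczos-type approach on which HΦ is built (this is not HΦ's actual implementation or interface), the following sketch estimates the lowest eigenvalue of a tiny spin model by diagonalizing the tridiagonal matrix produced by the Lanczos recursion:

    ```python
    import numpy as np

    def lanczos_ground_state(H, m=30, seed=0):
        """Lowest eigenvalue of a symmetric matrix H via m Lanczos steps."""
        n = H.shape[0]
        rng = np.random.default_rng(seed)
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        v_prev = np.zeros(n)
        alphas, betas = [], []
        beta = 0.0
        for _ in range(min(m, n)):
            w = H @ v - beta * v_prev          # three-term recurrence
            alpha = v @ w
            w -= alpha * v
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            if beta < 1e-12:                   # Krylov space exhausted
                break
            v_prev, v = v, w / beta
        # Diagonalize the small tridiagonal matrix of Lanczos coefficients.
        T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
        return np.linalg.eigvalsh(T)[0]

    # Example: two coupled spin-1/2 sites, H = S1 . S2 (a 4x4 matrix).
    Sz = np.diag([0.5, -0.5])
    Sp = np.array([[0.0, 1.0], [0.0, 0.0]])
    H = np.kron(Sz, Sz) + 0.5 * (np.kron(Sp, Sp.T) + np.kron(Sp.T, Sp))
    print(lanczos_ground_state(H))  # close to -0.75, the two-spin singlet energy
    ```

    A production code like HΦ applies the same recursion to much larger sparse Hamiltonians, where only the matrix-vector product H @ v needs to be implemented.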

  12. Mastering cognitive development theory in computer science education

    NASA Astrophysics Data System (ADS)

    Gluga, Richard; Kay, Judy; Lister, Raymond; Simon; Kleitman, Sabina

    2013-03-01

    To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that classified activities and assessments are comparable across the subjects of a degree, and, ideally, comparable across institutions. One widespread approach to supporting this is to write learning objectives in terms of Bloom's Taxonomy. This, or other such classifications, is likely to be more effective if educators can use them consistently, in the way experts would use them. To this end, we present the design and evaluation of our online interactive web-based tutorial system, which can be configured and used to offer training in different classification schemes. We report on results from three evaluations. First, 17 computer science educators completed a tutorial on using Bloom's Taxonomy to classify programming examination questions. Second, 20 computer science educators completed a Neo-Piagetian tutorial. Third, we compared the inter-rater reliability scores of computer science educators classifying programming questions using Bloom's Taxonomy, before and after taking our tutorial. Based on the results from these evaluations, we discuss the effectiveness of our tutorial system design for teaching computer science educators how to systematically and consistently classify programming examination questions. We also discuss the suitability of Bloom's Taxonomy and Neo-Piagetian theory for achieving this goal. The Bloom's and Neo-Piagetian tutorials are made available as a community resource.
The contributions of this paper are the following: the tutorial system for learning classification schemes for the purpose of coding the difficulty of computing learning materials; its evaluation; new insights into the consistency that computing educators can achieve using Bloom's Taxonomy; and first insights into the use of Neo-Piagetian theory by a group of classifiers.
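Inter-rater reliability of the kind reported above is commonly measured with Cohen's kappa, which corrects raw agreement for agreement expected by chance. As a sketch (the rating data below are invented, and the paper does not specify its exact statistic here), kappa for two raters can be computed as:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels."""
    assert len(r1) == len(r2)
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical example: two educators label four exam questions with Bloom levels.
rater1 = ["apply", "recall", "apply", "analyze"]
rater2 = ["apply", "recall", "recall", "analyze"]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.636
```

Values near 1 indicate that educators classify questions consistently; values near 0 indicate agreement no better than chance.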

  13. Telescience workstation

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael

    1989-01-01

    As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment, designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing, was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.

  14. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design

    PubMed Central

    Alford, Rebecca F.; Dolan, Erin L.

    2017-01-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology. PMID:29216185

  15. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    PubMed

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  16. Creating Next Generation Teacher Preparation Programs to Support Implementation of the Next Generation Science Standards and Common Core State Standards in K-12 Schools: An Opportunity for the Earth and Space Sciences

    NASA Astrophysics Data System (ADS)

    Geary, E. E.; Egger, A. E.; Julin, S.; Ronca, R.; Vokos, S.; Ebert, E.; Clark-Blickenstaff, J.; Nollmeyer, G.

    2015-12-01

    A consortium of two- and four-year Washington State colleges and universities, in partnership with Washington's Office of the Superintendent of Public Instruction (OSPI), the Teachers of Teachers of Science, the Teachers of Teachers of Mathematics, and other key stakeholders, is currently working to improve science and mathematics learning for all Washington State students by creating a new vision for STEM teacher preparation in Washington State aligned with the Next Generation Science Standards (NGSS) and the Common Core State Standards (CCSS) in Mathematics and Language Arts. Specific objectives include: (1) strengthening elementary and secondary STEM teacher preparation courses and curricula, (2) alignment of STEM teacher preparation programs across Washington State with the NGSS and CCSS, (3) development of action plans to support implementation of STEM teacher preparation program improvement at Higher Education Institutions (HEIs) across the state, (4) stronger collaborations between HEIs, K-12 schools, government agencies, non-governmental organizations, and STEM businesses involved in the preparation of preservice STEM teachers, (5) new teacher endorsements in Computer Science and Engineering, and (6) development of a prototype model for rapid, adaptable, and continuous improvement of STEM teacher preparation programs. A 2015 NGSS gap analysis of teacher preparation programs across Washington State indicates relatively good alignment of courses and curricula with NGSS Disciplinary Core Ideas and Scientific Practices, but minimal alignment with NGSS Engineering Practices and Crosscutting Concepts. Likewise, Computer Science and Sustainability ideas and practices are not well represented in current courses and curricula. During the coming year, teams of STEM faculty, education faculty, and administrators will work collaboratively to develop unique action plans for aligning and improving STEM teacher preparation courses and curricula at their institutions.

  17. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed, well-trained advanced computing professionals. But in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high-speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next-generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below, with a description of the design of its Boot Camp and a summary of its success and some key factors that have enabled that success.

  18. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE PAGES

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie; ...

    2016-11-01

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed, well-trained advanced computing professionals. But in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high-speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next-generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below, with a description of the design of its Boot Camp and a summary of its success and some key factors that have enabled that success.

  19. BEYSIK: Language description and handbook for programmers (system for the collective use of the Institute of Space Research, Academy of Sciences USSR)

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.

    1979-01-01

    The BASIC algorithmic language is described, and a guide is presented for the programmer using the language interpreter. BASIC is a high-level, problem-oriented programming language intended for the solution of computational and engineering problems.

  20. $10M Gift Supports "Data Recycling" at UCSF.

    PubMed

    2017-10-01

    The University of California, San Francisco's Institute for Computational Health Sciences has received a $10 million gift to support "data recycling" investigations. The approach to medical research involves mining existing data to potentially uncover new uses for existing drugs and help improve clinical care. ©2017 American Association for Cancer Research.

  1. Can the Internet Be Saved?

    ERIC Educational Resources Information Center

    Fischman, Josh

    2007-01-01

    The Internet has great difficulty coping with the sharp increase in mobile devices like cellphones and laptops, and handling bandwidth-hungry traffic such as video, now demanded by an increasing number of users. According to Ellen W. Zegura, chairwoman of computer sciences at the Georgia Institute of Technology, the Internet is like a big…

  2. Life on the borders

    NASA Astrophysics Data System (ADS)

    Barry, Edward

    2010-02-01

    Interdisciplinary science has been a hot topic for more than a decade, with increasing numbers of researchers working on projects that do not fit into neat departmental boxes like "physics" or "biology". Yet despite this increased activity, the structures in place to support these interdisciplinary scientists - including research grants and training for PhD students - have sometimes lagged behind. One programme that aims to help fill this gap for students of biomedical, physical and computational sciences is the Interfaces Initiative, a joint project of the Howard Hughes Medical Institute and the US National Institute of Biomedical Imaging and Bioengineering. Physics World talked to a current Interfaces participant, Edward Barry, who is finishing his PhD in biology-related condensed-matter physics at Brandeis University in Massachusetts.

  3. Why there should be more science Nobel prizes and laureates - And why proportionate credit should be awarded to institutions.

    PubMed

    Charlton, Bruce G

    2007-01-01

    The four science Nobel prizes (physics, chemistry, medicine/physiology and economics) have performed extremely well as a method of recognizing the highest level of achievement. The prizes exist primarily to honour individuals but also have a very important function in science generally. In particular, the institutions and nations which have educated, nurtured or supported many Nobel laureates can be identified as elite in world science. However, the limited range of subjects and a maximum of 12 laureates per year mean that many major scientific achievements remain unrecognized, and relatively few universities can gather sufficient Nobel credits to enable a precise estimate of their different levels of quality. I advocate that the Nobel committee should expand the number of Nobel laureates and Prize categories as a service to world science. (1) There is a large surplus of high-quality prize candidates deserving of recognition. (2) There has been a vast expansion of research with a proliferation of major sub-disciplines in the existing categories. (3) Especially, the massive growth of the biomedical sciences has created a shortage of Nobel recognition in this area. (4) Whole new fields of major science have emerged. I therefore suggest that the maximum of three laureates per year should always be awarded in the categories of physics, chemistry and economics, even when these prizes are for diverse and unrelated achievements; that the number of laureates in the 'biology' category of physiology or medicine should be increased to six or preferably nine per year; and that two new Prize categories should be introduced to recognize achievements in mathematics and computing science. Together, these measures could increase the science laureates from a maximum of 12 to a minimum of 24, and increase the range of scientific coverage.
In future, the Nobel committee should also officially allocate proportionate credit to institutions for each laureate, and a historical task force could also award institutional credit for past prizes.

  4. Promoting Interests in Atmospheric Science at a Liberal Arts Institution

    NASA Astrophysics Data System (ADS)

    Roussev, S.; Sherengos, P. M.; Limpasuvan, V.; Xue, M.

    2007-12-01

    Coastal Carolina University (CCU) students in Computer Science participated in a project to set up an operational weather forecast for the local community. The project involved the construction of two computing clusters and the automation of daily forecasting. Funded by NSF-MRI, two high-performance clusters were successfully established to run the University of Oklahoma's Advanced Regional Prediction System (ARPS). Daily weather predictions are made over South Carolina and North Carolina at 3-km horizontal resolution (roughly 1.9 miles) using initial and boundary condition data provided by UNIDATA. At this high resolution, the model is cloud-resolving, thus providing a detailed picture of heavy thunderstorms and precipitation. Forecast results are displayed on CCU's website (https://marc.coastal.edu/HPC) to complement observations at the National Weather Service in Wilmington, N.C. Present efforts include providing forecasts at 1-km resolution (or finer), comparisons with other models such as the Weather Research and Forecasting (WRF) model, and the examination of local phenomena (such as waterspouts and tornadoes). Through these activities the students learn about shell scripting, cluster operating systems, and web design. More importantly, students are introduced to Atmospheric Science, the processes involved in making weather forecasts, and the interpretation of their forecasts. Simulations generated by the forecasts will be integrated into the content of CCU courses such as Fluid Dynamics, Atmospheric Sciences, Atmospheric Physics, and Remote Sensing. Operated jointly between the departments of Applied Physics and Computer Science, the clusters are expected to be used by CCU faculty and students for future research and inquiry-based projects in Computer Science, Applied Physics, and Marine Science.

  5. An Overview of High Performance Computing and Challenges for the Future

    ScienceCinema

    Google Tech Talks

    2017-12-09

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, holds the position of Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), is a Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and is an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, programming methodology, and tools for parallel computers.
His research includes the development, testing and documentation of high-quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda, and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  6. An Overview of High Performance Computing and Challenges for the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Google Tech Talks

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, holds the position of Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), is a Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and is an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, programming methodology, and tools for parallel computers.
His research includes the development, testing and documentation of high-quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda, and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  7. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start-up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change.
Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is certainly in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  8. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE's application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years, members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware's contributions over the last 5 years (February 2012 - February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  9. A survey of specific individualized instruction strategies in elementary science methods courses in Tennessee teacher education institutions

    NASA Astrophysics Data System (ADS)

    Hazari, Alan A.

    The purpose of the study was to determine the status of individualized science instruction in Tennessee teacher education institutions. Specifically, the study sought to investigate the extent of teaching about and/or use of 31 strategies for individualizing instruction in elementary science teaching methods courses. The individualized instruction frameworks, with strategies for individualizing instruction, were developed by Rowell, et al. in the College of Education at the University of Tennessee, Knoxville. A review of the literature on the preparation of preservice elementary science teachers for individualized instruction in K-8 classrooms revealed very limited research. This investigation sought to identify how the elementary science teacher educators prepared their preservice elementary science teachers to (1) learn about the children they will teach, (2) determine differences among learners, (3) plan for individualized science instruction in the elementary school classroom, and (4) help attend to individual student differences. The researcher prepared and used a 31-item survey to poll elementary science teacher educators in Tennessee. The participants included K-8 educators from 40 state-approved teacher education institutions. The high teacher education institution response rate (72.5%) brought input from institutions of varying sizes, operated privately or publicly, across the state of Tennessee. In general, Tennessee elementary science teacher educators reported that they tended to teach about and/or use a fair number of the 31 individualized instruction strategies that involve both learning about K-8 students and their differences. On the other hand, many of these educators provided preservice teachers with quite a few of the strategies that lead to planning for individualized science instruction and to attending to individual student differences.
The two strategies that were the most taught about and/or used in elementary science methods by Tennessee educators were planning for and maintaining an interactive classroom and implementing cooperative learning groups. The two strategies with the lowest rating were using a computer-tracking system to keep student profiles and using commercial tests to determine student placement. Almost 42% of the strategies in the survey were rated high to very high. This indicated that Tennessee educators do regularly include many of these 31 strategies in their elementary science methods courses. Examples include hands-on approach, cooperative learning, thematic and project teaching, learning centers, and the use of the Tennessee Instructional Model. The study also showed that Tennessee science teacher educators in church-related institutions appeared to utilize more of the 31 strategies for individualizing instruction in K-8 classrooms than do the educators in non-church-related institutions. Tennessee K-8 teachers could be better prepared if exposed to as many different and effective pedagogical tools and practices as possible during their education and preparation. A strong science program rich in content and a variety of instructional strategies (including individualized instruction) is needed to help maximize the science learning opportunities for all Tennessee students.

  10. Sundials in the shade: A study of women's persistence in the first year of a computer science program in a selective university

    NASA Astrophysics Data System (ADS)

    Powell, Rita Manco

    Currently women are underrepresented in departments of computer science, making up approximately 18% of the undergraduate enrollment in selective universities. Most attrition in computer science occurs early in this major, in the freshman and sophomore years, and women drop out in disproportionately greater numbers than their male counterparts. Taking an ethnographic approach to investigating women's experiences and progress in the first year courses in the computer science major at the University of Pennsylvania, this study examined the pre-college influences that led these women to the major and the nature of their experiences in and outside of class with faculty, peers, and academic support services. This study sought an understanding of the challenges these women faced in the first year of the major, with the goal of informing institutional practice about how to best support their persistence. The research reviewed for this study included patterns of leaving majors in science, math and engineering (Seymour & Hewitt, 1997), the high school preparation needed to pursue math and engineering majors in college (Strenta, Elliott, Adair, Matier, & Scott, 1994), and intervention programs that have positively impacted persistence of women in computer science (Margolis & Fisher, 2002). The research method of this study employed a series of personal interviews over the course of one calendar year with fourteen first year women who had either declared or intended to declare the computer science major in the School of Engineering and Applied Science at the University of Pennsylvania. Other data sources were focus groups and personal interviews with faculty, administrators, admissions and student life professionals, teaching assistants, female graduate students, and male first year students at the University of Pennsylvania.
This study found that the women in this study group came to the University of Pennsylvania with a thorough grounding in mathematics, but many either had an inadequate background in computer science, or at least perceived inadequacies in their background, which prevented them from beginning the major on an equal footing with their mostly male peers and caused some to lose confidence and consequently interest in the major. Issues also emanated from their gender-minority status in the Computer and Information Science Department, causing them to be socially isolated from their peers and further weakening their resolve to persist. These findings suggest that female first year students could benefit from multiple pathways into the major designed for students with varying degrees of prior experience with computer science. In addition, a computer science community within the department characterized by more frequent interaction and collaboration with faculty and peers could positively impact women's persistence in the major.

  11. Science alliance: A vital ORNL-UT partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richmond, C.R.; Riedinger, L.; Garritano, T.

    1991-01-01

    Partnerships between Department of Energy national laboratories and universities have long been keys to advancing scientific research and education in the United States. Perhaps the most enduring and closely knit of these relationships is the one between Oak Ridge National Laboratory and the University of Tennessee at Knoxville. Since its birth in the 1940s, ORNL has had a very special relationship with UT, and today the two institutions have closer ties than virtually any other university and national laboratory. Seven years ago, ORNL and UT began a new era of cooperation by creating the Science Alliance, a Center of Excellence at UT sponsored by the Tennessee Higher Education Commission. As the oldest and largest of these centers, the Science Alliance is the primary vehicle through which Tennessee promotes research and educational collaboration between UT and ORNL. By letting the two institutions pool their intellectual and financial resources, the alliance creates a more fertile scientific environment than either could achieve on its own. Part of the UT College of Liberal Arts, the Science Alliance is composed of four divisions (Biological Sciences, Chemical Sciences, Physical Sciences, and Mathematics and Computer Science) that team 100 of the university's top faculty with their outstanding colleagues from ORNL.

  12. Energy and technology review

    NASA Astrophysics Data System (ADS)

    Johnson, K. C.

    1991-04-01

    This issue of Energy and Technology Review discusses the various educational programs in which Lawrence Livermore National Laboratory (LLNL) participates or sponsors. LLNL has a long history of fostering educational programs for students from kindergarten through graduate school. A goal is to enhance the teaching of science, mathematics, and technology and thereby assist educational institutions to increase the pool of scientists, engineers, and technicians. LLNL programs described include: (1) contributions to the improvement of U.S. science education; (2) the LESSON program; (3) collaborations with Bay Area Science and Technology Education; (4) project HOPES; (5) lasers and fusion energy education; (6) a curriculum on global climate change; (7) computer and technology instruction at LLNL's Science Education Center; (8) the National Education Supercomputer Program; (9) project STAR; (10) the American Indian Program; (11) LLNL programs with historically Black colleges and Universities; (12) the Undergraduate Summer Institute on Contemporary Topics in Applied Science; (13) the National Physical Science Consortium: A Fellowship Program for Minorities and Women; (14) LLNL's participation with AWU; (15) the apprenticeship programs at LLNL; and (16) the future of LLNL's educational programs. An appendix lists all of LLNL's educational programs and activities. Contacts and their respective telephone numbers are given for all these programs and activities.

  13. Service-Oriented Architectures and Project Optimization for a Special Cost Management Problem Creating Synergies for Informed Change between Qualitative and Quantitative Strategic Management Processes

    DTIC Science & Technology

    2010-05-01

    University of the Federal Armed Forces of Germany, Institute for Theoretic Computer Science, Mathematics and Operations Research, Werner-Heisenberg-Weg 39, 85577 Neubiberg, Germany. Phone +49 89 6004 2400. Marco Schuler is an active Officer of the Federal

  14. Research summary, January 1989 - June 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at NASA ARC in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 62 universities with graduate programs in the aerospace sciences, under a Cooperative Agreement with NASA. RIACS serves as the representative of the USRA universities at ARC. This document reports our activities and accomplishments for the period 1 Jan. 1989 - 30 Jun. 1990. The following topics are covered: learning systems, networked systems, and parallel systems.

  15. Grids for Dummies: Featuring Earth Science Data Mining Application

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  16. The Critical Path Institute's approach to precompetitive sharing and advancing regulatory science.

    PubMed

    Woosley, R L; Myers, R T; Goodsaid, F

    2010-05-01

    Many successful large industries, such as computer-chip manufacturers, the cable television industry, and high-definition television developers,(1) have established successful precompetitive collaborations focusing on standards, applied science, and technology that advance the field for all stakeholders and benefit the public.(2) The pharmaceutical industry, however, has a well-earned reputation for fierce competition and did not demonstrate willingness to share data or knowledge until the US Food and Drug Administration (FDA) launched the Critical Path Initiative in 2004 (ref. 3).

  17. Neuroengineering control and regulation of behavior

    NASA Astrophysics Data System (ADS)

    Wróbel, A.; Radzewicz, C.; Mankiewicz, L.; Hottowy, P.; Knapska, E.; Konopka, W.; Kublik, E.; Radwańska, K.; Waleszczyk, W. J.; Wójcik, D. K.

    2014-11-01

    To monitor neuronal circuits involved in emotional modulation of sensory processing, we proposed a plan to establish novel research techniques combining recent biological, technical, and analytical discoveries. The project was funded by the National Science Centre, and we have started to build a new experimental model for studying selected circuits of genetically marked and behaviorally activated neurons. To achieve this goal we will combine the pioneering, interdisciplinary expertise of four Polish institutions: (i) the Nencki Institute of Experimental Biology (Polish Academy of Sciences) will deliver expertise on genetically modified mice and rats, mapping of the neuronal circuits activated by behavior, monitoring of complex behaviors measured in the IntelliCage system, electrophysiological brain-activity recordings by multielectrodes in behaving animals, and analysis and modeling of behavioral and electrophysiological data; (ii) the AGH University of Science and Technology (Faculty of Physics and Applied Computer Science) will use its experience in high-throughput electronics to build multichannel systems for recording the brain activity of behaving animals; (iii) the University of Warsaw (Faculty of Physics) and (iv) the Center for Theoretical Physics (Polish Academy of Sciences) will construct an optoelectronic device for remote control of the opto-animals produced at the Nencki Institute, drawing on their unique experience in laser sources, studies of light propagation and its interaction with condensed media, wireless medical robotic systems, fast-readout optoelectronics with control software, and micromechanics.

  18. The art and science of selecting graduate students in the biomedical sciences: Performance in doctoral study of the foundational sciences.

    PubMed

    Park, Hee-Young; Berkowitz, Oren; Symes, Karen; Dasgupta, Shoumita

    2018-01-01

    The goal of this study was to investigate associations between admissions criteria and performance in Ph.D. programs at Boston University School of Medicine. The initial phase of this project examined student performance in the classroom component of a newly established curriculum named "Foundations in Biomedical Sciences (FiBS)". Quantitative measures were used in the study, including undergraduate grade point average (GPA); graduate record examination (GRE; a standardized, computer-based test) scores for the verbal component (which assesses test takers' ability to analyze, evaluate, and synthesize written information and concepts) and the quantitative component (which assesses problem-solving ability); previous research experience; and competitiveness of the previous research institution. These criteria were compared with performance in the program, defined both as passing the curriculum and as being categorized a High Performer. These data indicated a significant positive correlation between FiBS performance and undergraduate GPA, GRE scores, and competitiveness of the undergraduate institution. No significant correlations were found between FiBS performance and research background. By taking a data-driven approach to examining admissions and performance, we hope to refine our admissions criteria, facilitate an unbiased approach to recruiting students in the life sciences, and share our strategy to support similar goals at other institutions.
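    The admissions analysis described above is, at its core, a correlation study between quantitative admissions metrics and program performance. A minimal sketch of the underlying computation follows; the numbers are purely illustrative, not study data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (made-up) values: undergraduate GPA vs. first-year course score.
gpa = [3.2, 3.5, 3.8, 3.0, 3.9, 3.6]
score = [78, 84, 90, 75, 93, 85]
r = pearson_r(gpa, score)  # strongly positive for these toy numbers
```

    A complete analysis would also report statistical significance and handle missing records; library routines such as scipy.stats.pearsonr return both the coefficient and a p-value.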

  19. Data Science Priorities for a University Hospital-Based Institute of Infectious Diseases: A Viewpoint.

    PubMed

    Valleron, Alain-Jacques

    2017-08-15

    Automation of laboratory tests, bioinformatic analysis of biological sequences, and professional data management have been used routinely in a modern university hospital-based infectious diseases institute since at least the 1980s. However, the scientific methods of the 21st century are changing with the increased power and speed of computers: the "big data" revolution has already happened in genomics and environmental science and is now arriving in medical informatics. Research will be increasingly "data driven," and the powerful machine learning methods whose efficiency is demonstrated in daily life will also revolutionize medical research. A university-based institute of infectious diseases must therefore not only gather excellent computer scientists and statisticians (as in the past, and as in any medical discipline) but also fully integrate its biologists and clinicians with computer scientists, statisticians, and mathematical modelers who have a broad culture in machine learning, knowledge representation, and knowledge discovery. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  20. The WHATs and HOWs of maturing computational and software engineering skills in Russian higher education institutions

    NASA Astrophysics Data System (ADS)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-05-01

    Russian higher education institutions' tradition of teaching large-enrollment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach with a Project-Driven Learning (PDL) framework. The PDL framework was developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the risk of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to rise to higher levels of computational and software engineering skill.

  1. Why so few women enroll in computing? Gender and ethnic differences in students' perception

    NASA Astrophysics Data System (ADS)

    Varma, Roli

    2010-12-01

    Women are seriously under-represented in computer science and computer engineering (CS/CE) education and, thus, in the information technology (IT) workforce in the USA. This is a grim situation both for the women whose potential remains unutilized and for US society, which is dependent on IT. This article examines the reasons behind the low enrollment of women in CS/CE education at institutions of higher education. It is based on 150 in-depth interviews of female and male undergraduate students majoring in CS/CE, members of five major ethnic groups (White, Afro-American, Hispanic, Asian American, and Native American) from seven Minority-Serving Institutions in the USA. The article finds bias in early socialization and anxiety toward technology to be the two main factors responsible for the under-representation of women in CS/CE education. It further shows significant gender and ethnic differences in students' responses on why so few women enroll in CS/CE.

  2. Teaching and Training in Geoinformatics: Experiences from the Cyberinfrastructure Summer Institute for Geoscientists (CSIG)

    NASA Astrophysics Data System (ADS)

    Smeekens, M.; Baru, C.; Keller, G. R.; Arrowsmith, R.; Crosby, C. J.

    2009-12-01

    The Cyberinfrastructure Summer Institute for Geoscientists (CSIG) has been conducted each year since 2004 under sponsorship of the GEON project that is funded by the NSF. The goal of the institute, which is broadly advertised to the Geoscience community, is to introduce geoscientists to Computer Science concepts and commonly-used as well as emergent information technology tools. The week-long program originally covered topics ranging from Data Modeling, Web Services, and Geographic Information Systems, to brief introductions to key concepts in Grid Computing, Parallel Programming, and Scientific Workflows. However, the program as well as the composition and expectations of the audience have evolved over time. Detailed course and instructor evaluations provide valuable feedback on course content and presentation approaches, and are used to plan future CSIG curriculum. From an initial emphasis on Geoscience graduate students and postdocs, the selection process has evolved to encourage participation by individuals with backgrounds in Geoscience as well as Computer Science from academia, government agencies, and industry. More recently, there has been an emphasis on selecting junior faculty and those interested in teaching Geoinformatics courses. While the initial objective of CSIG was to provide an overview of information technology topics via lectures and demonstrations, over time attendees have become more interested in specific instruction in how informatics and cyberinfrastructure (CI) capabilities could be utilized to address issues in Earth Science research and education. There have been requests over the years for more in-depth coverage on some topics and hands-on exercises. The program has now evolved to include a “Build Track”, focused on IT issues related to the development and implementation of Geoinformatics systems, and an “Education Track”, focused on use of Geoinformatics resources in education. 
With increasing awareness of CI projects, the audience is also becoming more interested in an introduction to the broader landscape of CI activities in the Geosciences and related areas. In the future, we plan a “demo” session to showcase various CI projects: attendees will not only hear about such projects but will be able to use and experience the cyber-environments and tools in a hands-on session. The evolution of the CSIG program reflects major changes in the IT landscape since 2004. Where we once discussed Grid Computing, students are now learning about Cloud Computing and related concepts. An institute like CSIG plays an important role in providing “cross-training”, such that geoscientists gain insight into IT issues and solution approaches, while computer scientists gain a better appreciation of the needs and requirements of geoscience applications. In this presentation, we will summarize and analyze the trends over the years in program content and audience composition; discuss lessons learnt; and present our plan for future CSIG offerings.

  3. A Simple Algorithm for Obtaining Nearly Optimal Quadrature Rules for NURBS-based Isogeometric Analysis

    DTIC Science & Technology

    2012-01-05

    Università degli Studi di Pavia; Istituto di Matematica Applicata e Tecnologie Informatiche “E. Magenes” del CNR, Pavia; DAEIMI, Università degli Studi di Cassino; Institute for Computational Engineering and Sciences, University of Texas at Austin; Dipartimento di Matematica, Università degli Studi di...

  4. Microelectronic Information Processing Systems: Computing Systems. Summary of Awards Fiscal Year 1994.

    ERIC Educational Resources Information Center

    National Science Foundation, Arlington, VA. Directorate for Computer and Information Science and Engineering.

    The purpose of this summary of awards is to provide the scientific and engineering communities with a summary of the grants awarded in 1994 by the National Science Foundation's Division of Microelectronic Information Processing Systems. Similar areas of research are grouped together. Grantee institutions and principal investigators are identified…

  5. Automatic crown cover mapping to improve forest inventory

    Treesearch

    Claude Vidal; Jean-Guy Boureau; Nicolas Robert; Nicolas Py; Josiane Zerubia; Xavier Descombes; Guillaume Perrin

    2009-01-01

    To automatically analyze near-infrared aerial photographs, the French National Institute for Research in Computer Science and Control developed, together with the French National Forest Inventory (NFI), a method for automatic crown cover mapping. This method uses a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm to locate the crowns and describe them using ellipses or...
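    A reversible-jump sampler of this kind explores configurations whose size (here, the number of crowns) is itself unknown, alternating dimension-changing "birth" and "death" moves. A minimal illustrative sketch, in which the toy energy function, the move mix, and all constants are assumptions for exposition rather than the NFI method:

```python
import math
import random

def energy(ellipses, points):
    """Toy energy: squared distance from each point to the nearest ellipse
    centre (fixed miss penalty when no ellipse exists), plus a per-ellipse
    penalty that discourages over-fitting the configuration size."""
    cost = 0.0
    for px, py in points:
        if ellipses:
            cost += min((px - cx) ** 2 + (py - cy) ** 2
                        for cx, cy, _a, _b in ellipses)
        else:
            cost += 1.0
    return cost + 0.5 * len(ellipses)

def rjmcmc_step(ellipses, points, rng, temp=1.0):
    """One dimension-changing move: propose adding (birth) or removing
    (death) an ellipse (cx, cy, a, b), then accept or reject with a
    simplified Metropolis ratio on the energy difference."""
    proposal = list(ellipses)
    if not proposal or rng.random() < 0.5:          # birth move
        proposal.append((rng.uniform(-5, 5), rng.uniform(-5, 5),
                         rng.uniform(0.5, 2.0), rng.uniform(0.5, 2.0)))
    else:                                           # death move
        proposal.pop(rng.randrange(len(proposal)))
    delta = energy(proposal, points) - energy(ellipses, points)
    if delta <= 0 or rng.random() < math.exp(-delta / temp):
        return proposal
    return ellipses
```

    Running many such steps and keeping the lowest-energy configuration yields a rough ellipse fit; the actual method instead scores ellipses against the near-infrared image and adds geometric priors, e.g. penalizing overlapping crowns.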

  6. KEYNOTE ADDRESS: The role of standards in the emerging optical digital data disk storage systems market

    NASA Astrophysics Data System (ADS)

    Bainbridge, Ross C.

    1984-09-01

    The Institute for Computer Sciences and Technology at the National Bureau of Standards is pleased to cooperate with the International Society for Optical Engineering and to join with the other distinguished organizations in cosponsoring this conference on applications of optical digital data disk storage systems.

  7. NASIC at MIT. Final Report, 1 March 1974 through 28 February 1975.

    ERIC Educational Resources Information Center

    Benenfeld, Alan R.; And Others

    Computer-based reference search services were provided to users on a fee-for-service basis at the Massachusetts Institute of Technology as the first, and experimental, node in the development of the Northeast Academic Science Information Center (NASIC). Development of a training program for information specialists and training materials is…

  8. A Hands-On Approach for Teaching Denial of Service Attacks: A Case Study

    ERIC Educational Resources Information Center

    Trabelsi, Zouheir; Ibrahim, Walid

    2013-01-01

    Nowadays, many academic institutions include ethical hacking in their information security and Computer Science programs. Information security students need to experiment with common ethical hacking techniques in order to be able to implement the appropriate security solutions. This will allow them to more efficiently protect the confidentiality,…

  9. Learning Styles of ICT Specialisation Students: Do Differences in Disciplines Exist?

    ERIC Educational Resources Information Center

    de Salas, Kristy; Lewis, Ian; Dermoudy, Julian

    2014-01-01

    Within existing ICT degrees there is a widely-held belief that content must be tailored for different "kinds" of students--often two differing student groups: a technical group requiring detailed Computer Science knowledge and a separate group requiring less technical, more strategic ICT knowledge and skills. Our institution has produced…

  10. Methods and successes of New York University workshops for science graduate students and post-docs in science writing for general audiences (readers and radio listeners)

    NASA Astrophysics Data System (ADS)

    Hall, S. S.

    2012-12-01

    Scientists and science administrators often stress the importance of communication to the general public, but rarely develop educational infrastructures to achieve this goal. Since 2009, the Arthur L. Carter Journalism Institute at New York University has offered a series of basic and advanced writing workshops for graduate students and post-docs in NYU's eight scientific divisions (neuroscience, psychology, physics, biology, chemistry, mathematics, anthropology, and computer science). The basic methodology of the NYU approach will be described, along with successful examples of both written and radio work by students that have been either published or broadcast by general interest journalism outlets.

  11. Proceedings of the 5. joint Russian-American computational mathematics conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-12-31

    These proceedings contain a record of the talks presented and papers submitted by participants. The conference participants represented three institutions from the United States: Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), and Lawrence Livermore National Laboratory (LLNL); and two from Russia: Russian Federal Nuclear Center--All Russian Research Institute of Experimental Physics (RFNC-VNIIEF/Arzamas-16) and Russian Federal Nuclear Center--All Russian Research Institute of Technical Physics (RFNC-VNIITF/Chelyabinsk-70). The presentations and papers cover a wide range of applications from radiation transport to materials. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  12. The Pisgah Astronomical Research Institute

    NASA Astrophysics Data System (ADS)

    Cline, J. Donald; Castelaz, M.

    2009-01-01

    Pisgah Astronomical Research Institute is a not-for-profit foundation located at a former NASA tracking station in the Pisgah National Forest in western North Carolina. PARI is celebrating its 10th year. During its ten years, PARI has developed and implemented innovative science education programs. The science education programs are hands-on experimentally based, mixing disciplines in astronomy, computer science, earth and atmospheric science, engineering, and multimedia. The basic tools for the educational programs include a 4.6-m radio telescope accessible via the Internet, a StarLab planetarium, the Astronomical Photographic Data Archive (APDA), a distributed computing online environment to classify stars called SCOPE, and remotely accessible optical telescopes. The PARI 200 acre campus has a 4.6-m, a 12-m and two 26-m radio telescopes, optical solar telescopes, a Polaris monitoring telescope, 0.4-m and 0.35-m optical research telescopes, and earth and atmospheric science instruments. PARI is also the home of APDA, a repository for astronomical photographic plate collections which will eventually be digitized and made available online. PARI has collaborated with visiting scientists who have developed their research with PARI telescopes and lab facilities. Current experiments include: the Dedicated Interferometer for Rapid Variability (Dennison et al. 2007, Astronomical and Astrophysical Transactions, 26, 557); the Plate Boundary Observatory operated by UNAVCO; the Clemson University Fabry-Perot Interferometers (Meriwether 2008, Journal of Geophysical Research, submitted) measuring high velocity winds and temperatures in the Thermosphere, and the Western Carolina University - PARI variable star program. Current status of the education and research programs and instruments will be presented. Also, development plans will be reviewed. 
Development plans include the greening of PARI with the installation of solar panels to power the optical telescopes, a new distance learning center, and enhancements to the atmospheric and earth science suite of instrumentation.

  13. Research Reports: 1988 NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Freeman, L. Michael (Editor); Chappell, Charles R. (Editor); Cothran, Ernestine K. (Editor); Karr, Gerald R. (Editor)

    1988-01-01

    The basic objectives are to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of the participants' institutions; and to contribute to the research objectives of the NASA centers. Topics addressed include: cryogenics; thunderstorm simulation; computer techniques; computer-assisted instruction; systems analysis; weather forecasting; rocket engine design; crystal growth; control systems design; turbine pumps for the Space Shuttle Main Engine; electron mobility; heat transfer predictions; rotor dynamics; mathematical models; computational fluid dynamics; and structural analysis.

  14. Cumulative Reports and Publications through December 31, 1991 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1992-02-01

    universities and industry who have resident appointments for limited periods of time, and by consultants. Members of NASA's research staff also may be... Submitted to Journal of Computational Physics. Banks, H. T., G. Propst, and R. J. Silcox: A comparison of time domain boundary conditions for acoustic... 2, pp. 117-145, 1991. Nicol, David M.: The cost of conservative synchronization in parallel discrete event simulations. ICASE Report No. 90-20, May

  15. Beyond the first "click:" Women graduate students in computer science

    NASA Astrophysics Data System (ADS)

    Sader, Jennifer L.

    This dissertation explored the ways that constructions of gender shaped the choices and expectations of women doctoral students in computer science. Women who do graduate work in computer science still operate in an environment where they are in the minority. How much of women's underrepresentation in computer science fields results from a problem of imagining women as computer scientists? As long as women in these fields are seen as exceptions, they are exceptions that prove the "rule" that computing is a man's domain. The following questions were the focus of this inquiry: What are the career aspirations of women doctoral students in computer science? How do they feel about their chances to succeed in their chosen career and field? How do women doctoral students in computer science construct womanhood? What are their constructions of what it means to be a computer scientist? In what ways, if any, do they believe their gender has affected their experience in their graduate programs? The goal was to examine how constructions of computer science and of gender---including participants' own understanding of what it meant to be a woman, as well as the messages they received from their environment---contributed to their success as graduate students in a field where women are still greatly outnumbered by men. Ten women from four different institutions of higher education were recruited to participate in this study. These women varied in demographic characteristics like age, race, and ethnicity. Still, there were many common threads in their experiences. For example, their construction of womanhood did not limit their career prospects to traditionally female jobs. They had grown up with the expectation that they would be able to succeed in whatever field they chose. Most also had very positive constructions of programming as something that was "fun," rewarding, and intellectually stimulating. 
Their biggest obstacles were feelings of isolation and a resulting loss of confidence. Implications for future research are provided. There are also several implications for practice, especially the recommendation that graduate schools provide more support for all of their students. The experiences of these women also suggest ways to more effectively recruit women students to computer science. The importance of women faculty in these students' success also suggests that schools trying to counteract gender imbalances should actively recruit women faculty to teach in fields where women are underrepresented. These faculty serve as important role models and mentors to women students in their field.

  16. Understanding initial undergraduate expectations and identity in computing studies

    NASA Astrophysics Data System (ADS)

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-03-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering, in which industry needs are high but so are student dropout rates. An important factor to consider is the management of students' initial expectations of university study and career. This paper reports on a study of CS first-year students' expectations across three European countries using qualitative data from student surveys and essays. Expectation is examined from both short-term (topics to be studied) and long-term (career goals) perspectives. Tackling these issues will help paint a picture of computing education through students' eyes and explore their vision of its role, and their own, in society. It will also help educators prepare students more effectively for university study and improve the student experience.

  17. GSDC: A Unique Data Center in Korea for HEP research

    NASA Astrophysics Data System (ADS)

    Ahn, Sang-Un

    2017-04-01

    The Global Science experimental Data hub Center (GSDC) at the Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea, established to promote fundamental research fields by supporting them with expertise in Information and Communication Technology (ICT) and with infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC), and networking. GSDC has supported various research fields in South Korea dealing with large-scale data, e.g. the RENO experiment for neutrino research, the LIGO experiment for gravitational wave detection, a genome sequencing project for bio-medical research, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for the ALICE experiment at the LHC at CERN since 2013. In this talk, we present an overview of the computing infrastructure that GSDC runs for these research fields and discuss the data center infrastructure management system deployed at GSDC.

  18. The assessment of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Benn, Karen P.

    1994-01-01

    This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, a lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.

  19. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and the ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  20. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research.

    PubMed

    Yang, Jack Y; Niemierko, Andrzej; Bajcsy, Ruzena; Xu, Dong; Athey, Brian D; Zhang, Aidong; Ersoy, Okan K; Li, Guo-Zheng; Borodovsky, Mark; Zhang, Joe C; Arabnia, Hamid R; Deng, Youping; Dunker, A Keith; Liu, Yunlong; Ghafoor, Arif

    2010-12-01

Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), the International Society of Intelligent Biological Medicine (http://www.ISIBM.org), the International Journal of Computational Biology and Drug Design (IJCBDD) and the International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors worldwide. It was the only inter/multidisciplinary conference aimed at promoting synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders, including Dr. Michael S. Waterman (USC, Member of the United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of the United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of the United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of the United States National Academy of Engineering and Member of the United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K. Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. Joydeep Ghosh (UT Austin), Dr. Aidong Zhang (Buffalo) and Dr. Zhi-Hua Zhou (Nanjing) for their significant contributions to the field of intelligent biological medicine.

  1. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research

    PubMed Central

    2010-01-01

Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), the International Society of Intelligent Biological Medicine (http://www.ISIBM.org), the International Journal of Computational Biology and Drug Design (IJCBDD) and the International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors worldwide. It was the only inter/multidisciplinary conference aimed at promoting synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders, including Dr. Michael S. Waterman (USC, Member of the United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of the United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of the United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of the United States National Academy of Engineering and Member of the United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K. Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. Joydeep Ghosh (UT Austin), Dr. Aidong Zhang (Buffalo) and Dr. Zhi-Hua Zhou (Nanjing) for their significant contributions to the field of intelligent biological medicine. PMID:21143775

  2. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures: it is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. With this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of the Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.
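The pattern this record describes can be illustrated with a minimal sketch: several cloud back-ends (EC2, OpenNebula, OpenStack, CloudStack in the actual system) hidden behind a single VM-lifecycle interface that instantiates, monitors, and manages Virtual Machines across the aggregated resources. This is NOT VMDIRAC code; every class and method name below is hypothetical and the clouds are in-memory stand-ins.

```python
# Hypothetical sketch of a multi-cloud VM scheduler (not VMDIRAC itself).
from abc import ABC, abstractmethod

class CloudEndpoint(ABC):
    """One cloud infrastructure, commercial or institutional."""
    @abstractmethod
    def instantiate(self, image: str) -> str: ...
    @abstractmethod
    def status(self, vm_id: str) -> str: ...
    @abstractmethod
    def terminate(self, vm_id: str) -> None: ...

class FakeCloud(CloudEndpoint):
    """Stand-in for a real driver such as an OpenStack or EC2 client."""
    def __init__(self, name: str):
        self.name = name
        self._vms = {}
        self._next = 0

    def instantiate(self, image: str) -> str:
        self._next += 1
        vm_id = f"{self.name}-{self._next}"
        self._vms[vm_id] = "RUNNING"
        return vm_id

    def status(self, vm_id: str) -> str:
        return self._vms.get(vm_id, "UNKNOWN")

    def terminate(self, vm_id: str) -> None:
        self._vms[vm_id] = "HALTED"

class VMScheduler:
    """Aggregates endpoints; places, monitors and manages VMs across them."""
    def __init__(self, endpoints):
        self.endpoints = endpoints
        self.placement = {}  # vm_id -> endpoint that hosts it

    def launch(self, image: str) -> str:
        # Naive round-robin placement over the aggregated clouds.
        ep = self.endpoints[len(self.placement) % len(self.endpoints)]
        vm_id = ep.instantiate(image)
        self.placement[vm_id] = ep
        return vm_id

    def monitor(self):
        return {vm: ep.status(vm) for vm, ep in self.placement.items()}

sched = VMScheduler([FakeCloud("site-a"), FakeCloud("site-b")])
a = sched.launch("lhcb-worker")
b = sched.launch("lhcb-worker")
print(sched.monitor())  # both VMs report RUNNING
```

The design point is the one the abstract names: callers see only the lifecycle interface, so adding another infrastructure type means adding one driver, not changing the scheduler.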

  3. A study of the effects of gender and different instructional media (computer-assisted instruction tutorials vs. textbook) on student attitudes and achievement in a team-taught integrated science class

    NASA Astrophysics Data System (ADS)

    Eardley, Julie Anne

    The purpose of this study was to determine the effect of different instructional media (computer assisted instruction (CAI) tutorial vs. traditional textbook) on student attitudes toward science and computers and achievement scores in a team-taught integrated science course, ENS 1001, "The Whole Earth Course," which was offered at Florida Institute of Technology during the Fall 2000 term. The effect of gender on student attitudes toward science and computers and achievement scores was also investigated. This study employed a randomized pretest-posttest control group experimental research design with a sample of 30 students (12 males and 18 females). Students had registered for weekly lab sessions that accompanied the course and had been randomly assigned to the treatment or control group. The treatment group used a CAI tutorial for completing homework assignments and the control group used the required textbook for completing homework assignments. The Attitude toward Science and Computers Questionnaire and Achievement Test were the two instruments administered during this study to measure students' attitudes and achievement score changes. A multivariate analysis of covariance (MANCOVA), using hierarchical multiple regression/correlation (MRC), was employed to determine: (1) treatment versus control group attitude and achievement differences; and (2) male versus female attitude and achievement differences. The differences between the treatment group's and control group's homework averages were determined by t test analyses. The overall MANCOVA model was found to be significant at p < .05. Examining research factor set independent variables separately resulted in gender being the only variable that significantly contributed in explaining the variability in a dependent variable, attitudes toward science and computers. T test analyses of the homework averages showed no significant differences. 
Contrary to the findings of this study, anecdotal information from personal communication, course evaluations, and homework assignments indicated favorable attitudes and higher achievement scores for a majority of the students in the treatment group.

  4. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Coghlan, Susan; Yelick, Katherine

The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, from performance and usability to cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  5. Downscaling Climate Science to the Classroom: Diverse Opportunities for Teaching Climate Science in Diverse Ways to Diverse Undergraduate Populations

    NASA Astrophysics Data System (ADS)

    Jones, R. M.; Gill, T. E.; Quesada, D.; Hedquist, B. C.

    2015-12-01

Climate literacy and climate education are important topics in current socio-political debate. Despite numerous scientific findings supporting global climate change and accelerated greenhouse warming, there is a social inertia resisting and slowing the rate at which many of our students understand and absorb these facts. A variety of reasons, including socio-economic interests, political and ideological biases, misinformation from mass media, inappropriate preparation of science teachers, and lack of numeracy, have created serious challenges for public awareness of such an important issue. Different agencies and organizations (NASA, NOAA, EPA, AGU, APS, AMS and others) have created training programs for educators not involved directly in climatology research, in order to learn climate science in a consistent way and then communicate it to the public and students. Different approaches to delivering such information to undergraduate students in diverse environments are discussed, based on the authors' experiences working in different minority-serving institutions across the nation and attending AMS Weather and Climate Studies training workshops, MSI-REACH, and the School of Ice. Different parameters are included in the analysis: demographics of students, size of the institutions, geographical locations, target audience, programs students are enrolled in, conceptual units covered, and availability of climate-related courses in the curricula. Additionally, the feasibility of incorporating a laboratory and quantitative analysis is analyzed. As a result of these comparisons, it seems that downscaling of climate education experiences does not always work as expected in every institution regardless of the student body demographics. Different geographical areas, student body characteristics and type of institution determine the approach to be adopted as well as the feasibility of introducing different components for weather and climate studies.
Some ideas are shared on how to integrate meteorology and climatology topics in other disciplines: Biology, Geology, Mathematics, Chemistry, Computer Science, and Science Methods. Such approaches might help small institutions with curriculum constraints to not fall behind in communicating climate science to the populations they serve.

  6. Algorithmic trends in computational fluid dynamics; The Institute for Computer Applications in Science and Engineering (ICASE)/LaRC Workshop, NASA Langley Research Center, Hampton, VA, US, Sep. 15-17, 1991

    NASA Technical Reports Server (NTRS)

    Hussaini, M. Y. (Editor); Kumar, A. (Editor); Salas, M. D. (Editor)

    1993-01-01

The purpose here is to assess the state of the art in the areas of numerical analysis that are particularly relevant to computational fluid dynamics (CFD), to identify promising new developments in various areas of numerical analysis that will impact CFD, and to establish a long-term perspective focusing on opportunities and needs. Overviews are given of discretization schemes, computational fluid dynamics, algorithmic trends in CFD for aerospace flow field calculations, simulation of compressible viscous flow, and massively parallel computation. Also discussed are acceleration methods, spectral and high-order methods, multi-resolution and subcell resolution schemes, and inherently multidimensional schemes.

  7. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortlessly within reach for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  8. Oak Ridge National Laboratory Core Competencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberto, J.B.; Anderson, T.D.; Berven, B.A.

    1994-12-01

A core competency is a distinguishing integration of capabilities which enables an organization to deliver mission results. Core competencies represent the collective learning of an organization and provide the capacity to perform present and future missions. Core competencies are distinguishing characteristics which offer comparative advantage and are difficult to reproduce. They exhibit customer focus, mission relevance, and vertical integration from research through applications. They are demonstrable by metrics such as level of investment, uniqueness of facilities and expertise, and national impact. The Oak Ridge National Laboratory (ORNL) has identified four core competencies which satisfy the above criteria. Each core competency represents an annual investment of at least $100M and is characterized by an integration of Laboratory technical foundations in physical, chemical, and materials sciences; biological, environmental, and social sciences; engineering sciences; and computational sciences and informatics. The ability to integrate broad technical foundations to develop and sustain core competencies in support of national R&D goals is a distinguishing strength of the national laboratories. The ORNL core competencies are: Energy Production and End-Use Technologies; Biological and Environmental Sciences and Technology; Advanced Materials Synthesis, Processing, and Characterization; and Neutron-Based Science and Technology. The distinguishing characteristics of each ORNL core competency are described. In addition, written material is provided for two emerging competencies: Manufacturing Technologies and Computational Science and Advanced Computing. Distinguishing institutional competencies in the Development and Operation of National Research Facilities, R&D Integration and Partnerships, Technology Transfer, and Science Education are also described. Finally, financial data for the ORNL core competencies are summarized in the appendices.

  9. IS Course Success in Liberal Arts Institutions -- What's the Formula?

    ERIC Educational Resources Information Center

    Ghosh, Suvankar; Naik, Bijayananda; Li, Xiaolin

    2014-01-01

    Much of IS pedagogy research has focused on IS programs in business schools or in computer science departments. Insufficient attention has been given to assessing IS pedagogy in business schools without an IS major and in a strong liberal arts environment where skepticism about IS education is high. We describe a newly-designed IS core course that…

  10. A Survey of Immersive Technology For Maintenance Evaluations

    DTIC Science & Technology

    1998-04-01

image display system. Based on original work performed at the German National Computer Science and Mathematics Research Institute (GMD), and further...simulations, architectural walk-throughs, medical simulations, general research, entertainment applications and location-based entertainment use...simulations. This study was conducted as part of a logistics research and development program, Design Evaluation for Personnel, Training, and Human Factors

  11. Common Database Interface for Heterogeneous Software Engineering Tools.

    DTIC Science & Technology

    1987-12-01

SUB-GROUP Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces; 19. ABSTRACT (Continue on reverse...Air Force Institute of Technology, Air University, In Partial Fulfillment of the Requirements for the Degree of Master of Science in Information Systems...Literature ..... 8 System 690 Configuration ......... 8 Database Functions ............ 14 Software Engineering Environments ... 14 Data Manager

  12. 1985 Annual Technical Report: A Research Program in Computer Technology. July 1984--June 1985.

    ERIC Educational Resources Information Center

    University of Southern California, Marina del Rey. Information Sciences Inst.

    Summaries of research performed by the Information Sciences Institute at the University of Southern California for the U.S. Department of Defense Advanced Research Projects Agency in 17 areas are provided in this report: (1) Common LISP framework, an exportable version of the Formalized Software Development (FSD) testbed; (2) Explainable Expert…

  13. Research Institute for Technical Careers

    NASA Technical Reports Server (NTRS)

    Glenn, Ronald L.

    1996-01-01

    The NASA research grant to Wilberforce University enabled us to establish the Research Institute for Technical Careers (RITC) in order to improve the teaching of science and engineering at Wilberforce. The major components of the research grant are infrastructure development, establishment of the Wilberforce Intensive Summer Experience (WISE), and Joint Research Collaborations with NASA Scientists. (A) Infrastructure Development. The NASA grant has enabled us to improve the standard of our chemistry laboratory and establish the electronics, design, and robotics laboratories. These laboratories have significantly improved the level of instruction at Wilberforce University. (B) Wilberforce Intensive Summer Experience (WISE). The WISE program is a science and engineering bridge program for prefreshman students. It is an intensive academic experience designed to strengthen students' knowledge in mathematics, science, engineering, computing skills, and writing. (C) Joint Collaboration. Another feature of the grant is research collaborations between NASA Scientists and Wilberforce University Scientists. These collaborations have enabled our faculty and students to conduct research at NASA Lewis during the summer and publish research findings in various journals and scientific proceedings.

  14. The OMICS of Sports & Space: How Genomics is Transforming Both Fields

    NASA Technical Reports Server (NTRS)

    Reeves, Katherine

    2016-01-01

Join David Epstein, author of the top-10 New York Times bestseller “The Sports Gene,” and NASA Twins Study investigator Christopher E. Mason, Ph.D., in the debate as old as physical competition: nature versus nurture. From personal experience, Epstein tackles the great debate and traces how far science has come in solving this timeless riddle, and how genetics has entered the field of sports. He is an investigative science reporter for ProPublica and a longtime contributor to Sports Illustrated. Epstein will share insights into performance-enhancing drugs, the lucky genetics that separate a professional athlete from a less talented athlete, and his research into the death of a friend with Hypertrophic Cardiomyopathy (HCM). From an epigenomic viewpoint, Mason examines the benefits and risks for astronauts who face extreme spaceflight conditions and what they mean for the future of human space travel. He is an associate professor in the Department of Physiology and Biophysics, The Feil Family Brain and Mind Research Institute (BMRI) and The Institute for Computational Biomedicine at Weill Cornell Medicine. He is also part of the Tri-Institutional Program on Computational Biology and a Medicine Fellow of Genomics, Ethics, and Law in the Information Society Project at Yale Law School. The study of omics shows tremendous potential in the prevention, diagnosis and treatment of injuries and diseases, but genetic discrimination and molecular privacy concerns are raised in both sports and space.

  15. Proceedings of Image Understanding Workshop Held at Miami, Florida on 9- 10 December 1985

    DTIC Science & Technology

    1985-12-01

established powerful techniques for adaptation and change in these networks (Feldman, 1982). A major milestone was achieved with Sabbah's thesis ...guation." Ph.D. thesis, Computer Science Dept., Univ. Rochester, April 1985; also TR145, Computer Science Dept., Univ. Rochester, May 1985...ience Master's Thesis, 1985. Fleck, Margaret, "Local Rotational Symmetries," Massachusetts Institute of Technology, Department of Electrical

  16. Three two-week enhancement institutes: Design and implementation of the technology and telecomputing component

    NASA Technical Reports Server (NTRS)

    Hale, L. Vincent

    1995-01-01

The Teacher Enhancement Institute (TEI), under the direction of the Center Education Programs Officer, offered three two-week workshops to 58 elementary and middle school teachers in science, math, and technology using the Problem Based Learning Model. The 1995 program was designed with input from evaluations and recommendations from previous TEI participants and faculty. The TEI focused on Aviation and Aeronautics as the unifying theme. Four specific objectives were developed. After completing the requirements for the TEI, the participants should be able to: (1) Increase their content knowledge, particularly in aeronautics, science, math, and technology; (2) Design and implement lessons that use scientific inquiry through Problem Based Learning; (3) Demonstrate knowledge of instructional technologies, their uses, and applications to curricula; and (4) Disseminate to their school communities the information acquired through the TEI. Thirty percent of the program was devoted to the effective use of computer technology. SpaceLink, the NASA telecomputing service for educators, was the primary tool used in the technology component of the institute. The training focused on the use of SpaceLink and its many educational services, and on Internet tools, because of its universal, nongraphical link to any computer platform the participant may use at his or her school or home. All participants were given Educator Accounts to facilitate the use of E-mail, and access to the Internet and the World Wide Web using their SpaceLink accounts. Classroom demonstrations used videotaped guides and handouts to support concepts presented, followed by intensive hands-on activities. Each participant was assigned to an individual Power Mac networked workstation and introduced to the state-of-the-art graphical World Wide Web with the Netscape browser.
The methodology proved very effective in reaching the program's goals for technology integration by having the participants learn to use the computer as a tool for communication and research rather than teaching the use of any particular software application alone. However, because of the skill level of the majority of the participants, more hands-on computer time is recommended for future Teacher Enhancement Institutes.

  17. A bibliometric and visual analysis of global geo-ontology research

    NASA Astrophysics Data System (ADS)

    Li, Lin; Liu, Yu; Zhu, Haihong; Ying, Shen; Luo, Qinyao; Luo, Heng; Kuai, Xi; Xia, Hui; Shen, Hang

    2017-02-01

In this paper, the results of a bibliometric and visual analysis of geo-ontology research articles collected from the Web of Science (WOS) database between 1999 and 2014 are presented. The numbers of national institutions and published papers are visualized and a global research heat map is drawn, providing an overview of global geo-ontology research. In addition, we present a chord diagram of countries and perform a visual cluster analysis of a knowledge co-citation network of references, disclosing potential academic communities and identifying key points, main research areas, and future research trends. The International Journal of Geographical Information Science, Progress in Human Geography, and Computers & Geosciences are the most active journals. The USA makes the largest contribution to geo-ontology research by virtue of its highest numbers of independent and collaborative papers, and its dominance was also confirmed in the country chord diagram. The majority of institutions are in the USA, Western Europe, and Eastern Asia. Wuhan University, the University of Munster, and the Chinese Academy of Sciences are notable geo-ontology institutions. Keywords such as "Semantic Web," "GIS," and "space" have attracted a great deal of attention. "Semantic granularity in ontology-driven geographic information systems," "Ontologies in support of activities in geographical space," and "A translation approach to portable ontology specifications" have the highest cited centrality. Geographical space, computer-human interaction, and ontology cognition are the three main research areas of geo-ontology. The semantic mismatch between the producers and users of ontology data, as well as error propagation in interdisciplinary and cross-linguistic data reuse, needs to be solved. In addition, the development of geo-ontology modeling primitives based on OWL (Web Ontology Language) and methods to automatically rework data in the Semantic Web are needed.
Furthermore, the topological relations between geographical entities still require further study.
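The co-citation network underlying an analysis like the one in this record rests on a simple counting step: two references are co-cited once for every paper whose reference list contains both. A minimal sketch, using toy data rather than anything from the study:

```python
# Illustrative co-citation counting (toy reference lists, not study data).
from itertools import combinations
from collections import Counter

# Each citing paper's reference list (hypothetical identifiers).
papers = [
    ["Guarino1998", "Gruber1993", "Fonseca2002"],
    ["Gruber1993", "Fonseca2002"],
    ["Gruber1993", "Kuhn2001"],
]

# Count each unordered pair of references that appears together in a list.
cocitation = Counter()
for refs in papers:
    for pair in combinations(sorted(set(refs)), 2):
        cocitation[pair] += 1

print(cocitation[("Fonseca2002", "Gruber1993")])  # co-cited by 2 papers
```

The resulting weighted pair counts are the edges of the co-citation network; cluster analysis and centrality measures (such as the cited centrality mentioned above) are then computed on that graph.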

  18. Live theater on a virtual stage: incorporating soft skills and teamwork in computer graphics education.

    PubMed

    Schweppe, M; Geigel, J

    2011-01-01

    Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.

  19. CyVerse Data Commons: lessons learned in cyberinfrastructure management and data hosting from the Life Sciences

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Walls, R.; Merchant, N.

    2017-12-01

CyVerse is a US National Science Foundation funded initiative "to design, deploy, and expand a national cyberinfrastructure for life sciences research, and to train scientists in its use," supporting and enabling cross-disciplinary collaborations across institutions. CyVerse's free, open-source cyberinfrastructure is being adopted into biogeoscience and space sciences research. CyVerse's data-science-agnostic platforms provide shared data storage, high performance computing, and cloud computing that allow analysis of very large data sets (including incomplete or work-in-progress data sets). Part of CyVerse's success has been in addressing the handling of data through its entire lifecycle, from creation to final publication in a digital data repository to reuse in new analyses. CyVerse developers and user communities have learned many lessons that are germane to Earth and Environmental Science. We present an overview of the tools and services available through CyVerse, including: interactive computing with the Discovery Environment (https://de.cyverse.org/), an interactive data science workbench featuring data storage and transfer via the Data Store; cloud computing with Atmosphere (https://atmo.cyverse.org); and access to HPC via the Agave API (https://agaveapi.co/). Each CyVerse service emphasizes access to long-term data storage, including our own Data Commons (http://datacommons.cyverse.org), as well as external repositories. The Data Commons service manages, organizes, preserves, publishes, and allows for the discovery and reuse of data. All data published to CyVerse's Curated Data receive a permanent identifier (PID) in the form of a DOI (Digital Object Identifier) or ARK (Archival Resource Key). Data that are more fluid can also be published in the Data Commons through Community Collaborated data. The Data Commons provides landing pages and permanent DOIs or ARKs, and supports data reuse and citation through features such as open data licenses and downloadable citations.
The ability to access and compute on data within the CyVerse framework, or with external compute resources when necessary, has proven highly beneficial to our user community, which has grown continuously since the inception of CyVerse nine years ago.

  20. Pacific Research Platform - Creation of a West Coast Big Data Freeway System Applied to the CONNected objECT (CONNECT) Data Mining Framework for Earth Science Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Sellars, S. L.; Nguyen, P.; Tatar, J.; Graham, J.; Kawsenuk, B.; DeFanti, T.; Smarr, L.; Sorooshian, S.; Ralph, M.

    2017-12-01

A new era in computational earth sciences is within our grasp, with the availability of ever-increasing earth observational data, enhanced computational capabilities, and innovative computational approaches that allow for the assimilation, analysis and modeling of complex earth science phenomena. The Pacific Research Platform (PRP), CENIC and associated technologies such as the Flash I/O Network Appliance (FIONA) provide scientists a unique capability for advancing towards this new era. This presentation reports on the development of multi-institutional rapid data access capabilities and a data pipeline for applying a novel image characterization and segmentation approach, the CONNected objECT (CONNECT) algorithm, to study Atmospheric River (AR) events impacting the Western United States. ARs are often associated with torrential rains, swollen rivers, flash flooding, and mudslides. CONNECT is computationally intensive, relying on very large data transfers, storage and data mining techniques. The ability to apply the method to multiple variables and datasets located at different University of California campuses has previously been challenged by inadequate network bandwidth and computational constraints. The presentation will highlight how the inter-campus CONNECT data mining framework improved from our prior download speeds of 10 MB/s to 500 MB/s using the PRP and the FIONAs. We present a worked example using the NASA MERRA data to describe how the PRP and FIONA have provided researchers with the capability for advancing knowledge about ARs. Finally, we will discuss future efforts to expand the scope to additional variables in the earth sciences.
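The segmentation step this record alludes to, grouping grid cells into "connected objects," can be illustrated by connected-component labeling of a thresholded 2-D field. This sketch is a generic stand-in, not the CONNECT algorithm itself; the toy field and threshold are invented for illustration.

```python
# Illustrative 4-connected component labeling of a thresholded grid
# (a generic stand-in for object-based segmentation, not CONNECT).
from collections import deque

# Toy 2-D field; cells with values >= THRESH are "object candidate" cells.
field = [
    [0, 3, 3, 0, 0],
    [0, 0, 3, 0, 4],
    [0, 0, 0, 0, 4],
]
THRESH = 2

def label_components(grid, thresh):
    """Assign each above-threshold cell a label; touching cells share one."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= thresh and labels[r][c] == 0:
                current += 1  # start a new object, flood-fill its extent
                labels[r][c] = current
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= thresh
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return current, labels

n_objects, labels = label_components(field, THRESH)
print(n_objects)  # 2 distinct connected objects in the toy field
```

In an AR-style application, the field would be something like integrated vapor transport on a latitude-longitude grid, and each labeled object could then be characterized by size, shape, and location; the computational cost the record emphasizes comes from running this over many variables and long time series.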

  1. The space physics analysis network

    NASA Astrophysics Data System (ADS)

    Green, James L.

    1988-04-01

The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for space and Earth scientists and has been operational for nearly 7 years. SPAN, with its extension into Europe, utilizes computer-to-computer communications allowing mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used to successfully transfer real-time data to remote researchers for rapid data analysis, but its primary function is for non-real-time applications. One of the major advantages of using SPAN is its spacecraft mission independence. Space science researchers using SPAN are located in universities, industries and government institutions all across the United States and Europe. These researchers are in such fields as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science data bases, mission planning and information systems, and computational facilities for the purposes of facilitating correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, is providing facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN has interconnections with several national and international networks such as HEPNET and TEXNET, forming a transparent DECnet network. The combined total number of computers now reachable over these combined networks is about 2000. In addition, SPAN supports full function capabilities over the international public packet switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  2. Endowment and Education

    NASA Astrophysics Data System (ADS)

    Moore, John W.

    2000-05-01

The 1998 annual report of the Research Corporation ( http://www.rescorp.org) contains fascinating reading for anyone with an interest in science education at private institutions. An article titled "The Midas Touch: Do Soaring Endowments Have Any Impact on College Science" concludes that "college science is seldom more than an incidental beneficiary of endowment resources, even when they are conspicuously plentiful." Written by Research Corporation director of communication W. Stevenson Bacon, the article reports on a survey of leading undergraduate institutions, dividing them between those with endowments above and below $300 million. The first surprise to me was that Harvard's endowment of $727,522 per full-time equivalent (FTE) student is exceeded by Grinnell's $760,404, and Yale's $612,015 per FTE student is far exceeded by Agnes Scott's $692,914 (much of it in Coca-Cola stock and somewhat restricted) and closely rivaled by Swarthmore's $608,955. Of the eleven institutions in the Research Corporation survey, seven were above $300,000 per FTE student and only four were below. Private-college endowments have soared along with a soaring stock market. The Research Corporation report asks whether this increased endowment income is helping colleges to provide improved education in the sciences. A major use of endowment income and gift funds is the construction of buildings. Seven of the eleven institutions surveyed had building programs under way or planned for the sciences, and three of the four remaining expected to stress science facilities in upcoming campaigns. In some cases new buildings are designed to support science effectively, but in others, according to Research Corporation Vice President Michael Doyle, "the building is an elegant shell without modern instrumentation or flexibility for future uses." New construction serves to make a campus attractive to prospective students who will bring in the tuition fees that support most of a college's budget.
An "elegant shell" may serve this goal adequately, and science faculty need to become intimately involved in building plans to ensure that a building is well equipped, flexible, and safe (see page 547 regarding safety). There appears to be little correlation between endowment and support for those who carry out research with undergraduates. Expectations regarding hours spent in classrooms and laboratories seem to depend on tradition. Some institutions below the $300,000/FTE line provide teaching credit for time spent with undergraduate research students, while many above it do not. A positive development is that five of the eleven institutions surveyed are raising endowment funds specifically to support summer student-faculty research programs, with campaign goals in the range from $0.5 to $6 million. This is a trend that could profitably be extended to many more colleges, because there is clear evidence that undergraduate research experience is strongly correlated with the success of students who are potential scientists. Endowment funds are being used to support startup packages for new faculty, which are required to attract the best teachers and researchers. From the survey, packages appear to be in the range from $20,000 to $50,000, and there has been a tenfold increase over the past 15 years. Endowment also supports purchases of instruments, where matching funds are required by federal grants. However, it is not always easy to come up with matching funds for big-ticket items like NMRs. Also, there is constant pressure to provide the latest in computer equipment, especially for use in teaching. Computers and other technology seem to become obsolete overnight, and maintaining facilities that will attract students who are more and more computer literate is an ongoing drain on endowment income. Recently, competition for the best students has begun to draw endowment income away from science departments.
In addition to scholarships based on need, merit awards have become de rigueur. There appears to be a trend to offer to match the best scholarship package a really good student has been able to get from a competing institution. The average tuition and fees paid at most institutions are well below the advertised "sticker price", and the difference is being made up from endowment income and gifts. Two thoughts came to me as I read the Research Corporation report. First, private funding agencies, such as the Research Corporation and the Camille and Henry Dreyfus Foundation (which sponsored JCE's Viewpoints series), are uniquely positioned to influence science research and science education in this country. Their reports and activities provide perspectives and ideas that those of us in the trenches might otherwise be too busy to come up with. Second, science departments in undergraduate institutions have considerable control over their destinies. Quoting the report, "small endowments and even substandard facilities do not rule out vigorous science departments, or even necessarily impact morale, if faculty can see that good use is being made of available resources." I would turn this around. If we don't allow external, uncontrollable forces to get us down, and if we work hard at things that will make a difference, we can accomplish a lot, even with only a little money. The most important factor is what we do, and what attitudes and habits of mind we impart to our students. A college or university that is well endowed with human resources provides the best possible venue for learning.

  3. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

Grid computing has a great potential to become a standard cyberinfrastructure for life sciences, which often require high-performance computing and large data handling that exceed the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations, for example grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. By extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  4. Commentary: Attitude Adjustment--Educating PhD Scientist for Business Careers

    ERIC Educational Resources Information Center

    Schuster, Sheldon M.

    2011-01-01

    The PhD graduate from a US research academic institution who has worked 5-7 years to solve a combination of laboratory and computational problems after an in-depth classroom experience is likely superbly trained in at least a subset of the life sciences and the underlying methodology and thought processes required to perform high level research.…

  5. A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge

    DTIC Science & Technology

    2016-07-29

    Science Foundation (NSF), Department of Defense (DOD), National Institute of Standards and Technology (NIST), Intelligence Community (IC) Introduction...multiple Federal agencies: • Intelligent big data sensors that act autonomously and are programmable via the network for increased flexibility, and... intelligence for scientific discovery enabled by rapid extreme-scale data analysis, capable of understanding and making sense of results and thereby

  6. State University of New York Institute of Technology (SUNYIT) Visiting Scholars Program

    DTIC Science & Technology

    2013-05-01

    team members, and build the necessary backend metal interconnections. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED 4 Baek-Young Choi...Cooperative and Opportunistic Mobile Cloud for Energy Efficient Positioning; Department of Computer Science Electrical Engineering, University of...Missouri - Kansas City The fast growing popularity of smartphones and tablets enables us the use of various intelligent mobile applications. As many of

  7. Supporting Regularized Logistic Regression Privately and Efficiently.

    PubMed

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc.
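For reference, the underlying model this record seeks to protect is standard L2-regularized logistic regression. The following is a minimal, non-private baseline sketch in plain NumPy, not the paper's cryptographic protocol; the function name `train_l2_logistic` and the toy data are invented for illustration.

```python
import numpy as np

def train_l2_logistic(X, y, lam=0.1, lr=0.5, iters=2000):
    """Fit L2-regularized logistic regression by batch gradient descent.

    X: (n, d) feature matrix; y: (n,) labels in {0, 1};
    lam: L2 regularization strength; lr: step size."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))     # predicted probabilities
        grad = X.T @ (p - y) / n + lam * w   # logistic-loss gradient + L2 penalty
        w -= lr * grad
    return w

# Toy data: label is 1 when the two features sum to a positive value
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

w = train_l2_logistic(X, y)
preds = (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(float)
accuracy = (preds == y).mean()
```

In a privacy-preserving setting such as the one the abstract describes, each institution would hold its own shard of `X` and `y`, and the gradient aggregation step would be done under cryptographic protection rather than in the clear.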

  8. Supporting Regularized Logistic Regression Privately and Efficiently

    PubMed Central

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc. PMID:27271738

  9. Imagining tomorrow's university in an era of open science.

    PubMed

    Howe, Adina; Howe, Michael; Kaleita, Amy L; Raman, D Raj

    2017-01-01

As part of a recent workshop entitled "Imagining Tomorrow's University", we were asked to visualize the future of universities as research becomes increasingly data- and computation-driven, and to identify a set of principles characterizing pertinent opportunities and obstacles presented by this shift. In order to establish a holistic view, we take a multilevel approach and examine the impact of open science on individual scholars as well as on the university as a whole. At the university level, open science presents a double-edged sword: when well executed, open science can accelerate the rate of scientific inquiry across the institution and beyond; however, haphazard or half-hearted efforts are likely to squander valuable resources, diminish university productivity and prestige, and potentially do more harm than good. We present our perspective on the role of open science at the university.

  10. Photometric analysis in the Kepler Science Operations Center pipeline

    NASA Astrophysics Data System (ADS)

    Twicken, Joseph D.; Clarke, Bruce D.; Bryson, Stephen T.; Tenenbaum, Peter; Wu, Hayley; Jenkins, Jon M.; Girouard, Forrest; Klaus, Todd C.

    2010-07-01

We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center (SOC) Science Processing Pipeline. The primary tasks of this module are to compute the photometric flux and photocenters (centroids) for over 160,000 long cadence (~thirty minute) and 512 short cadence (~one minute) stellar targets from the calibrated pixels in their respective apertures. We discuss science algorithms for long and short cadence PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope Science Institute (MAST) and are made available to the general public in accordance with the NASA/Kepler data release policy.
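The two core operations named above, aperture photometry and flux-weighted centroiding, can be sketched in a few lines of NumPy. This is a minimal illustration of the general technique, not the actual Kepler PA implementation; the function names and the toy 5x5 aperture are invented for this example.

```python
import numpy as np

def aperture_photometry(pixels, background=0.0):
    """Aperture flux: sum of background-subtracted pixel values."""
    return float(np.sum(pixels - background))

def flux_weighted_centroid(pixels):
    """Centroid = flux-weighted mean of the pixel (row, col) coordinates."""
    rows, cols = np.indices(pixels.shape)
    total = pixels.sum()
    return (rows * pixels).sum() / total, (cols * pixels).sum() / total

# Toy 5x5 aperture with a star-like peak at (row=1, col=3)
# surrounded symmetrically by four fainter pixels.
img = np.zeros((5, 5))
img[1, 3] = 10.0
img[1, 2] = img[1, 4] = img[0, 3] = img[2, 3] = 2.0

r, c = flux_weighted_centroid(img)        # lands on the peak: (1.0, 3.0)
flux = aperture_photometry(img)           # total flux: 18.0
```

Because the fainter pixels are placed symmetrically around the peak, the flux-weighted centroid coincides with the peak position; an asymmetric flux distribution would pull the centroid off-center, which is exactly the sensitivity the pipeline's centroid time series exploit.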

  11. Bioengineering and Bioinformatics Summer Institutes: Meeting Modern Challenges in Undergraduate Summer Research

    PubMed Central

    Dong, Cheng; Snyder, Alan J.; Jones, A. Daniel; Sheets, Erin D.

    2008-01-01

    Summer undergraduate research programs in science and engineering facilitate research progress for faculty and provide a close-ended research experience for students, which can prepare them for careers in industry, medicine, and academia. However, ensuring these outcomes is a challenge when the students arrive ill-prepared for substantive research or if projects are ill-defined or impractical for a typical 10-wk summer. We describe how the new Bioengineering and Bioinformatics Summer Institutes (BBSI), developed in response to a call for proposals by the National Institutes of Health (NIH) and the National Science Foundation (NSF), provide an impetus for the enhancement of traditional undergraduate research experiences with intense didactic training in particular skills and technologies. Such didactic components provide highly focused and qualified students for summer research with the goal of ensuring increased student satisfaction with research and mentor satisfaction with student productivity. As an example, we focus on our experiences with the Penn State Biomaterials and Bionanotechnology Summer Institute (PSU-BBSI), which trains undergraduates in core technologies in surface characterization, computational modeling, cell biology, and fabrication to prepare them for student-centered research projects in the role of materials in guiding cell biology. PMID:18316807

  12. Human-based approaches to pharmacology and cardiology: an interdisciplinary and intersectorial workshop.

    PubMed

    Rodriguez, Blanca; Carusi, Annamaria; Abi-Gerges, Najah; Ariga, Rina; Britton, Oliver; Bub, Gil; Bueno-Orovio, Alfonso; Burton, Rebecca A B; Carapella, Valentina; Cardone-Noott, Louie; Daniels, Matthew J; Davies, Mark R; Dutta, Sara; Ghetti, Andre; Grau, Vicente; Harmer, Stephen; Kopljar, Ivan; Lambiase, Pier; Lu, Hua Rong; Lyon, Aurore; Minchole, Ana; Muszkiewicz, Anna; Oster, Julien; Paci, Michelangelo; Passini, Elisa; Severi, Stefano; Taggart, Peter; Tinker, Andy; Valentin, Jean-Pierre; Varro, Andras; Wallman, Mikael; Zhou, Xin

    2016-09-01

    Both biomedical research and clinical practice rely on complex datasets for the physiological and genetic characterization of human hearts in health and disease. Given the complexity and variety of approaches and recordings, there is now growing recognition of the need to embed computational methods in cardiovascular medicine and science for analysis, integration and prediction. This paper describes a Workshop on Computational Cardiovascular Science that created an international, interdisciplinary and inter-sectorial forum to define the next steps for a human-based approach to disease supported by computational methodologies. The main ideas highlighted were (i) a shift towards human-based methodologies, spurred by advances in new in silico, in vivo, in vitro, and ex vivo techniques and the increasing acknowledgement of the limitations of animal models. (ii) Computational approaches complement, expand, bridge, and integrate in vitro, in vivo, and ex vivo experimental and clinical data and methods, and as such they are an integral part of human-based methodologies in pharmacology and medicine. (iii) The effective implementation of multi- and interdisciplinary approaches, teams, and training combining and integrating computational methods with experimental and clinical approaches across academia, industry, and healthcare settings is a priority. (iv) The human-based cross-disciplinary approach requires experts in specific methodologies and domains, who also have the capacity to communicate and collaborate across disciplines and cross-sector environments. (v) This new translational domain for human-based cardiology and pharmacology requires new partnerships supported financially and institutionally across sectors. Institutional, organizational, and social barriers must be identified, understood and overcome in each specific setting. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.

  13. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  14. Extending Landauer's bound from bit erasure to arbitrary computation

    NASA Astrophysics Data System (ADS)

    Wolpert, David

The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of the thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can be either infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
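One consistent reading of result (i) above can be restated schematically as follows; the notation is ours (K_U for Kolmogorov complexity and m_U for algorithmic probability relative to the UTM U), work is measured in units of k_B T ln 2, and the exact form should be checked against the full paper.

```latex
% Schematic restatement of result (i), in units of k_B T \ln 2:
W_{\min}(X \to Y) \;=\; K_U(Y) \;-\; \log_2 \frac{1}{m_U(Y)}
               \;=\; K_U(Y) + \log_2 m_U(Y),
% where K_U(Y) is the Kolmogorov complexity of the output Y and
% m_U(Y) is its algorithmic probability under the UTM U.
```

By Levin's coding theorem, K_U(Y) and log_2(1/m_U(Y)) agree up to an additive constant that depends only on U, so this quantity is bounded above by a constant independent of Y, consistent with the finite, output-independent upper bound stated in the abstract.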

  15. 2013 Progress Report -- DOE Joint Genome Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-11-01

    In October 2012, we introduced a 10-Year Strategic Vision [http://bit.ly/JGI-Vision] for the Institute. A central focus of this Strategic Vision is to bridge the gap between sequenced genomes and an understanding of biological functions at the organism and ecosystem level. This involves the continued massive-scale generation of sequence data, complemented by orthogonal new capabilities to functionally annotate these large sequence data sets. Our Strategic Vision lays out a path to guide our decisions and ensure that the evolving set of experimental and computational capabilities available to DOE JGI users will continue to enable groundbreaking science.

  16. A Boon for the Architect Engineer

    NASA Technical Reports Server (NTRS)

    1992-01-01

Langley Research Center's need for an improved construction specification system led to an automated system called SPECSINTACT. A catalog of specifications, the system enables designers to retrieve relevant sections from computer storage and modify them as needed. SPECSINTACT has also been adopted by other government agencies. The system is an integral part of the Construction Criteria Base (CCB), a single disc containing design and construction information for 10 government agencies, including the American Institute of Architects' MASTERSPEC. CCB employs CD-ROM technology and is available from the National Institute of Building Sciences. Users report substantial savings in time and productivity.

  17. New theory insights and experimental opportunities in Majorana wires

    NASA Astrophysics Data System (ADS)

    Alicea, Jason

    Over the past decade, the quest for Majorana zero modes in exotic superconductors has undergone transformational advances on the design, fabrication, detection, and characterization fronts. The field now seems primed for a new era aimed at Majorana control and readout. This talk will survey intertwined theory and experimental developments that illuminate a practical path toward these higher-level goals. In particular, I will highlight near-term opportunities for testing fundamentals of topological quantum computing and longer-term strategies for building scalable hardware. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.

  18. Research Projects, Technical Reports and Publications

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1996-01-01

The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post-doctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: advanced methods for scientific computing, and high-performance networks. During this reporting period, Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code I on 29-30 July 1996.
The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists, and three open discussion sessions. There were approximately fifty participants. A proceedings is being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.

  19. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.

  20. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    NASA Astrophysics Data System (ADS)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  1. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  2. Students Perception towards the Implementation of Computer Graphics Technology in Class via Unified Theory of Acceptance and Use of Technology (UTAUT) Model

    NASA Astrophysics Data System (ADS)

    Binti Shamsuddin, Norsila

    Technology advancement and development in a higher learning institution gives students a chance to be motivated to learn the information technology areas in depth. Students should take hold of the opportunity to blend their skills with these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor's students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied against the dependent factor.

  3. Science at the interstices: an evolution in the academy.

    PubMed

    Balser, Jeffrey R; Baruchin, Andrea

    2008-09-01

    Biomedical science is at an evolutionary turning point. Many of the rate-limiting steps to realizing the next generation of personalized, highly targeted diagnostics and therapeutics rest at the interstices between biomedical science and the classic, university-based disciplines, such as physics, mathematics, computational science, engineering, social sciences, business, and law. Institutes, centers, or other entities created to foster interdisciplinary science are rapidly forming to tackle these formidable challenges, but they are plagued with substantive barriers, born of traditions, processes, and culture, which impede scientific progress and endanger success. Without a more seamless interdisciplinary framework, academic health centers will struggle to move transformative advances in technology into the foundation of biomedical science, and the equally challenging advancement of models that effectively integrate new molecular diagnostics and therapies into the business and social fabric of our population will be similarly hampered. At the same time, excess attention on rankings tied to competition for National Institutes of Health and other federal funds adversely encourages academic medical centers (AMCs) and universities to hoard, rather than share, resources effectively and efficiently. To fully realize their discovery potential, AMCs must consider a substantive realignment relative to one another, as well as with their associated universities, as the academy looks toward innovative approaches to provide a more supportive foundation for the emergent biomedical research enterprise. The authors discuss potential models that could serve to lower barriers to interdisciplinary science, promoting a new synergy between AMCs and their parent universities.

  4. The role of gender on academic performance in STEM-related disciplines: Data from a tertiary institution.

    PubMed

    John, Temitope M; Badejo, Joke A; Popoola, Segun I; Omole, David O; Odukoya, Jonathan A; Ajayi, Priscilla O; Aboyade, Mary; Atayero, Aderemi A

    2018-06-01

    This data article presents data on the academic performance of undergraduate students in Science, Technology, Engineering and Mathematics (STEM) disciplines at Covenant University, Nigeria. The data show the academic performance of male and female students who graduated from 2010 to 2014. The total population of samples in the observation is 3046 undergraduates mined from Biochemistry (BCH), Building technology (BLD), Computer Engineering (CEN), Chemical Engineering (CHE), Industrial Chemistry (CHM), Computer Science (CIS), Civil Engineering (CVE), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mathematics (MAT), Microbiology (MCB), Mechanical Engineering (MCE), Management and Information System (MIS), Petroleum Engineering (PET), Industrial Physics-Electronics and IT Applications (PHYE), Industrial Physics-Applied Geophysics (PHYG) and Industrial Physics-Renewable Energy (PHYR). The detailed dataset is made available in the form of a Microsoft Excel spreadsheet in the supplementary material of this article.

  5. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
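As a sketch of the kind of routine a package like DPS automates, the following pure-Python example computes a one-way ANOVA F statistic for a small, invented insect-count experiment. The data, group labels, and layout are illustrative only; DPS's own interface is not shown here.

```python
# One-way ANOVA F statistic, the kind of routine a statistics package automates.
# The counts below are made-up values for three hypothetical treatments.
groups = [
    [12.0, 15.0, 14.0, 11.0],   # treatment A
    [22.0, 25.0, 24.0, 21.0],   # treatment B
    [12.0, 13.0, 15.0, 14.0],   # treatment C
]

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over a list of samples."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (treatment effect)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (residual error)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

f_stat = one_way_anova_f(groups)
print(round(f_stat, 2))
```

A large F here indicates that between-treatment variation dominates within-treatment variation; in practice one would compare it against the F distribution with (k-1, n-k) degrees of freedom.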

  6. ZTF Undergraduate Astronomy Institute at Caltech and Pomona College

    NASA Astrophysics Data System (ADS)

    Penprase, Bryan Edward; Bellm, Eric Christopher

    2017-01-01

    From the new Zwicky Transient Facility (ZTF), an NSF-funded project based at Caltech, comes a new initiative for undergraduate research known as the Summer Undergraduate Astronomy Institute. The Institute brings together 15-20 students from across the world for an immersive experience in astronomy techniques before they begin their summer research projects. Most of the students are based at Caltech in its SURF program, but the Institute also includes a large cohort of students enrolled in research internships at Pomona College in nearby Claremont, CA. The program is intended to introduce students to research techniques in astronomy, to laboratory and computational technologies, and to observational astronomy. Since many of the students are computer science or physics majors with little prior astronomy experience, this immersive experience has been extremely helpful in enabling students to learn the terminologies, techniques, and technologies of astronomy. Field trips to the Mount Wilson and Palomar telescopes deepen their knowledge and excitement about astronomy. Lectures about astronomical research from Caltech staff scientists and graduate students also provide context for the student research. Perhaps more importantly, the creation of a cohort of like-minded students, and the chance to reflect on careers in astronomy and research, give these students opportunities to consider themselves as future research scientists and help them immensely as they move forward in their careers. We discuss some of the social and intercultural aspects of the experience as well, as our cohorts typically include international students from many countries and several students from under-represented groups in science.

  7. Determining the Effects of Pre-College STEM Contexts on STEM Major Choices in 4-Year Postsecondary Institutions Using Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Lee, Ahlam

    2013-01-01

    Many STEM studies have focused on traditional learning contexts, such as math- and science-related learning factors, as pre-college learning predictors for STEM major choices in colleges. Few studies have considered a progressive learning activity embedded within STEM contexts. This study chose computer-based learning activities in K-12 math…

  8. 1993 Annual report on scientific programs: A broad research program on the sciences of complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-12-31

    This report provides a summary of many of the research projects completed by the Santa Fe Institute (SFI) during 1993. These research efforts continue to focus on two general areas: the study of, and search for, underlying scientific principles governing complex adaptive systems, and the exploration of new theories of computation that incorporate natural mechanisms of adaptation (mutation, genetics, evolution).
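The "natural mechanisms of adaptation (mutation, genetics, evolution)" mentioned above can be sketched with a minimal genetic algorithm. This toy Python example is not drawn from any SFI project; the population size, rates, and one-max fitness function are all illustrative assumptions.

```python
import random

# Minimal genetic algorithm: selection, crossover, and mutation evolve
# bit strings toward higher fitness (here, the count of 1 bits).
random.seed(42)  # fixed seed so the run is reproducible

TARGET_LEN = 20

def fitness(bits):
    """One-max fitness: number of 1 bits in the string."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Flip each bit independently with the given probability."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

def crossover(a, b):
    """Single-point crossover of two parent bit strings."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(generations=60, pop_size=30):
    pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Because the fittest half of each generation survives unmutated, the best fitness never decreases; richer schemes replace truncation selection with roulette-wheel or tournament selection.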

  9. Triangle Computer Science Distinguished Lecture Series

    DTIC Science & Technology

    2018-01-30

    ...the great objects of scientific inquiry - the cell, the brain, the market - as well as in the models developed by scientists over the centuries for studying them. ...in principle, secure system operation can be achieved. Massive-Scale Streaming Analytics: David Bader, Georgia Institute of Technology (telecast).

  10. Systems engineering technology for networks

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The report summarizes research pursued within the Systems Engineering Design Laboratory at Virginia Polytechnic Institute and State University between May 16, 1993 and January 31, 1994. The project was proposed in cooperation with the Computational Science and Engineering Research Center at Howard University. Its purpose was to investigate emerging systems engineering tools and their applicability in analyzing the NASA Network Control Center (NCC) on the basis of metrics and measures.

  11. Adaptive Mesh Experiments for Hyperbolic Partial Differential Equations

    DTIC Science & Technology

    1990-02-01

    Joseph E. Flaherty, Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY 12180-3590, and U.S. Army ARDEC, Close Combat Armaments Center, Benet Laboratories, Watervliet, NY 12189-4050. Report date: February 1990.

  12. Evaluating Implementations of Service Oriented Architecture for Sensor Network via Simulation

    DTIC Science & Technology

    2011-04-01

    Subject: COMPUTER SCIENCE. Approved: Boleslaw Szymanski, Thesis Adviser, Rensselaer Polytechnic Institute, Troy, New York, April 2011 (for graduation May 2011). ...simulation supports distributed and centralized composition with a type hierarchy and multiple-service statically-located nodes in a 2-dimensional space. The second simulation...

  13. Special data base of Informational - Computational System 'INM RAS - Black Sea' for solving inverse and data assimilation problems

    NASA Astrophysics Data System (ADS)

    Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly

    2014-05-01

    Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs modern results from different disciplines and recent developments in mathematical modeling; the theory of adjoint equations and optimal control; inverse problems; numerical methods theory; numerical algebra; and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results of the Special data base development for the ICS "INM RAS - Black Sea" are presented: the input information for the ICS is discussed, some special data processing procedures are described, and forecast results using the ICS "INM RAS - Black Sea" with operational observation data assimilation are presented. This study was supported by the Russian Foundation for Basic Research (project No. 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model"). References: 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of the Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 95-111. 4. V.I. Agoshkov, M.B. Assovsky, S.V. Giniatulin, N.B. Zakharova, G.V. Kuimov, E.I. Parmuzin, V.V. Fomin, Informational Computational system of variational assimilation of observation data "INM RAS - Black Sea". Ecological Safety of Coastal and Shelf Zones and Complex Use of Shelf Resources: Collection of scientific works, Issue 26, Vol. 2. National Academy of Sciences of Ukraine, Marine Hydrophysical Institute, Sevastopol, 2012, pp. 352-360. (In Russian)
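The variational assimilation idea behind such systems can be illustrated in the scalar case: the analysis blends a model background with an observation, weighted by their error variances. The sketch below uses invented numbers and is in no way drawn from the Black Sea system itself.

```python
# Scalar 3D-Var / optimal interpolation: the analysis xa minimizes
#   J(x) = (x - xb)^2 / (2*sb2) + (y - x)^2 / (2*so2)
# where xb is the background (model forecast), y the observation,
# and sb2, so2 their respective error variances.

def scalar_analysis(xb, sb2, y, so2):
    """Minimizer of the scalar variational cost function."""
    k = sb2 / (sb2 + so2)        # Kalman-like gain: weight given to the obs
    return xb + k * (y - xb)

# Illustrative numbers: a background SST forecast of 14.0 C (variance 1.0)
# and an observed 15.0 C (variance 0.25). The analysis lands closer to the
# more accurate observation.
xa = scalar_analysis(14.0, 1.0, 15.0, 0.25)
print(round(xa, 2))
```

Real systems like the one described solve the multidimensional analogue, with covariance matrices in place of scalar variances and the adjoint model supplying gradients of J.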

  14. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

    The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current "jump start" user program that utilizes existing ORNL/CNMS facilities.

  15. Summer Institute for High School Teachers

    NASA Astrophysics Data System (ADS)

    Maheswaranathan, Ponn; Calloway, Cliff

    2008-04-01

    We have again conducted a summer institute for high school teachers in South Carolina at Winthrop University. The target audience was 9th grade physical science teachers in schools within a 50-mile radius of Winthrop. We developed a graduate-level physics professional development course covering selected topics from the physics and chemistry content areas of the South Carolina Science Standards. Delivery of the material included traditional lectures and the following innovative approaches in science teaching: hands-on experiments, group activities, computer-based data collection, group discussions, and presentations. Two master teachers assisted us during the delivery of the course, which took place June 20-29, 2007 using Winthrop facilities. Requested funds were used for the following: salary for us and the master teachers, the contract course fee, some of the participants' room and board, startup equipment for all the teachers, and indirect costs to Winthrop University. Startup equipment included Pasco's stand-alone and portable Xplorer GLX interface and sensors (temperature, voltage, pH, pressure, motion, and sound). What we learned and ideas for continued K-12 teacher preparation initiatives will be presented.

  16. Photometric Analysis in the Kepler Science Operations Center Pipeline

    NASA Technical Reports Server (NTRS)

    Twicken, Joseph D.; Clarke, Bruce D.; Bryson, Stephen T.; Tenenbaum, Peter; Wu, Hayley; Jenkins, Jon M.; Girouard, Forrest; Klaus, Todd C.

    2010-01-01

    We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center (SOC) pipeline. The primary tasks of this module are to compute the photometric flux and photocenters (centroids) for over 160,000 long cadence (thirty minute) and 512 short cadence (one minute) stellar targets from the calibrated pixels in their respective apertures. We discuss the science algorithms for long and short cadence PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope [Science Institute] (MAST) and are made available to the general public in accordance with the NASA/Kepler data release policy.
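Three of the PA steps described above, background removal, simple aperture photometry, and flux-weighted centroiding, can be illustrated on a made-up 5x5 pixel stamp. The real pipeline operates on calibrated Kepler pixels and propagates uncertainties end to end, which this sketch omits.

```python
# Toy aperture photometry and flux-weighted centroiding over a pixel stamp.
# The 5x5 "calibrated pixel" values and background level are invented.
stamp = [
    [1.0, 1.0, 1.0, 1.0, 1.0],
    [1.0, 2.0, 4.0, 2.0, 1.0],
    [1.0, 4.0, 9.0, 4.0, 1.0],
    [1.0, 2.0, 4.0, 2.0, 1.0],
    [1.0, 1.0, 1.0, 1.0, 1.0],
]
background = 1.0  # estimated per-pixel background level

def aperture_flux_and_centroid(pixels, bg):
    """Sum background-subtracted flux and compute its flux-weighted centroid."""
    flux = 0.0
    row_moment = 0.0
    col_moment = 0.0
    for r, row in enumerate(pixels):
        for c, value in enumerate(row):
            v = value - bg          # background removal
            flux += v               # aperture photometry: sum over the aperture
            row_moment += r * v     # first moments, weighted by flux
            col_moment += c * v
    return flux, (row_moment / flux, col_moment / flux)

flux, (cr, cc) = aperture_flux_and_centroid(stamp, background)
print(flux, cr, cc)
```

For this symmetric stamp the centroid falls on the central pixel (2, 2); a real star offset within its aperture would pull the moments toward the true photocenter.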

  17. Opportunities and challenges for the life sciences community.

    PubMed

    Kolker, Eugene; Stewart, Elizabeth; Ozdemir, Vural

    2012-03-01

    Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19-20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16-17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org ) was formed to become a Digital Commons for the life sciences community.

  18. Opportunities and Challenges for the Life Sciences Community

    PubMed Central

    Stewart, Elizabeth; Ozdemir, Vural

    2012-01-01

    Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19–20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16–17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org) was formed to become a Digital Commons for the life sciences community. PMID:22401659

  19. The role of a clinically based computer department of instruction in a school of medicine.

    PubMed

    Yamamoto, W S

    1991-10-01

    The evolution of activities and educational directions of a department of instruction in medical computer technology in a school of medicine are reviewed. During the 18 years covered, the society at large has undergone marked change in availability and use of computation in every aspect of medical care. It is argued that a department of instruction should be clinical and develop revenue sources based on patient care, perform technical services for the institution with a decentralized structure, and perform both health services and scientific research. Distinction should be drawn between utilization of computing in medical specialties, library function, and instruction in computer science. The last is the proper arena for the academic content of instruction and is best labelled as the philosophical basis of medical knowledge, in particular, its epistemology. Contemporary pressures for teaching introductory computer skills are probably temporary.

  20. MICA: The Meta-Institute for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    McMillan, Stephen L. W.; Djorgovski, S. G.; Hut, P.; Vesperini, E.; Knop, R.; Portegies Zwart, S.

    2009-05-01

    We describe MICA, the Meta Institute for Computational Astrophysics, the first professional scientific and educational, non-profit organization based in virtual worlds [VWs]. Most MICA activities are currently conducted in Second Life, arguably the most popular and best developed VW; we plan to expand our presence into other VWs as those venues evolve. The goals of MICA include (1) exploration, development and promotion of VWs and virtual reality [VR] technologies for professional research in astronomy and related fields; (2) development of novel networking venues and mechanisms for virtual scientific communication and interaction, including professional meetings, visualization, and telecollaboration; (3) use of VWs and VR technologies for education and public outreach; and (4) exchange of ideas and joint efforts with other scientific disciplines in promoting these goals for science and scholarship in general. We present representative examples of MICA activities and achievements, and outline plans for expansion of the organization. For more information on MICA, please visit http://mica-vw.org.

  1. Remote Earth Sciences data collection using ACTS

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1992-01-01

    Given the focus on global change and the attendant scope of such research, we anticipate significant growth of requirements for investigator interaction, processing system capabilities, and availability of data sets. The increased complexity of global processes requires interdisciplinary teams to address them; the investigators will need to interact on a regular basis; however, it is unlikely that a single institution will house sufficient investigators with the required breadth of skills. The complexity of the computations may also require resources beyond those located within a single institution; this lack of sufficient computational resources leads to a distributed system located at geographically dispersed institutions. Finally, the combination of long term data sets like the Pathfinder datasets and the data to be gathered by new generations of satellites such as SeaWiFS and MODIS-N yields extraordinarily large amounts of data. All of these factors combine to increase demands on the communications facilities available; the demands are generating requirements for highly flexible, high capacity networks. We have been examining the applicability of the Advanced Communications Technology Satellite (ACTS) to address the scientific, computational, and, primarily, communications questions resulting from global change research. As part of this effort three scenarios for oceanographic use of ACTS have been developed; a full discussion of this is contained in Appendix B.

  2. eHealth research from the user's perspective.

    PubMed

    Hesse, Bradford W; Shneiderman, Ben

    2007-05-01

    The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.

  3. Transformative Undergraduate Science Courses for Non-Majors at a Historically Black Institution and at a Primarily White Institution

    ERIC Educational Resources Information Center

    Marbach-Ad, Gili; McGinnis, J. Randy; Pease, Rebecca; Dai, Amy; Benson, Spencer; Dantley, Scott Jackson

    2010-01-01

    We investigated curricular and pedagogical innovations in undergraduate science courses for non-science majors at a Historically Black Institution (HBI) and a Primarily White Institution (PWI). The aims were to improve students' understanding of science, increase their enthusiasm towards science by connecting their prior experience and interest to…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glotzer, S. C.; Kim, S.; Cummings, P. T.

This WTEC panel report assesses the international research and development activities in the field of Simulation-Based Engineering and Science (SBE&S). SBE&S involves the use of computer modeling and simulation to solve mathematical formulations of physical models of engineered and natural systems. SBE&S today has reached a level of predictive capability that it now firmly complements the traditional pillars of theory and experimentation/observation. As a result, computer simulation is more pervasive today, and having more impact, than at any other time in human history. Many critical technologies, including those to develop new energy sources and to shift the cost-benefit factors in healthcare, are on the horizon that cannot be understood, developed, or utilized without simulation. A panel of experts reviewed and assessed the state of the art in SBE&S as well as levels of activity overseas in the broad thematic areas of life sciences and medicine, materials, and energy and sustainability; and in the crosscutting issues of next-generation hardware and algorithms; software development; engineering simulations; validation, verification, and uncertainty quantification; multiscale modeling and simulation; and SBE&S education. The panel hosted a U.S. baseline workshop, conducted a bibliometric analysis, consulted numerous experts and reports, and visited 59 institutions and companies throughout East Asia and Western Europe to explore the active research projects in those institutions, the computational infrastructure used for the projects, the funding schemes that enable the research, the collaborative interactions among universities, national laboratories, and corporate research centers, and workforce needs and development for SBE&S.

  5. NASA Science Institutes Plan. Report of the NASA Science Institutes Team: Final Publication (Incorporating Public Comments and Revisions)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This NASA Science Institute Plan has been produced in response to direction from the NASA Administrator for the benefit of NASA Senior Management, science enterprise leaders, and Center Directors. It is intended to provide a conceptual framework for organizing and planning the conduct of science in support of NASA's mission through the creation of a limited number of science Institutes. This plan is the product of the NASA Science Institute Planning Integration Team (see Figure A). The team worked intensively over a three-month period to review proposed Institutes and produce findings for NASA senior management. The team's activities included visits to current NASA Institutes and associated Centers, as well as approximately a dozen non-NASA research Institutes. In addition to producing this plan, the team published a "Benchmarks" report. The Benchmarks report provides a basis for comparing NASA's proposed activities with those sponsored by other national science agencies, and identifies best practices to be considered in the establishment of NASA Science Institutes. Throughout the team's activities, a Board of Advisors comprised of senior NASA officials (augmented as necessary with other government employees) provided overall advice and counsel.

  6. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Tu, Eugene L.; Van Dalsem, William R.

    2006-01-01

Two years ago, NASA was on the verge of dramatically increasing its HEC capability and capacity. With the 10,240-processor supercomputer, Columbia, now in production for 18 months, HEC has an even greater impact within the Agency and extending to partner institutions. Advanced science and engineering simulations in space exploration, shuttle operations, Earth sciences, and fundamental aeronautics research are occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. This talk describes how the integrated production environment fostered at the NASA Advanced Supercomputing (NAS) facility at Ames Research Center is accelerating scientific discovery, achieving parametric analyses of multiple scenarios, and enhancing safety for NASA missions. We focus on Columbia's impact on two key engineering and science disciplines: aerospace and climate. We also discuss future mission challenges and plans for NASA's next-generation HEC environment.

  7. Using the Principles of BIO2010 to Develop an Introductory, Interdisciplinary Course for Biology Students

    PubMed Central

    Adams, Peter; Goos, Merrilyn

    2010-01-01

    Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate program. Inspired by the National Research Council's BIO2010 report, a new interdisciplinary first-year course (SCIE1000) was created, incorporating mathematics and computer programming in the context of modern science. In this study, the perceptions of biological science students enrolled in SCIE1000 in 2008 and 2009 are measured. Analysis indicates that, as a result of taking SCIE1000, biological science students gained a positive appreciation of the importance of mathematics in their discipline. However, the data revealed that SCIE1000 did not contribute positively to gains in appreciation for computing and only slightly influenced students' motivation to enroll in upper-level quantitative-based courses. Further comparisons between 2008 and 2009 demonstrated the positive effect of using genuine, real-world contexts to enhance student perceptions toward the relevance of mathematics. The results support the recommendation from BIO2010 that mathematics should be introduced to biology students in first-year courses using real-world examples, while challenging the benefits of introducing programming in first-year courses. PMID:20810961

  8. Public library computer training for older adults to access high-quality Internet health information

    PubMed Central

    Xie, Bo; Bugg, Julie M.

    2010-01-01

    An innovative experiment to develop and evaluate a public library computer training program to teach older adults to access and use high-quality Internet health information involved a productive collaboration among public libraries, the National Institute on Aging and the National Library of Medicine of the National Institutes of Health (NIH), and a Library and Information Science (LIS) academic program at a state university. One hundred and thirty-one older adults aged 54–89 participated in the study between September 2007 and July 2008. Key findings include: a) participants had overwhelmingly positive perceptions of the training program; b) after learning about two NIH websites (http://nihseniorhealth.gov and http://medlineplus.gov) from the training, many participants started using these online resources to find high quality health and medical information and, further, to guide their decision-making regarding a health- or medically-related matter; and c) computer anxiety significantly decreased (p < .001) while computer interest and efficacy significantly increased (p = .001 and p < .001, respectively) from pre- to post-training, suggesting statistically significant improvements in computer attitudes between pre- and post-training. The findings have implications for public libraries, LIS academic programs, and other organizations interested in providing similar programs in their communities. PMID:20161649
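The pre/post attitude comparison described above is a standard paired design. A minimal sketch of how such a test statistic is computed, using fabricated anxiety scores for illustration (not the study's data):

```python
# Paired t-test sketch for pre/post training scores (hypothetical data).
from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    """Paired t statistic: mean of differences over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Illustrative (made-up) computer-anxiety scores for five trainees,
# where lower post-training scores indicate reduced anxiety:
pre  = [4.2, 3.8, 4.5, 4.0, 3.9]
post = [3.1, 3.0, 3.6, 3.2, 3.4]
t = paired_t(pre, post)  # strongly negative: anxiety decreased
```

A negative t with large magnitude here corresponds to the significant pre-to-post decrease in anxiety the study reports; the p-values in the abstract come from comparing such statistics to a t distribution with n - 1 degrees of freedom.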

  9. SciDAC's Earth System Grid Center for Enabling Technologies Semiannual Progress Report October 1, 2010 through March 31, 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2011-04-02

This report summarizes work carried out by the Earth System Grid Center for Enabling Technologies (ESG-CET) from October 1, 2010 through March 31, 2011. It discusses ESG-CET highlights for the reporting period, overall progress, period goals, and collaborations, and lists papers and presentations. To learn more about our project and to find previous reports, please visit the ESG-CET Web sites: http://esg-pcmdi.llnl.gov/ and/or https://wiki.ucar.edu/display/esgcet/Home. This report will be forwarded to managers in the Department of Energy (DOE) Scientific Discovery through Advanced Computing (SciDAC) program and the Office of Biological and Environmental Research (OBER), as well as national and international collaborators and stakeholders (e.g., those involved in the Coupled Model Intercomparison Project, phase 5 (CMIP5) for the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5); the Community Earth System Model (CESM); the Climate Science Computational End Station (CCES); SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science; the North American Regional Climate Change Assessment Program (NARCCAP); the Atmospheric Radiation Measurement (ARM) program; the National Aeronautics and Space Administration (NASA); the National Oceanic and Atmospheric Administration (NOAA)), and also to researchers working on a variety of other climate model and observation evaluation activities. The ESG-CET executive committee consists of Dean N. Williams, Lawrence Livermore National Laboratory (LLNL); Ian Foster, Argonne National Laboratory (ANL); and Don Middleton, National Center for Atmospheric Research (NCAR).
The ESG-CET team is a group of researchers and scientists with diverse domain knowledge, whose home institutions include eight laboratories and two universities: ANL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), LLNL, NASA/Jet Propulsion Laboratory (JPL), NCAR, Oak Ridge National Laboratory (ORNL), Pacific Marine Environmental Laboratory (PMEL)/NOAA, Rensselaer Polytechnic Institute (RPI), and University of Southern California, Information Sciences Institute (USC/ISI). All ESG-CET work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Through the ESG project, the ESG-CET team has developed and delivered a production environment for climate data from multiple climate model sources (e.g., CMIP (IPCC), CESM, ocean model data (e.g., Parallel Ocean Program), observation data (e.g., Atmospheric Infrared Sounder, Microwave Limb Sounder), and analysis and visualization tools) that serves a worldwide climate research community. Data holdings are distributed across multiple sites including LANL, LBNL, LLNL, NCAR, and ORNL, as well as unfunded partner sites such as the Australian National University (ANU) National Computational Infrastructure (NCI), the British Atmospheric Data Center (BADC), the Geophysical Fluid Dynamics Laboratory/NOAA, the Max Planck Institute for Meteorology (MPI-M), the German Climate Computing Centre (DKRZ), and NASA/JPL. As we transition from development activities to production and operations, the ESG-CET team is tasked with making data available to all users who want to understand it, process it, extract value from it, visualize it, and/or communicate it to others. This ongoing effort is extremely large and complex, but it will be incredibly valuable for building 'science gateways' to critical climate resources (such as CESM, CMIP5, ARM, NARCCAP, Atmospheric Infrared Sounder (AIRS), etc.) for processing the next IPCC assessment report. Continued ESG progress will result in a production-scale system that will empower scientists to attempt new and exciting data exchanges, which could ultimately lead to breakthrough climate science discoveries.

  10. Optical character recognition: an illustrated guide to the frontier

    NASA Astrophysics Data System (ADS)

    Nagy, George; Nartker, Thomas A.; Rice, Stephen V.

    1999-12-01

    We offer a perspective on the performance of current OCR systems by illustrating and explaining actual OCR errors made by three commercial devices. After discussing briefly the character recognition abilities of humans and computers, we present illustrated examples of recognition errors. The top level of our taxonomy of the causes of errors consists of Imaging Defects, Similar Symbols, Punctuation, and Typography. The analysis of a series of 'snippets' from this perspective provides insight into the strengths and weaknesses of current systems, and perhaps a road map to future progress. The examples were drawn from the large-scale tests conducted by the authors at the Information Science Research Institute of the University of Nevada, Las Vegas. By way of conclusion, we point to possible approaches for improving the accuracy of today's systems. The talk is based on our eponymous monograph, recently published in The Kluwer International Series in Engineering and Computer Science, Kluwer Academic Publishers, 1999.
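OCR benchmarks of the kind run at the Information Science Research Institute typically score character accuracy via the edit distance between ground truth and recognizer output. A hedged sketch of that metric (not the ISRI code; the example strings illustrate the classic "m" read as "rn" confusion from the Similar Symbols category):

```python
# Character accuracy via Levenshtein edit distance (illustrative sketch).

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def char_accuracy(truth: str, ocr: str) -> float:
    """Fraction of ground-truth characters correctly reproduced."""
    n = len(truth)
    return (n - edit_distance(truth, ocr)) / n

# 'm' misrecognized as 'rn': one substitution plus one insertion.
acc = char_accuracy("imaging defects", "irnaging defects")
```

Against a 15-character ground truth, that single "m"/"rn" confusion costs two edit operations, so accuracy drops to 13/15; large-scale tests aggregate this count over millions of characters.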

  11. Yearning to Give Back: Searching for Social Purpose in Computer Science and Engineering.

    PubMed

    Carrigan, Coleen M

    2017-01-01

Computing is highly segregated and stratified by gender. While there is abundant scholarship investigating this problem, emerging evidence suggests that a hierarchy of value exists between the social and technical dimensions of Computer Science and Engineering (CSE) and this plays a role in the underrepresentation of women in the field. This ethnographic study of women's experiences in computing offers evidence of a systemic preference for the technical dimensions of computing over the social and a correlation between gender and social aspirations. Additionally, it suggests there is a gap between the exaltation of computing's social contributions and the realities of them. My participants expressed a yearning to contribute to the collective well-being of society using their computing skills. I trace moments of rupture in my participants' stories, moments when they felt these aspirations were in conflict with the cultural values in their organizations. I interpret these ruptures within a consideration of yearning, a need my participants had to contribute meaningfully to society that remained unfulfilled. The yearning to align one's altruistic values with one's career aspirations in CSE illuminates an area for greater exploration on the path to realizing gender equity in computing. I argue that before a case can be made that careers in computing do indeed contribute to social and civil engagements, we must first address the meaning of the social within the values, ideologies and practices of CSE institutions and then develop ways to measure and evaluate the field's contributions to society.

  12. Yearning to Give Back: Searching for Social Purpose in Computer Science and Engineering

    PubMed Central

    Carrigan, Coleen M.

    2017-01-01

Computing is highly segregated and stratified by gender. While there is abundant scholarship investigating this problem, emerging evidence suggests that a hierarchy of value exists between the social and technical dimensions of Computer Science and Engineering (CSE) and this plays a role in the underrepresentation of women in the field. This ethnographic study of women's experiences in computing offers evidence of a systemic preference for the technical dimensions of computing over the social and a correlation between gender and social aspirations. Additionally, it suggests there is a gap between the exaltation of computing's social contributions and the realities of them. My participants expressed a yearning to contribute to the collective well-being of society using their computing skills. I trace moments of rupture in my participants' stories, moments when they felt these aspirations were in conflict with the cultural values in their organizations. I interpret these ruptures within a consideration of yearning, a need my participants had to contribute meaningfully to society that remained unfulfilled. The yearning to align one's altruistic values with one's career aspirations in CSE illuminates an area for greater exploration on the path to realizing gender equity in computing. I argue that before a case can be made that careers in computing do indeed contribute to social and civil engagements, we must first address the meaning of the social within the values, ideologies and practices of CSE institutions and then develop ways to measure and evaluate the field's contributions to society. PMID:28790936

  13. Incremental Centrality Algorithms for Dynamic Network Analysis

    DTIC Science & Technology

    2013-08-01

    encouragement he gave me to complete my degree. Last but not least, I would like to thank CASOS members for insightful discussions and feedback they gave me at...Systems ( CASOS ) under the Institute for Software Research within the School of Computer Science (SCS) at Carnegie Mellon University (CMU). Financial...discusses several ways of generalizing betweenness 23 centrality including scaling of values with respect to length, inclusion of end-points in the

  14. Georgetown Institute for Cognitive and Computational Sciences

    DTIC Science & Technology

    2000-03-01

    in neuronal apoptosis. Cerebellar granules cells (CGCs) were co-transfected with a green fluorescent protein reporter and one of several hammerhead ... ribozymes constructed to cleave caspase-3 RNA. Use of such ribozymes is highly selective. In separate experiments we co-transfected with a gene that...expressing a ribozyme against rat caspase-3. Apoptosis was assessed after 24, 36, or 48 h of serum/K+ deprivation. In negative control cells expressing ß

  15. The Computer as a Tool for Learning through Reflection.

    DTIC Science & Technology

    1986-03-01

    different accents and backgrounds (e.g., Vanessa Redgrave, Martin Luther King, and Ricardo Montalban). Thus students can compare how they read the...Coordinated Science Laboratory Santa Barbara, CA 93106 University of Illinois Urbana, IL 61801 Edward E. Eddowes CNATRA N301 Goery Delacote Naval Air Station...DC 20052 Dr. James G. Greeno University of California Dr Jim Hollan Berkeley. CA 94720 Intelligent Systems Group Institute for Prof Edward Haertel

  16. Modeling Laser Damage Thresholds Using the Thompson-Gerstman Model

    DTIC Science & Technology

    2014-10-01

    Gerstman model was intended to be a modular tool fit for integration into other computational models. This adds usability to the standalone code...Advanced Study Institute, Series A – Life Sciences, Vol. 34, pp. 77-97. New York: Plenum Press . 4. Birngruber, R., V.-P. Gabel and F. Hillenkamp...Random granule placement - varies with melnum. ; ii. Depth averaging or shadowing - varies with melnum. ; iii. T(r,t) single granule calc

  17. CSRI Summer Proceedings 2010

    DTIC Science & Technology

    2010-12-17

    AND ADDRESSES U.S. Army Research Office P.O. Box 12211 Research Triangle Park, NC 27709-2211 15. SUBJECT TERMS Mathematics; Computer Science Eric C...Institute at Sandia National Laboratories Editors: Eric C. Cyr and S. Scott Collis Sandia National Laboratories December 17, 2010 SAND2010-8783P...CSRI and its activities which have benefited both Sandia and the greater research community. Eric C. Cyr S. Scott Collis December 17, 2010 iv CSRI

  18. A Requirements Analysis Model for Selection of Personal Computer (PC) software in Air Force Organizations

    DTIC Science & Technology

    1988-09-01

    Institute of Technology Air University In Partial Fulfillment of the Requirements for the Degree of Master of Science in Systems Management Dexter R... management system software Diag/Prob Diagnosis and problem solving or problem finding GR Graphics software Int/Transp Interoperability and...language software Plan/D.S. Planning and decision support or decision making PM Program management software SC Systems for Command, Control, Communications

  19. JPRS Report, Science & Technology, USSR: Computers

    DTIC Science & Technology

    1987-09-29

    Reliability of Protected Systems (L.S. Stoykova, O.A. Yushchenko; KIBERNETIKA, No 5, Sep-Oct 86) U Decision Making Based on Analysis of a Decision...34 published by the Central Scientific Research Institute for Information and Technoeconomic Research on Material and Technical Supply (TsNIITEIMS) of the...was said becomes clear after a subconscious analysis of the context. We have built our device according to the same pattern. In contrast to its

  20. A Transcript Analysis of Graduates of Three Community College of Philadelphia Curricula between the Years 1985 and 1992. Institutional Research Report #83.

    ERIC Educational Resources Information Center

    Terzian, Aram L.; Obetz, Wayne S.

A study was conducted at the Community College of Philadelphia (CCP) to examine the course-taking patterns of 94 graduates of the associate in arts (AA) curriculum, 1,957 graduates of the associate in general studies (AGS) curriculum, and 99 graduates of the associate in science (AS) curriculum. Using a computer-based approach to transcript…

  1. Semiannual Report for Contract NAS1-19480 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1994-06-01

    algorithms for large, irreducibly coupled systems iteratively solve concurrent problems within different subspaces of a Hilbert space, or within different...effective on problems amenable to SIMD solution. Together with researchers at AT&T Bell Labs (Boris Lubachevsky, Albert Greenberg ) we have developed...reasonable measurement. In the study of different speedups, various causes of superlinear speedup are also presented. Greenberg , Albert G., Boris D

  2. Minority University-Space Interdisciplinary Network Conference Proceedings of the Seventh Annual Users' Conference

    NASA Technical Reports Server (NTRS)

    Harrington, James L., Jr.; Brown, Robin L.; Shukla, Pooja

    1998-01-01

Seventh annual conference proceedings of the Minority University-Space Interdisciplinary Network (MU-SPIN) conference. MU-SPIN is cosponsored by NASA Goddard Space Flight Center and the National Science Foundation, and is a comprehensive educational initiative for Historically Black Colleges and Universities, and minority universities. MU-SPIN focuses on the transfer of advanced computer networking technologies to these institutions and their use for supporting multidisciplinary research.

  3. What’s Wrong With Automatic Speech Recognition (ASR) and How Can We Fix It?

    DTIC Science & Technology

    2013-03-01

    Jordan Cohen International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 MARCH 2013 Final Report ...This report was cleared for public release by the 88th Air Base Wing Public Affairs Office and is available to the general public, including foreign...711th Human Performance Wing Air Force Research Laboratory This report is published in the interest of scientific and technical

  4. Opening Remarks: SciDAC 2007

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2007-09-01

Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing: a road e e cummings described as '... never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL, together with ORNL and ANL, hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than the President, the Senate, or the House had proposed.
As a consequence, many of you had to absorb a no-cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool along with theory and experiment in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta-scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey. Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface.
The simulation of the El Nino/Southern Oscillation that was part of this study has been characterized as 'the most impressive new result in ten years'. Researchers also gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P; a 102 teraflop Cray XT4 at NERSC; and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops with the replacement of the dual-core processors with quad-core processors, Argonne will upgrade to between 250 and 500 teraflops, and next year a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools which scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools which scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects.
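The percent-of-peak figure quoted for the AORSA run is straightforward arithmetic: sustained throughput divided by the machine's theoretical peak. A minimal sketch, taking the 119-teraflop Oak Ridge Cray XT mentioned in these remarks as the peak (an assumption; the remarks do not state which peak the 74% refers to):

```python
def percent_of_peak(sustained_tflops: float, peak_tflops: float) -> float:
    """Sustained performance as a percentage of theoretical peak."""
    return 100.0 * sustained_tflops / peak_tflops

# The AORSA run: ~87 TF sustained on a system with ~119 TF peak,
# giving roughly the 74% quoted above.
pct = percent_of_peak(87.0, 119.0)
```

This ratio (sustained vs. peak) is the standard efficiency measure for such runs; anything above ~50% on tens of thousands of processors is considered unusually good scaling.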
The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion, and will address issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base, conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded between ASCR and NNSA, is more than a traditional academic fellowship. It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industries such as Procter and Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists.
To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest and most precise and efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel 'Sudden Mischief': 'We stick together long enough, and we may get as smart as hell'. Michael Strayer

  5. Informal Science Institutions and Learning to Teach: An Examination of Identity, Agency, and Affordances

    ERIC Educational Resources Information Center

    Adams, Jennifer D.; Gupta, Preeti

    2017-01-01

    Informal science education institutions play an important role in the public understanding of science and, because of this, are well-positioned to positively impact science teacher education. Informal science institutions (ISIs) have a range of affordances that could contribute to learner-centered science teacher identity development. This article…

  6. The Tanenbaum Open Science Institute: Leading a Paradigm Shift at the Montreal Neurological Institute.

    PubMed

    Poupon, Viviane; Seyller, Annabel; Rouleau, Guy A

    2017-08-30

    The Montreal Neurological Institute is adopting an Open Science Policy that will be enacted by the Tanenbaum Open Science Institute. The aim is to accelerate the generation of knowledge and novel effective treatments for brain disorders by freeing science. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. 77 FR 52704 - Notice of Submission for OMB Review; Institute of Education Sciences; Early Childhood...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... DEPARTMENT OF EDUCATION Notice of Submission for OMB Review; Institute of Education Sciences... for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S... Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education (ED...

  8. Evaluating outcomes of computer-based classroom testing: Student acceptance and impact on learning and exam performance.

    PubMed

    Zheng, Meixun; Bender, Daniel

    2018-03-13

    Computer-based testing (CBT) has made progress in health sciences education. In 2015, the authors led implementation of a CBT system (ExamSoft) at a dental school in the U.S. Guided by the Technology Acceptance Model (TAM), the purposes of this study were to (a) examine dental students' acceptance of ExamSoft; (b) understand factors impacting acceptance; and (c) evaluate the impact of ExamSoft on students' learning and exam performance. Survey and focus group data revealed that ExamSoft was well accepted by students as a testing tool and acknowledged by most for its potential to support learning. Regression analyses showed that perceived ease of use and perceived usefulness of ExamSoft significantly predicted student acceptance. Prior CBT experience and computer skills did not significantly predict acceptance of ExamSoft. Students reported that ExamSoft promoted learning in the first program year, primarily through timely and rich feedback on examination performance. t-Tests yielded mixed results on whether students performed better on computerized or paper examinations. The study contributes to the literature on CBT and the application of the TAM model in health sciences education. Findings also suggest ways in which health sciences institutions can implement CBT to maximize its potential as an assessment and learning tool.

  9. Experience of validation and tuning of turbulence models as applied to the problem of boundary layer separation on a finite-width wedge

    NASA Astrophysics Data System (ADS)

    Babulin, A. A.; Bosnyakov, S. M.; Vlasenko, V. V.; Engulatova, M. F.; Matyash, S. V.; Mikhailov, S. V.

    2016-06-01

    Modern differential turbulence models are validated by computing a separation zone generated in the supersonic flow past a compression wedge lying on a plate of finite width. The results of three- and two-dimensional computations based on the (q-ω), SST, and Spalart-Allmaras turbulence models are compared with experimental data obtained for 8°, 25°, and 45° wedges by A.A. Zheltovodov at the Institute of Theoretical and Applied Mechanics of the Siberian Branch of the Russian Academy of Sciences. An original law-of-the-wall boundary condition and modifications of the SST model intended for improving the quality of the computed separation zone are described.

  10. Econophysics and evolutionary economics (Scientific session of the Physical Sciences Division of the Russian Academy of Sciences, 2 November 2010)

    NASA Astrophysics Data System (ADS)

    2011-07-01

    The scientific session "Econophysics and evolutionary economics" of the Division of Physical Sciences of the Russian Academy of Sciences (RAS) took place on 2 November 2010 in the conference hall of the Lebedev Physical Institute, Russian Academy of Sciences. The session agenda announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Maevsky V I (Institute of Economics, RAS, Moscow) "The transition from simple reproduction to economic growth"; (2) Yudanov A Yu (Financial University of the Government of the Russian Federation, Moscow) "Experimental data on the development of fast-growing innovative companies in Russia"; (3) Pospelov I G (Dorodnitsyn Computation Center, RAS, Moscow) "Why is it sometimes possible to successfully model an economy?"; (4) Chernyavskii D S (Lebedev Physical Institute, RAS, Moscow) "Theoretical economics"; (5) Romanovskii M Yu (Prokhorov Institute of General Physics, RAS, Moscow) "Nonclassical random walks and the phenomenology of fluctuations of the yield of securities in the securities market"; (6) Dubovikov M M, Starchenko N V (INTRAST Management Company, Moscow Engineering Physics Institute, Moscow) "Fractal analysis of financial time series and the prediction problem". Papers written on the basis of these reports are published below.
• The transition from simple reproduction to economic growth, V I Maevsky, S Yu Malkov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 729-733
• High-growth firms in Russia: experimental data and prospects for the econophysical simulation of economic modernization, A Yu Yudanov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 733-737
• Equilibrium models of economics in the period of a global financial crisis, I G Pospelov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 738-742
• On econophysics and its place in modern theoretical economics, D S Chernavskii, N I Starkov, S Yu Malkov, Yu V Kosse, A V Shcherbakov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 742-749
• Nonclassical random walks and the phenomenology of fluctuations of securities returns in the stock market, P V Vidov, M Yu Romanovsky, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 749-753
• Econophysics and the fractal analysis of financial time series, M M Dubovikov, N V Starchenko, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 754-761

  11. NIH's National Institute of General Medical Sciences celebrates 45 years of Discovery for Health

    MedlinePlus

    ... Alison Davis NIH's National Institute of General Medical Sciences celebrates 45 years of Discovery for Health The National Institute of General Medical Sciences (NIGMS) is the NIH institute that primarily supports ...

  12. Animating climate model data

    NASA Astrophysics Data System (ADS)

    DaPonte, John S.; Sadowski, Thomas; Thomas, Paul

    2006-05-01

    This paper describes a collaborative project conducted by the Computer Science Department at Southern Connecticut State University and NASA's Goddard Institute for Space Science (GISS). Animations of output from a climate simulation model used at GISS to predict rainfall and circulation have been produced for West Africa from June to September 2002. These early results have assisted scientists at GISS in evaluating the accuracy of the RM3 climate model when compared to similar results obtained from satellite imagery. The results presented below will be refined to better meet the needs of GISS scientists and will be expanded to cover other geographic regions for a variety of time frames.

  13. Creating a Podcast/Vodcast: A How-To Approach

    NASA Astrophysics Data System (ADS)

    Petersen, C. C.

    2011-09-01

    Creating podcasts and vodcasts is a wonderful way to share news of science research. Public affairs officers use them to reveal the latest discoveries done by scientists in their institutions. Educators can offer podcast/vodcast creation for students who want a unique way to demonstrate their mastery of science topics. Anyone with a computer and a USB microphone can create a podcast. To do a vodcast, you also need a digital video camera and video editing software. This session focused mainly on creating a podcast - writing the script and recording the soundtrack. Attendees also did a short activity to learn to write effective narrative copy for a podcast/vodcast.

  14. Astronomy Education in Greece

    NASA Astrophysics Data System (ADS)

    Metaxa, M.

    Basic education is fundamental to higher education and scientific and technological literacy. We can confront the widespread adult ignorance and apathy about science and technology. Astronomy, an interdisciplinary science, enhances students' interest and overcomes educational problems. Three years ago, we developed astronomy education in these ways: 1. Summer School for School Students. (50 students from Athens came to the first Summer School in Astrophysics at the National Observatory, September 2-5, 1996, for lectures by professional astronomers and to be familiarized with observatory instruments.) 2. Introducing Students to Research. (This teaches students more about science so they are more confident about it. Our students have won top prizes in European research contests for their studies of objects on Schmidt plates and computations on PCs.) 3. Hands-on Activities. (Very important because they bring students close to their natural environment. Activities are: variable-star observations (AAVSO), the Eratosthenes project, and solar-eclipse, sunspot and comet studies.) 4. Contact with Professional Astronomers and Institutes. (These help students reach their social environment and motivate them as "science carriers". We try to make contacts at astronomical events, and through visits to appropriate institutions.) 5. Internet Programs. (Students learn about and familiarize themselves with their technological environment.) 6. Laboratory Exercises. (Students should do science, not just learn about it. We introduced the following laboratory exercises: supernova remnants and galaxy classification, both from Schmidt plates, and the celestial sphere.)

  15. A Kenyan perspective on the use of animals in science education and scientific research in Africa and prospects for improvement

    PubMed Central

    Kimwele, Charles; Matheka, Duncan; Ferdowsian, Hope

    2011-01-01

    Introduction Animal experimentation is common in Africa, a region that accords little priority to animal protection in comparison to economic and social development. The current study aimed to investigate the prevalence of animal experimentation in Kenya and to review shortfalls in policy, legislation, implementation and enforcement that result in inadequate animal care in Kenya and other African nations. Methods Data were collected using questionnaires administered at 39 highly ranked academic and research institutions, aiming to identify those that used animals, their sources of animals, and their application of the three Rs. Perceived challenges to the use of non-animal alternatives and common methods of euthanasia were also queried. Data were analyzed using Epidata, SPSS 16.0 and Microsoft Excel. Results Thirty-eight (97.4%) of thirty-nine institutions reported using animals for education and/or research. Thirty (76.9%) institutions reported using analgesics or anesthetics on a regular basis. Thirteen (33.3%) institutions regularly used statistical methods to minimize the use of animals. Overall, sixteen (41.0%) institutions explored the use of alternatives to animals such as cell cultures and computer simulation techniques, with one (2.6%) academic institution having completely replaced animals with computer modeling, manikins and visual illustrations. The commonest form of euthanasia employed was chloroform administration, reportedly in fourteen (29.8%) of 47 total methods (some institutions used more than one method). Twenty-eight (71.8%) institutions had no designated ethics committee to review or monitor protocols using animals. Conclusion Animals are commonly used in academic and research institutions in Kenya. The relative lack of ethical guidance and oversight regarding the use of animals in research and education presents significant concerns. PMID:22355442

  16. A Kenyan perspective on the use of animals in science education and scientific research in Africa and prospects for improvement.

    PubMed

    Kimwele, Charles; Matheka, Duncan; Ferdowsian, Hope

    2011-01-01

    Animal experimentation is common in Africa, a region that accords little priority to animal protection in comparison to economic and social development. The current study aimed to investigate the prevalence of animal experimentation in Kenya and to review shortfalls in policy, legislation, implementation and enforcement that result in inadequate animal care in Kenya and other African nations. Data were collected using questionnaires administered at 39 highly ranked academic and research institutions, aiming to identify those that used animals, their sources of animals, and their application of the three Rs. Perceived challenges to the use of non-animal alternatives and common methods of euthanasia were also queried. Data were analyzed using Epidata, SPSS 16.0 and Microsoft Excel. Thirty-eight (97.4%) of thirty-nine institutions reported using animals for education and/or research. Thirty (76.9%) institutions reported using analgesics or anesthetics on a regular basis. Thirteen (33.3%) institutions regularly used statistical methods to minimize the use of animals. Overall, sixteen (41.0%) institutions explored the use of alternatives to animals such as cell cultures and computer simulation techniques, with one (2.6%) academic institution having completely replaced animals with computer modeling, manikins and visual illustrations. The commonest form of euthanasia employed was chloroform administration, reportedly in fourteen (29.8%) of 47 total methods (some institutions used more than one method). Twenty-eight (71.8%) institutions had no designated ethics committee to review or monitor protocols using animals. Animals are commonly used in academic and research institutions in Kenya. The relative lack of ethical guidance and oversight regarding the use of animals in research and education presents significant concerns.

  17. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. RIACS is chartered to carry out research and development in computer science, devoted mainly to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  18. Benchmarks: Reports of the NASA Science Institutes Team

    NASA Technical Reports Server (NTRS)

    Diaz, A. V.

    1995-01-01

    This report results from a benchmarking study undertaken by NASA as part of its planning for the possible creation of new science Institutes. Candidate Institutes under consideration cover a range of scientific and technological activities ranging from biomedical to astrophysical research and from the global hydrological cycle to microgravity material science. Should NASA create these Institutes, the intent will be to preserve and strengthen key science and technology activities now being performed by Government employees at NASA Field Centers. Because the success of these projected non-Government-operated Institutes is vital for the continued development of space science and applications, NASA has sought to identify the best practices of successful existing scientific and technological research institutions as they carry out those processes that will be most important for the new science Institutes. While many individuals and organizations may be interested in our findings, the primary use of this report will be to formulate plans for establishing the new science Institutes. As a result, the report is organized so that the "best practices" of the finest institutes are associated with characteristics of all institutes. These characteristics or "attributes" serve as the headings for the main body of this report.

  19. Information technology industry certification's impact on undergraduate student perception of instructor effectiveness

    NASA Astrophysics Data System (ADS)

    Andersson, David L.

    The field of Computer Information Systems (CIS) or Information Technology (IT) is experiencing rapid change. A 2003 study analyzing the IT degree programs and those of competing disciplines at 10 post-secondary institutions concluded that information technology programs are perceived differently from information systems and computer science programs and are significantly less focused on both math and pure science subjects. In Information Technology programs, voluntary professional certifications, generally known in the Information Technology field as "IT" certifications, are used as indicators of professional skill. A descriptive study noting one subject group's responses to items that were nearly identical except for IT certification information was done to investigate undergraduate CIS/IT student perceptions of IT industry certified instructors. The subject group was composed of undergraduate CIS/IT students from a regionally accredited private institution and a public institution. The methodology was descriptive, based on a previous model by Dr. McKillip, Professor of Psychology, Southern Illinois University at Carbondale, utilizing a web-based survey instrument with a Likert scale, providing for voluntary anonymous responses outside the classroom over a ten-day window. The results indicated that IT certification affected student perceptions of instructor effectiveness, teaching methodology, and student engagement in the class, and to a lesser degree, instructor technical qualifications. The implications suggest that additional research on this topic is merited. Although the study was not designed to examine the precise cause and effect, an important implication is that students may be motivated to attend classes taught by instructors they view as more confident and effective and that teachers with IT industry certification can better engage their students.

  20. An Integrative and Collaborative Approach to Creating a Diverse and Computationally Competent Geoscience Workforce

    NASA Astrophysics Data System (ADS)

    Moore, S. L.; Kar, A.; Gomez, R.

    2015-12-01

    A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve some of the world's most challenging geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin.
Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.

  1. Development of an Agile Knowledge Engineering Framework in Support of Multi-Disciplinary Translational Research

    PubMed Central

    Borlawsky, Tara B.; Dhaval, Rakesh; Hastings, Shannon L.; Payne, Philip R. O.

    2009-01-01

    In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative. PMID:21347164

  2. Development of an agile knowledge engineering framework in support of multi-disciplinary translational research.

    PubMed

    Borlawsky, Tara B; Dhaval, Rakesh; Hastings, Shannon L; Payne, Philip R O

    2009-03-01

    In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative.

  3. Creating the Public Connection: Interactive Experiences with Real-Time Earth and Space Science Data

    NASA Technical Reports Server (NTRS)

    Reiff, Patricia H.; Ledley, Tamara S.; Sumners, Carolyn; Wyatt, Ryan

    1995-01-01

    The Houston Museum of Natural Science is less than two miles from Rice University, a major hub on the Internet. This project links these two institutions so that NASA real-time data and imagery can flow via Rice to the Museum, where it reaches the public in the form of planetarium programs, computer-based interactive kiosks, and space and Earth science problem-solving simulations. Through this program at least 200,000 visitors annually (including every 4th and 7th grader in the Houston Independent School District) will have direct exposure to the Earth and space research being conducted by NASA and available over the Internet. Each information conduit established between Rice University and the Houston Museum of Natural Science will become a model for public information dissemination that can be replicated nationally in museums, planetariums, Challenger Centers, and schools.

  4. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    Over the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental and/or observational (second) approaches. The variety of data yielded by the second approach has kept growing, owing to the progress of experimental and observational technologies. The amount of data generated by the third methodology has also kept growing, because of the tremendous development of supercomputers and programming techniques. Most of the data files created by both experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. The researchers (domain experts) are interested not only in how to carry out experiments and/or observations or perform numerical simulations, but in what information (new findings) can be extracted from the data. However, data do not usually say anything about the science by themselves; the science is implicitly hidden in the data. Researchers have to extract information from the data files to find new science. This is a basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments and/or observations and numerical simulations get larger, new techniques and facilities are required to extract information from the large numbers of data files. This technique is called informatics, a fourth methodology for new sciences. Any methodology must work on its own facilities: for example, in space science, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics, which has been developed at NICT (National Institute of Information and Communications Technology), Japan.
The NICT science cloud, which we have named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage systems, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue and preservation, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. The figure (not shown here) is a schematic picture of the NICT science cloud. Both types of data, from observation and simulation, are stored in the storage system in the science cloud. It should be noted that there are two types of observational data. One comes from archive sites outside the cloud: this data is downloaded through the Internet to the cloud. The other comes from equipment directly connected to the science cloud; such configurations are often called sensor clouds. In the present talk, we first introduce the NICT science cloud. We then demonstrate its efficiency, showing several scientific results that we achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for any research field will be revealed.

  5. 75 FR 39001 - Notice Inviting Comments on Priorities To Be Proposed to the National Board for Education...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-07

    ... Proposed to the National Board for Education Sciences of the Institute of Education Sciences AGENCY... be proposed to the National Board for Education Sciences of the Institute of Education Sciences... the work of the Institute. The National Board for Education Sciences (Board) must approve the...

  6. 78 FR 12369 - United States Government Policy for Institutional Oversight of Life Sciences Dual Use Research of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... Oversight of Life Sciences Dual Use Research of Concern AGENCY: Office of Science and Technology Policy... comments on the proposed United States Government Policy for Institutional Oversight of Life Sciences Dual... requirements for certain categories of life sciences research at institutions that accept Federal funding for...

  7. Georgetown Institute for Cognitive and Computational Sciences

    DTIC Science & Technology

    2004-04-01

    lumbar DRG after formalin injection into the hindpaw. Dilute formalin (1.8%) was injected into the rat hindpaw and DRG were harvested 30 minutes later...staining (Figure 140, arrows) on the ipsilateral side to nerve crush. In the lumbar spinal cord, the site of sciatic innervation, there was a dramatic...Proteases in traumatic brain injury. Proteases in Biology and Disease, Volume 3: Proteases in the Brain, Edited by Nigel Hooper and Uwe Lendeckel, in

  8. Implementation of a Fully-Balanced Periodic Tridiagonal Solver on a Parallel Distributed Memory Architecture

    DTIC Science & Technology

    1994-05-01

    PARALLEL DISTRIBUTED MEMORY ARCHITECTURE, T. M. Eidson, G. Erlebacher, Contract NAS1-19480, May 1994...DISTRIBUTED MEMORY ARCHITECTURE, T.M. Eidson, High Technology Corporation, Hampton, VA 23665; G. Erlebacher, Institute for Computer Applications in Science and...developed and evaluated. Simple model calculations as well as timing results are presented to evaluate the various strategies. The particular

  9. Uncovering and Managing the Impact of Methodological Choices for the Computational Construction of Socio-Technical Networks from Texts

    DTIC Science & Technology

    2012-09-01

    supported by the National Science Foundation (NSF) IGERT 9972762, the Army Research Institute (ARI) W91WAW07C0063, the Army Research Laboratory (ARL/CTA...prediction models in AutoMap .................................................. 144   Figure 13: Decision Tree for prediction model selection in...generated for nationally funded initiatives and made available through the Linguistic Data Consortium (LDC). An overview of these datasets is provided in

  10. The American Indian Summer Institute in Earth System Science (AISESS) at UC Irvine: A Two-Week Residential Summer Program for High School Students

    NASA Astrophysics Data System (ADS)

    Johnson, K. R.; Polequaptewa, N.; Leon, Y.

    2012-12-01

    Native Americans remain severely underrepresented in the geosciences, despite a clear need for qualified geoscience professionals within Tribal communities to address critical issues such as natural resource and land management, water and air pollution, and climate change. In addition to the need for geoscience professionals within Tribal communities, increased participation of Native Americans in the geosciences would enhance the overall diversity of perspectives represented within the Earth science community and lead to improved Earth science literacy within Native communities. To address this need, the Department of Earth System Science and the American Indian Resource Program at the University of California, Irvine have organized a two-week residential American Indian Summer Institute in Earth System Science (AISESS) for high-school students (grades 9-12) from throughout the nation. The format of the AISESS program is based on the highly successful framework of a previous NSF-funded American Indian Summer Institute in Computer Science (AISICS) at UC Irvine and involves key senior personnel from the AISICS program. The AISESS program, however, incorporates a week of camping on the La Jolla Band of Luiseño Indians reservation in Northern San Diego County, California. Following the week of camping and field projects, the students spend a week on the campus of UC Irvine participating in Earth System Science lectures, laboratory activities, and tours. The science curriculum is closely woven together with cultural activities, native studies, and communication skills programs. The program culminates with a closing ceremony during which students present poster projects on environmental issues relevant to their tribal communities. The inaugural AISESS program took place from July 15th-28th, 2012. We received over 100 applications from Native American high school students from across the nation. We accepted 40 students for the first year, of which 34 attended the program.
The objective of the program is to introduce students to Earth System Science and, hopefully, inspire them to pursue Earth or Environmental Science degrees. Towards this end, we developed a fairly broad curriculum which will be presented here. Evaluation planning was conducted during the first quarter of 2012 during recruitment. A longitudinal database was established for the project to track college preparatory course-taking, GPA, school attendance, participation in earth science activities, and attitudes and interest in attending college and completing a degree after high school. Based on attendance during AISESS, schools and students will be selected as descriptive case studies. A pre-post design for evaluating the Summer Institute includes a survey about student background, attitudes, and knowledge about preparing to complete high school and attend college after graduation and focus groups of participants immediately after the Institute to capture qualitative data about their experiences in the field and at the University. Initial evaluation results will be presented here.

  11. eHealth Research from the User’s Perspective

    PubMed Central

    Hesse, Bradford W.; Shneiderman, Ben

    2007-01-01

    The application of Information Technology (IT) to issues of healthcare delivery has had a long and tortuous history in the U.S. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask “what can the computer do?” New advances in eHealth are prompting developers to ask “what can people do?” How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a health care system that is (a) safe, (b) effective (evidence-based), (c) patient-centered, and (d) timely. Relying on the eHealth researcher’s intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient/physician), group (family/staff), community, and broad environmental levels. PMID:17466825

  12. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  13. A New Paradigm of Engineering Education for the 21st Century: Perspectives of Rose-Hulman Institute of Technology

    NASA Astrophysics Data System (ADS)

    Western, Arthur; Stamper, Richard

    Strategic initiatives for engineering education in the next decade as planned by the Rose-Hulman Institute of Technology are presented. The Rose-Hulman Institute of Technology is a private college in the United States that specializes in undergraduate engineering, mathematics and science education. The initiatives are in response to broad changes in the practice of the engineering profession in its modern global context. The initiatives comprise five strategic thrust areas and five programmatic themes. The thrust areas are: Energy and Environment; Health and Safety; Transportation; Materials; and Information, Computation, and Communication. The programmatic themes are: Excellence in Education; International Awareness; Business Awareness; Service Learning; and Life-long Learning. The objective of these initiatives is to prepare students to meet the challenges of the 21st century and to serve as leaders in society.

  14. The EDRN knowledge environment: an open source, scalable informatics platform for biological sciences research

    NASA Astrophysics Data System (ADS)

    Crichton, Daniel; Mahabal, Ashish; Anton, Kristen; Cinquini, Luca; Colbert, Maureen; Djorgovski, S. George; Kincaid, Heather; Kelly, Sean; Liu, David

    2017-05-01

    We describe here the Early Detection Research Network (EDRN) for Cancer's knowledge environment. It is an open source platform built by NASA's Jet Propulsion Laboratory with contributions from the California Institute of Technology and the Geisel School of Medicine at Dartmouth. It uses tools like Apache OODT, Plone, and Solr, and borrows heavily from JPL's Planetary Data System's ontological infrastructure. It has accumulated data on hundreds of thousands of biospecimens and serves over 1300 registered users across the National Cancer Institute (NCI). The scalable computing infrastructure is built so that we can reach out to other agencies, provide homogeneous access, and provide seamless analytics support and bioinformatics tools through community engagement.

  15. Milestones toward Majorana-based quantum computing

    NASA Astrophysics Data System (ADS)

    Alicea, Jason

    Experiments on nanowire-based Majorana platforms now appear poised to move beyond the preliminary problem of zero-mode detection and towards loftier goals of realizing non-Abelian statistics and quantum information applications. Using an approach that synthesizes recent materials growth breakthroughs with tools long successfully deployed in quantum-dot research, I will outline a number of relatively modest milestones that progressively bridge the gap between the current state of the art and these grand longer-term challenges. The intermediate Majorana experiments surveyed in this talk should be broadly adaptable to other approaches as well. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.

  16. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
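The billion-way concurrency figure in the abstract follows from simple arithmetic: with clock rates capped near a few GHz, reaching 10^18 operations per second requires on the order of a billion operations in flight at once. A back-of-the-envelope check, where the 2 GHz clock and one operation per cycle per lane are illustrative assumptions rather than figures from the talk:

```python
# Back-of-the-envelope concurrency estimate for an exascale machine.
target_ops_per_sec = 1e18   # exascale: 10**18 operations per second
clock_hz = 2e9              # clock rates limited to "at most a few GHz"
ops_per_cycle_per_lane = 1  # illustrative assumption

concurrent_ops = target_ops_per_sec / (clock_hz * ops_per_cycle_per_lane)
print(f"{concurrent_ops:.0e} operations must be in flight at once")
# ~5e8 with these assumptions, i.e. on the order of a billion
```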

  17. 75 FR 38100 - National Institute of Environmental Health Sciences Superfund Hazardous Substance Research and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-01

    ...: Notice. SUMMARY: The National Institute of Environmental Health Sciences (NIEHS), a research institute of... translation. Research translation fosters the movement of fundamental science toward a useable end-product. It... Innovation Promote transdisciplinary science. SRP firmly supports transdisciplinary research--the synthesis...

  18. Citizen Science and Event-Based Science Education with the Quake-Catcher Network

    NASA Astrophysics Data System (ADS)

    DeGroot, R. M.; Sumy, D. F.; Benthien, M. L.

    2017-12-01

    The Quake-Catcher Network (QCN, quakecatcher.net) is a collaborative, citizen-science initiative to develop the world's largest, low-cost strong-motion seismic network through the utilization of sensors in laptops and smartphones or small microelectromechanical systems (MEMS) accelerometers attached to internet-connected computers. The volunteer computers monitor seismic motion and other vibrations and send the "triggers" in real-time to the QCN server hosted at the University of Southern California. The QCN servers sift through these signals and determine which ones represent earthquakes and which ones represent cultural noise. Data collected by the Quake-Catcher Network can contribute to better understanding earthquakes, provide teachable moments for students, and engage the public with authentic science experiences. QCN partners coordinate sensor installations, develop QCN's scientific tools and engagement activities, and create next generation online resources. In recent years, the QCN team has installed sensors in over 225 K-12 schools and free-choice learning institutions (e.g. museums) across the United States and Canada. One of the current goals of the program in the United States is to establish several QCN stations in K-12 schools around a local museum hub as a means to provide coordinated and sustained educational opportunities leading up to the yearly Great ShakeOut Earthquake Drill, to encourage citizen science, and enrich STEM curriculum. Several school districts and museums throughout Southern California have been instrumental in the development of QCN. For educators QCN fulfills a key component of the Next Generation Science Standards where students are provided an opportunity to utilize technology and interface with authentic scientific data and learn about emerging programs such as the ShakeAlert earthquake early warning system. 
For example, Sunnylands Center in Rancho Mirage, CA leads Coachella Valley Hub, which serves 31 K-12 schools, many of which are within kilometers of the San Andreas fault. Sunnylands established contact with the schools and organized the installations. Since 2016, representatives from the Incorporated Research Institutions for Seismology (IRIS), the Southern California Earthquake Center (SCEC), and the U.S. Geological Survey manage QCN.
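The abstract does not describe how the QCN servers separate earthquake triggers from cultural noise. A standard trigger heuristic in seismology, shown here purely as an illustration and not as QCN's actual algorithm, is the short-term-average over long-term-average (STA/LTA) energy ratio, which spikes when ground motion rises abruptly against the recent background:

```python
import numpy as np

def sta_lta(trace, sta_len, lta_len):
    """Short-term-average / long-term-average energy ratio.
    A sudden onset of shaking drives the ratio well above 1.
    (Illustrative only; not QCN's published trigger logic.)"""
    energy = np.asarray(trace, dtype=float) ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return np.where(lta > 0, sta / lta, 0.0)

# Synthetic record: quiet background with a burst of strong motion.
trace = np.full(2000, 0.01)
trace[1200:1300] = 1.0          # simulated shaking
ratio = sta_lta(trace, sta_len=20, lta_len=800)
# Near the burst the ratio is several times larger than in quiet
# sections, which is the kind of signal a trigger threshold acts on.
```

A real deployment would compare the ratio against a threshold, require agreement across multiple stations, and only then declare an event, which is roughly the server-side sifting the abstract alludes to.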

  19. Political Science Careers at Comprehensive Universities: Building Balanced Careers at "Greedy" Institutions

    ERIC Educational Resources Information Center

    Hendrickson, Ryan C.; Mueller, Melinda A.; Strand, Jonathan R.

    2011-01-01

    A considerable amount of research exists about political science careers at community colleges and liberal arts institutions, as well as about training and hiring practices across different types of institutions. However, there is virtually no commentary available on political science careers at comprehensive institutions, where a significant…

  20. PREFACE: Rusnanotech 2010 International Forum on Nanotechnology

    NASA Astrophysics Data System (ADS)

    Kazaryan, Konstantin

    2011-03-01

    The Rusnanotech 2010 International Forum on Nanotechnology was held from November 1-3, 2010, in Moscow, Russia. It was the third forum organized by RUSNANO (Russian Corporation of Nanotechnologies) since 2008. In March 2011 RUSNANO was established as an open joint-stock company through the reorganization of the state corporation Russian Corporation of Nanotechnologies. RUSNANO's mission is to develop the Russian nanotechnology industry through co-investment in nanotechnology projects with substantial economic potential or social benefit. Within the framework of the Forum Science and Technology Program, presentations on key trends of nanotechnology development were given by foreign and Russian scientists, R&D officers of leading international companies, universities and scientific centers. The science and technology program of the Forum was divided into eight sections as follows (by following hyperlinks you may find each section's program including videos of all oral presentations): Catalysis and Chemical Industry; Nanobiotechnology; Nanodiagnostics; Nanoelectronics; Nanomaterials; Nanophotonics; Nanotechnology in the Energy Industry; and Nanotechnology in Medicine. The scientific program of the forum included 115 oral presentations by leading scientists from 15 countries. Among them, in the "Nanomaterials" section, was the lecture by Dr Konstantin Novoselov, winner of the Nobel Prize in Physics 2010. The poster session consisted of over 500 presentations, 300 of which were presented in the framework of the young scientists' nanotechnology papers competition. This volume of the Journal of Physics: Conference Series includes a selection of 57 submissions.
The scientific program committee:
Prof Zhores Alferov, Academician, Vice-president of the Russian Academy of Sciences, Nobel Prize winner, Russia (Chairman of the Program Committee)
Prof Sergey Deev, Corresponding Member of the Russian Academy of Sciences; Head of the Laboratory of Molecular Immunology, M M Shemyakin and Yu A Ovchinnikov Institute of Bioorganic Chemistry, Russian Academy of Sciences, Russia (Deputy Chairman of the Program Committee)
Prof Alexander Aseev, Academician, Vice-president of the Russian Academy of Sciences; Director, A V Rzhanov Institute of Semiconductor Physics, Siberian Branch of the Russian Academy of Sciences, Russia
Prof Sergey Bagaev, Academician; Director, Institute of Laser Physics, Siberian Branch of the Russian Academy of Sciences, Russia
Prof Alexander Gintsburg, Academician, Russian Academy of Medical Sciences; Director, Gamaleya Research Institute of Epidemiology and Microbiology, Russian Academy of Medical Sciences, Russia
Prof Anatoly Grigoryev, Academician, Russian Academy of Sciences and Russian Academy of Medical Sciences; Vice-president, Russian Academy of Medical Sciences, Russia
Prof Michael Kovalchuk, RAS Corresponding Member; Director, Kurchatov Institute Russian Scientific Center, Russia
Prof Valery Lunin, Academician; Dean, Department of Chemistry, Lomonosov Moscow State University, Russia
Prof Valentin Parmon, Academician; Director, Boreskov Institute of Catalysis, Siberian Branch of the Russian Academy of Sciences, Russia
Prof Rem Petrov, Academician; Advisor, Russian Academy of Sciences, Russia
Prof Konstantin Skryabin, Academician; Director, Bioinzheneriya Center, Russian Academy of Sciences, Russia
Prof Vsevolod Tkachuk, Academician, Russian Academy of Sciences and Russian Academy of Medical Sciences; Dean, Faculty of Fundamental Medicine, Lomonosov Moscow State University, Russia
Prof Vladimir Fortov, Academician; Director, Joint Institute for High Temperatures, Russian Academy of Sciences, Russia
Prof Alexey Khokhlov, Academician; Vice Principal, Head of the Innovation, Information and International Scientific Affairs Department, Lomonosov Moscow State University, Russia
Prof Valery Bukhtiyarov, RAS Corresponding Member; Director, Physicochemical Research Methods Dept., Boreskov Institute of Catalysis, Siberian Branch of the Russian Academy of Sciences, Russia
Prof Anatoly Dvurechensky, RAS Corresponding Member; Deputy Director, Institute of Semiconductor Physics, Siberian Branch of the Russian Academy of Sciences, Russia
Prof Vladimir Kvardakov, Corresponding Member of the Russian Academy of Sciences; Executive Director, Kurchatov Center of Synchrotron Radiation and Nanotechnology, Russia
Prof Edward Son, Corresponding Member of the Russian Academy of Sciences; Scientific Deputy Director, Joint Institute for High Temperatures, Russian Academy of Sciences, Russia
Prof Andrey Gudkov; Senior Vice President, Basic Science; Chairman, Department of Cell Stress Biology, Roswell Park Cancer Institute, USA
Prof Robert Nemanich; Chair, Department of Physics, Arizona State University, USA
Prof Kandlikar Satish; Professor, Rochester Institute of Technology, USA
Prof Xiang Zhang; UC Berkeley, Director of the NSF Nano-scale Science and Engineering Center (NSEC), USA
Prof Andrei Zvyagin; Professor, Macquarie University, Australia
Prof Sergey Kalyuzhny; Director of the Scientific and Technological Expertise Department, RUSNANO, Russia
Konstantin Kazaryan, PhD; Expert of the Scientific and Technological Expertise Department, RUSNANO, Russia (Program Committee Secretary)
Simeon Zhavoronkov; Head of the Nanotechnology Programs Development Office, Rusnanotech Forum Fund for the Nanotechnology Development, Russia

Editors of the proceedings:
Section "Nanoelectronics" - Corresponding Member of the Russian Academy of Sciences, Professor Anatoly Dvurechenskii (Institute of Semiconductor Physics, RAS)
Section "Nanophotonics" - Professor Vasily Klimov (Institute of Physics, RAS)
Section "Nanodiagnostics" - Professor P Kashkarov (Russian Scientific Center, Kurchatov Institute)
Section "Nanotechnology for power engineering" - Corresponding Member of the Russian Academy of Sciences, Professor Eduard Son (Joint Institute for High Temperatures, RAS)
Section "Catalysis and chemical industry" - Member of the Russian Academy of Sciences, Professor Valentin Parmon (Institute of Catalysis SB RAS)
Section "Nanomaterials" - E Obraztsova, PhD (Institute of Physics, RAS); Marat Gallamov, PhD (Moscow State University)
Section "Nanotechnology in medicine" - Denis Logunov, PhD (Gamaleya Research Institute of Epidemiology and Microbiology, RAMS)
Section "Nanobiotechnology" - Member of the Russian Academy of Sciences, Professor Konstantin Skryabin (Bioengineering Center, RAS); Member of the Russian Academy of Sciences, Professor Rem Petrov (RAS); Corresponding Member of the Russian Academy of Sciences, Professor Sergey Deev (Institute of Bioorganic Chemistry)

  1. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlicher, Bob G; Kulesz, James J; Abercrombie, Robert K

    A principal tenet of the scientific method is that experiments must be repeatable, relying on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation of world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  2. How can we improve Science, Technology, Engineering, and Math education to encourage careers in Biomedical and Pathology Informatics?

    PubMed

    Uppal, Rahul; Mandava, Gunasheil; Romagnoli, Katrina M; King, Andrew J; Draper, Amie J; Handen, Adam L; Fisher, Arielle M; Becich, Michael J; Dutta-Moscato, Joyeeta

    2016-01-01

    The Computer Science, Biology, and Biomedical Informatics (CoSBBI) program was initiated in 2011 to expose the critical role of informatics in biomedicine to talented high school students.[1] By involving them in Science, Technology, Engineering, and Math (STEM) training at the high school level and providing mentorship and research opportunities throughout the formative years of their education, CoSBBI creates a research infrastructure designed to develop young informaticians. Our central premise is that the trajectory necessary to be an expert in the emerging fields of biomedical informatics and pathology informatics requires accelerated learning at an early age. In our 4th year of CoSBBI as a part of the University of Pittsburgh Cancer Institute (UPCI) Academy (http://www.upci.upmc.edu/summeracademy/), and our 2nd year of CoSBBI as an independent informatics-based academy, we enhanced our classroom curriculum, added hands-on computer science instruction, and expanded research projects to include clinical informatics. We also conducted a qualitative evaluation of the program to identify areas that need improvement in order to achieve our goal of creating a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics in the era of big data and personalized medicine.

  3. The Construction of a New Science by Means of an Institute and Its Communication Media: The Institute of Educational Sciences in Geneva (1912-1948)

    ERIC Educational Resources Information Center

    Hofstetter, Rita

    2004-01-01

    This article is aimed at understanding some mechanisms implied in the construction of educational sciences as a disciplinary field in interaction with other social fields. Its object is the Genevan "Institut des sciences de l'education" from 1912 to 1948. This institution is analyzed from two main points of view: the foundation and…

  4. ACToR-Aggregated Computational Resource | Science ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food & Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high throughput environmental chemical screening and prioritization program called ToxCast(TM).

  5. White paper: A plan for cooperation between NASA and DARPA to establish a center for advanced architectures

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.

    1986-01-01

    Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.

  6. 78 FR 66370 - National Institute of General Medical Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... General Medical Sciences; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; Peer Review of SCORE Grant Applications. Date: November 15, 2013. Time: 8... Officer, Office of Scientific Review, National Institute of General Medical Sciences, National Institutes...

  7. 77 FR 19678 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; NIH Loan Repayment Program for Clinical and Pediatric Research. Date... Scientific Review, National Institute of General Medical Sciences, National Institutes of Health, 45 Center...

  8. 76 FR 10911 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-28

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; Review of Minority Biomedical Research Support Applications. Date: March... Review, National Institute of General Medical Sciences, National Institutes of Health, 45 Center Drive...

  9. 75 FR 5771 - Institute of Education Sciences; Overview Information; Education Research and Special Education...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... Sciences (Institute) announces the Institute's FY 2011 competitions for grants to support education research and special education research. The Director takes this action under the Education Sciences Reform... mathematics or science. The data for this annual measure are based on What Works Clearinghouse (WWC) reviews...

  10. PREFACE: IUPAP C20 Conference on Computational Physics (CCP 2011)

    NASA Astrophysics Data System (ADS)

    Troparevsky, Claudia; Stocks, George Malcolm

    2012-12-01

    Increasingly, computational physics stands alongside experiment and theory as an integral part of the modern approach to solving the great scientific challenges of the day on all scales - from cosmology and astrophysics, through climate science, to materials physics, and the fundamental structure of matter. Computational physics touches aspects of science and technology with direct relevance to our everyday lives, such as communication technologies and securing a clean and efficient energy future. This volume of Journal of Physics: Conference Series contains the proceedings of the scientific contributions presented at the 23rd Conference on Computational Physics held in Gatlinburg, Tennessee, USA, in November 2011. The annual Conferences on Computational Physics (CCP) are dedicated to presenting an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas and from around the world. The CCP series has been in existence for more than 20 years, serving as a lively forum for computational physicists. The topics covered by this conference were: Materials/Condensed Matter Theory and Nanoscience, Strongly Correlated Systems and Quantum Phase Transitions, Quantum Chemistry and Atomic Physics, Quantum Chromodynamics, Astrophysics, Plasma Physics, Nuclear and High Energy Physics, Complex Systems: Chaos and Statistical Physics, Macroscopic Transport and Mesoscopic Methods, Biological Physics and Soft Materials, Supercomputing and Computational Physics Teaching, Computational Physics and Sustainable Energy. 
We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), IUPAP Commission on Computational Physics (C20), American Physical Society Division of Computational Physics (APS-DCOMP), Oak Ridge National Laboratory (ORNL), Center for Defect Physics (CDP), the University of Tennessee (UT)/ORNL Joint Institute for Computational Sciences (JICS) and Cray, Inc. We are grateful to the committees that helped put the conference together, especially the local organizing committee. Particular thanks are also due to a number of ORNL staff who spent long hours with the administrative details. We are pleased to express our thanks to the conference administrator Ann Strange (ORNL/CDP) for her responsive and efficient day-to-day handling of this event, Sherry Samples, Assistant Conference Administrator (ORNL), Angie Beach and the ORNL Conference Office, and Shirley Shugart (ORNL) and Fern Stooksbury (ORNL) who created and maintained the conference website. Editors: G Malcolm Stocks (ORNL) and M Claudia Troparevsky (UT) http://ccp2011.ornl.gov Chair: Dr Malcolm Stocks (ORNL) Vice Chairs: Adriana Moreo (ORNL/UT) James Guberrnatis (LANL) Local Program Committee: Don Batchelor (ORNL) Jack Dongarra (UTK/ORNL) James Hack (ORNL) Robert Harrison (ORNL) Paul Kent (ORNL) Anthony Mezzacappa (ORNL) Adriana Moreo (ORNL) Witold Nazarewicz (UT) Loukas Petridis (ORNL) David Schultz (ORNL) Bill Shelton (ORNL) Claudia Troparevsky (ORNL) Mina Yoon (ORNL) International Advisory Board Members: Joan Adler (Israel Institute of Technology, Israel) Constantia Alexandrou (University of Cyprus, Cyprus) Claudia Ambrosch-Draxl (University of Leoben, Austria) Amanda Barnard (CSIRO, Australia) Peter Borcherds (University of Birmingham, UK) Klaus Cappelle (UFABC, Brazil) Giovanni Ciccotti (Università degli Studi di Roma 'La Sapienza', Italy) Nithaya Chetty (University of Pretoria, South Africa) Charlotte Froese-Fischer (NIST, US) Giulia A. 
Galli (University of California, Davis, US) Gillian Gehring (University of Sheffield, UK) Guang-Yu Guo (National Taiwan University, Taiwan) Sharon Hammes-Schiffer (Penn State, US) Alex Hansen (Norwegian University of Science and Technology, Norway) Duane D. Johnson (University of Illinois at Urbana-Champaign, US) David Landau (University of Georgia, US) Joaquin Marro (University of Granada, Spain) Richard Martin (UIUC, US) Todd Martinez (Stanford University, US) Bill McCurdy (Lawrence Berkeley National Laboratory, US) Ingrid Mertig (Martin Luther University, Germany) Alejandro Muramatsu (Universität Stuttgart, Germany) Richard Needs (Cavendish Laboratory, UK) Giuseppina Orlandini (University of Trento, Italy) Martin Savage (University of Washington, US) Thomas Schulthess (ETH, Switzerland) Dzidka Szotek (Daresbury Laboratory, UK) Hideaki Takabe (Osaka University, Japan) William M. Tang (Princeton University, US) James Vary (Iowa State, US) Enge Wang (Chinese Academy of Science, China) Jian-Guo Wang (Institute of Applied Physics and Computational Mathematics, China) Jian-Sheng Wang (National University of Singapore, Singapore) Dan Wei (Tsinghua University, China) Tony Williams (University of Adelaide, Australia) Rudy Zeller (Jülich, Germany) Conference Administrator: Ann Strange (ORNL)

  11. Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikkel, D. J.; McCabe, J.

    This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems, and the industrial partner, the National Center for Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for the metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.

  12. Earth and Space Science Ph.D. Class of 2003 Report released

    NASA Astrophysics Data System (ADS)

    Keelor, Brad

    AGU and the American Geological Institute (AGI) released on 26 July an employment study of 180 Earth and space science Ph.D. recipients who received degrees from U.S. universities in 2003. The AGU/AGI survey asked graduates about their education and employment, efforts to find their first job after graduation, and experiences in graduate school. Key results from the study include: The vast majority (87%) of 2003 graduates found work in the Earth and space sciences, earning salaries commensurate with or slightly higher than 2001 and 2002 salary averages. Most (64%) graduates were employed within academia (including postdoctoral appointments), with the remainder in government (19%), industry (10%), and other (7%) sectors. Most graduates were positive about their employment situation and found that their work was challenging, relevant, and appropriate for someone with a Ph.D. The percentage of Ph.D. recipients accepting postdoctoral positions (58%) increased slightly from 2002. In contrast, the fields of physics and chemistry showed significant increases in postdoctoral appointments for Ph.D.s during the same time period. As in previous years, recipients of Ph.D.s in the Earth, atmospheric, and ocean sciences (median age of 32.7 years) are slightly older than Ph.D. recipients in most other natural sciences (except computer sciences), which is attributed to time taken off between undergraduate and graduate studies. Women in the Earth, atmospheric, and ocean sciences earned 33% of Ph.D.s in the class of 2003, surpassing the percentage of Ph.D.s earned by women in chemistry (32%) and well ahead of the percentage in computer sciences (20%), physics (19%), and engineering (17%). Participation of other underrepresented groups in the Earth, atmospheric, and ocean sciences remained extremely low.

  13. The SCEC Community Modeling Environment (SCEC/CME) - An Overview of its Architecture and Current Capabilities

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, B.; Moore, R.; Kesselman, C.; SCEC ITR Collaboration

    2004-12-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the Southern California Earthquake Center Community Modeling Environment (CME) under a five-year grant from the National Science Foundation's Information Technology Research (ITR) Program, jointly funded by the Geosciences and Computer and Information Science & Engineering Directorates. The CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems. During the Project's first three years, we have performed fundamental geophysical and information technology research and have also developed substantial system capabilities, software tools, and data collections that can help scientists perform systems-level earthquake science. The CME system provides collaborative tools to facilitate distributed research and development. These collaborative tools are primarily communication tools, providing researchers with access to information in ways that are convenient and useful. The CME system provides collaborators with access to significant computing and storage resources. The computing resources of the Project include in-house servers, Project allocations on the USC High Performance Computing Linux Cluster, as well as allocations on NPACI supercomputers and the TeraGrid. The CME system provides access to SCEC community geophysical models such as the Community Velocity Model, Community Fault Model, Community Crustal Motion Model, and the Community Block Model. The organizations that develop these models often provide direct access to them, so it is not necessary to use the CME system to access these models. 
However, in some cases, the CME system supplements the SCEC community models with utility codes that make it easier to use or access these models. In some cases, the CME system also provides alternatives to the SCEC community models. The CME system hosts a collection of community geophysical software codes. These codes include seismic hazard analysis (SHA) programs developed by the SCEC/USGS OpenSHA group. Also, the CME system hosts anelastic wave propagation codes including Kim Olsen's Finite Difference code and Carnegie Mellon's Hercules Finite Element tool chain. The CME system can execute a workflow, that is, a series of geophysical computations using the output of one processing step as the input to a subsequent step. Our workflow capability utilizes grid-based computing software that can submit calculations to a pool of computing resources, as well as data management tools that help us maintain an association between data files and metadata descriptions of those files. The CME system maintains, and provides access to, a collection of valuable geophysical data sets. The current CME Digital Library holdings include a collection of 60 ground motion simulation results calculated by a SCEC/PEER working group and a collection of Green's functions calculated for 33 TriNet broadband receiver sites in the Los Angeles area.
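The workflow model described above, in which each processing step consumes the output of the previous step, can be sketched minimally. The step names and functions below are hypothetical illustrations of that chaining structure, not the CME system's actual codes:

```python
def run_workflow(steps, initial_input):
    """Execute a chain of processing steps: the output of one step
    becomes the input of the subsequent step."""
    data = initial_input
    for name, step in steps:
        data = step(data)
    return data

# Hypothetical stand-ins for geophysical stages
# (velocity mesh -> wave propagation -> ground-motion metric).
pipeline = [
    ("build_velocity_mesh", lambda region: {"mesh": f"mesh({region})"}),
    ("propagate_waves",     lambda m: {"seismograms": f"awp({m['mesh']})"}),
    ("peak_ground_motion",  lambda s: f"pgm({s['seismograms']})"),
]
result = run_workflow(pipeline, "LA-basin")
```

In a grid-based system such as the one described, each step would be a job submission with files and metadata tracked between steps rather than an in-process function call, but the chaining structure is the same.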

  14. Chemistry in the News: 1998 Nobel Prizes in Chemistry and Medicine

    NASA Astrophysics Data System (ADS)

    Miller, Jennifer B.

    1999-01-01

    The Royal Swedish Academy of Sciences has awarded the 1998 Nobel Prize in Chemistry to Walter Kohn (University of California at Santa Barbara) for his development of density-functional theory and to John A. Pople (Northwestern University at Evanston, Illinois) for his development of computational methods in quantum chemistry. The Nobel Assembly at the Karolinska Institute has awarded the 1998 Nobel Prize in Physiology or Medicine jointly to Robert F. Furchgott (State University of New York Health Science Center at Brooklyn), Louis J. Ignarro (University of California at Los Angeles), and Ferid Murad (University of Texas Medical School at Houston) for identifying nitric oxide as a key biological signaling molecule in the cardiovascular system.

  15. Computational Study of Breathing-type Processes in Driven, Confined, Granular Alignments

    DTIC Science & Technology

    2012-04-17

    Government of India, Title: “Newton’s cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier,” June 29, 2011. 2. Physics Seminar, Indian Institute of Science, Bangalore, India, Title: “Newton’s cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier,” June 30, 2011. 3. Physics Department Colloquium, SUNY Buffalo, Title: “Newton’s cradle, Fermi, Pasta, Ulam chain & the nonlinear many body frontier,” January 20, 2011. 4

  16. Learning to Obtain Reward, but Not Avoid Punishment, Is Affected by Presence of PTSD Symptoms in Male Veterans: Empirical Data and Computational Model

    DTIC Science & Technology

    2013-08-27

    University of New Jersey, Newark, New Jersey, United States of America, 3 Department of Psychology, Rutgers, The State University of New Jersey...United States of America, 5 Marcs Institute for Brain and Behaviour & School of Social Sciences and Psychology, University of Western Sydney, Sydney...for current, severe PTSD symptoms (PTSS) were tested on a probabilistic classification task [19] that interleaves reward learning and punishment

  17. Shake, Rattle and Roles: Lessons from Experimental Earthquake Engineering for Incorporating Remote Users in Large-Scale E-Science Experiments

    DTIC Science & Technology

    2007-01-01

    Mechanical Turk: Artificial Artificial Intelligence. Retrieved May 15, 2006 from http://www.mturk.com/mturk/welcome Atkins, D. E., Droegemeier, K. K...Turk (Amazon, 2006) site goes beyond volunteers and pays people to do Human Intelligence Tasks, those that are difficult for computers but relatively...geographically distributed scientific collaboration, and the use of videogame technology for training. Address: U.S. Army Research Institute, 2511 Jefferson

  18. 78 FR 32259 - National Institute of Environmental Health Sciences; Amended Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Institute of Environmental Health Sciences; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the National Institute of Environmental Health Sciences Special Emphasis Panel, July 15, 2013, 8:00 a...

  19. An immersed boundary method for modeling a dirty geometry data

    NASA Astrophysics Data System (ADS)

    Onishi, Keiji; Tsubokura, Makoto

    2017-11-01

    We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible, high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum via an axial linear projection and by an approximate-domain assumption that satisfies mass conservation in the cells containing the wall. The methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate flow around a rotating object and demonstrate the applicability of the methodology to moving-geometry problems. The methodology shows promise as a way to obtain quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.

  20. Biomolecular computers with multiple restriction enzymes.

    PubMed

    Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz

    2017-01-01

    The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann "bottleneck". Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro's group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases.
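The nondeterministic finite automata realized by these biomolecular computers can be mirrored in software by tracking the set of currently reachable states. The transition table below is an arbitrary two-state, two-symbol illustration of the model, not one of the automata from the paper:

```python
def nfa_accepts(transitions, start, accepting, word):
    """Run a nondeterministic finite automaton: keep the set of all states
    reachable after each input symbol; accept if a final state is reachable."""
    states = {start}
    for symbol in word:
        # Nondeterminism: follow every transition from every current state.
        states = {t for s in states for t in transitions.get((s, symbol), set())}
        if not states:  # every computation branch has died
            return False
    return bool(states & accepting)

# Illustrative two-state ("S0", "S1"), two-symbol ("a", "b") NFA that
# accepts exactly the words ending in "b".
delta = {
    ("S0", "a"): {"S0"},
    ("S0", "b"): {"S0", "S1"},
}
```

In a biomolecular realization, this branching of the state set roughly corresponds to different transition molecules acting in parallel on the many copies of the input molecule present in solution.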

  1. Cyberinfrastructure for Open Science at the Montreal Neurological Institute

    PubMed Central

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S.; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M.; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D. Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A.; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C.

    2017-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, and genomics, etc.); secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; robust workflows and quality control mechanisms ensuring transparency and consistency in best practices; long-term storage (and web access) of data, reducing loss of institutional data assets; enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and numerous modules for data filtering, summary statistics, and personalized and configurable dashboards. 
Implementing the vision of Open Science at the Montreal Neurological Institute will be a concerted undertaking that seeks to facilitate data sharing for the global research community. Our goal is to utilize the years of experience in multi-site collaborative research infrastructure to implement the technical requirements to achieve this level of public data sharing in a practical yet robust manner, in support of accelerating scientific discovery. PMID:28111547

  2. Cyberinfrastructure for Open Science at the Montreal Neurological Institute.

    PubMed

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C

    2016-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, and genomics, etc.); secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; robust workflows and quality control mechanisms ensuring transparency and consistency in best practices; long-term storage (and web access) of data, reducing loss of institutional data assets; enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and numerous modules for data filtering, summary statistics, and personalized and configurable dashboards. 
Implementing the vision of Open Science at the Montreal Neurological Institute will be a concerted undertaking that seeks to facilitate data sharing for the global research community. Our goal is to utilize the years of experience in multi-site collaborative research infrastructure to implement the technical requirements to achieve this level of public data sharing in a practical yet robust manner, in support of accelerating scientific discovery.

  3. 75 FR 71134 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; Conference Grants Review. Date: December 13, 2010. Time: 1 p.m. to 6 p.m..., Office of Scientific Review, National Institute of General Medical Sciences, National Institutes of...

  4. 76 FR 19105 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-06

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; NIGMS Legacy Community-Wide Scientific Resources. Date: April 12, 2011... Institute of General Medical Sciences, National Institutes of Health, 45 Center Drive, Room 3AN18, Bethesda...

  5. 78 FR 39741 - National Institute of General Medical Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... General Medical Sciences; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; SCORE Grant Applications. Date: July 23, 2013. Time: 8:00 a.m. to 5:00 p.m..., Office of Scientific Review, National Institute of General Medical Sciences, National Institutes of...

  6. 78 FR 70311 - National Institute of General Medical Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... General Medical Sciences; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; Review of R-13 Conference Grant Applications. Date: December 3, 2013. Time..., National Institute of General Medical Sciences, National Institutes of Health, 45 Center Drive, Room 3An.22...

  7. 76 FR 36932 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel, MBRS Score. Date: July 18-19, 2011. Time: 8 a.m. to 5 p.m. Agenda: To..., Office of Scientific Review, National Institute of General Medical Sciences, National Institutes of...

  8. 78 FR 32672 - National Institute of Environmental Health Sciences (NIEHS); Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-31

    ... Environmental Health Sciences (NIEHS); Notice of Meeting Pursuant to the NIH Reform Act of 2006 (42 U.S.C. 281 (d)(4)), notice is hereby given that the National Institute of Environmental Health Sciences (NIEHS... Popovich, National Institute of Environmental Health Sciences, Division of Extramural Research and Training...

  9. Hope or Hype? What is Next for Biofuels? (LBNL Science at the Theater)

    ScienceCinema

    Keasling, Jay; Bristow, Jim; Tringe, Susannah Green

    2017-12-09

    Science at the Theater: From the sun to your gas tank: A new breed of biofuels may help solve the global energy challenge and reduce the impact of fossil fuels on global warming. KTVU Channel 2 health and science editor John Fowler will moderate a panel of Lawrence Berkeley National Laboratory scientists who are developing ways to convert the solar energy stored in plants into liquid fuels. Jay Keasling is one of the foremost authorities in the field of synthetic biology. He is applying this research toward the production of advanced carbon-neutral biofuels that can replace gasoline on a gallon-for-gallon basis. Keasling is Berkeley Lab's Acting Deputy Director and the Chief Executive Officer of the U.S. Department of Energy's Joint BioEnergy Institute. Jim Bristow is deputy director of programs for the U.S. Department of Energy Joint Genome Institute (JGI), a national user facility in Walnut Creek, CA. He developed and implemented JGI's Community Sequencing Program, which provides large-scale DNA sequencing and analysis to advance genomics related to bioenergy and environmental characterization and cleanup. Susannah Green Tringe is a computational biologist with the U.S. Department of Energy Joint Genome Institute (JGI). She helped pioneer the field of metagenomics, a new strategy for isolating, sequencing, and characterizing DNA extracted directly from environmental samples, such as the contents of the termite gut, which yielded enzymes responsible for breakdown of wood into fuel.

  10. Out-of-equilibrium processes in suspensions of oppositely charged colloids: liquid-to-crystal nucleation and gel formation

    NASA Astrophysics Data System (ADS)

    Sanz, Eduardo

    2009-03-01

    We study the kinetics of the liquid-to-crystal transformation and of gel formation in colloidal suspensions of oppositely charged particles. We analyse, by means of both computer simulations and experiments, the evolution of a fluid quenched to a state point of the phase diagram where the most stable state is either a homogeneous crystalline solid or a solid phase in contact with a dilute gas. On the one hand, at high temperatures and high packing fractions, close to an ordered-solid/disordered-solid coexistence line, we find that the fluid-to-crystal pathway does not follow the minimum free energy route. On the other hand, a quench to a state point far from the ordered-crystal/disordered-crystal coexistence border is followed by a fluid-to-solid transition through the minimum free energy pathway. At low temperatures and packing fractions we observe that the system undergoes a gas-liquid spinodal decomposition that, at some point, arrests, giving rise to a gel-like structure. Both our simulations and experiments suggest that increasing the interaction range favors crystallization over vitrification in gel-like structures. In collaboration with Chantal Valeriani, Soft Condensed Matter, Debye Institute for Nanomaterials Science, Utrecht University, Princetonplein 5, 3584 CC Utrecht, The Netherlands and SUPA, School of Physics, University of Edinburgh, JCMB King's Buildings, Mayfield Road, Edinburgh EH9 3JZ, UK; Teun Vissers, Andrea Fortini, Mirjam E. Leunissen, and Alfons van Blaaderen, Soft Condensed Matter, Debye Institute for Nanomaterials Science, Utrecht University; Daan Frenkel, FOM Institute for Atomic and Molecular Physics, Kruislaan 407, 1098 SJ Amsterdam, The Netherlands and Department of Chemistry, University of Cambridge, Lensfield Road, CB2 1EW, Cambridge, UK; and Marjolein Dijkstra, Soft Condensed Matter, Debye Institute for Nanomaterials Science, Utrecht University.

  11. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  12. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing Thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been performed successfully. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and its scalability. Therefore, a certification protocol for large-scale boson sampling experiments is needed to complete this exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier component can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
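The classical hardness behind such superiority claims comes from the fact that boson sampling output probabilities are proportional to squared absolute values of matrix permanents, which are #P-hard to evaluate. As a small, self-contained illustration of that cost (not the certification protocol of this work), the sketch below computes a permanent with Ryser's inclusion-exclusion formula, whose O(2^n n^2) scaling is essentially the best known for exact evaluation:

```python
from itertools import combinations

def permanent(a):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over non-empty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} a[i][j]."""
    n = len(a)
    total = 0
    for r in range(1, n + 1):          # subset sizes
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:              # product over rows of row-sums on S
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10
```

The exponential cost of exact evaluation is one reason experiments at the 20-30 photon scale mentioned above strain direct classical checks, motivating indirect certificates such as the mode correlations proposed here.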

  13. Global information infrastructure.

    PubMed

    Lindberg, D A

    1994-01-01

    The High Performance Computing and Communications Program (HPCC) is a multiagency federal initiative under the leadership of the White House Office of Science and Technology Policy, established by the High Performance Computing Act of 1991. It has been assigned a critical role in supporting the international collaboration essential to science and to health care. Goals of the HPCC are to extend USA leadership in high performance computing and networking technologies; to improve technology transfer for economic competitiveness, education, and national security; and to provide a key part of the foundation for the National Information Infrastructure. The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine (NLM), recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more. These efforts will build upon the NLM's extensive outreach program and other initiatives, including the Unified Medical Language System (UMLS), MEDLARS, and Grateful Med. New Internet search tools are emerging, such as Gopher and 'Knowbots'. Medicine will succeed in developing future intelligent agents to assist in utilizing computer networks. Our ability to serve patients is so often restricted by lack of information and knowledge at the time and place of medical decision-making. The new technologies, properly employed, will also greatly enhance our ability to serve the patient.

  14. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates, and visiting university faculty. RIACS is chartered to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  15. 42 CFR 65a.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... NATIONAL INSTITUTE OF ENVIRONMENTAL HEALTH SCIENCES HAZARDOUS SUBSTANCES BASIC RESEARCH AND TRAINING GRANTS... of the National Institute of Environmental Health Sciences, or the Director's delegate. HHS means the... of Environmental Health Sciences, an organizational component of the National Institutes of Health...

  16. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project’s toolkit. Feedback was very positive on the project’s toolkit and the value of such automatic workflow documentation to the scientific endeavor.

  17. With Great Measurements Come Great Results

    NASA Astrophysics Data System (ADS)

    Williams, Carl

    Measurements are the foundation for science and modern life. Technologies we take for granted every day depend on them: cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can and cannot make reliably. At the National Institute of Standards and Technology (NIST) we specialize in making world class measurements that an incredibly wide range of industries use to continually improve their products: computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. For virtually all of the key technologies developed over the last 100 years, better measurements, standards, or analysis techniques played a role in making them possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices: it's all happening in our labs now. This talk will focus on how metrology creates the future.

  18. Interdisciplinary multiinstitutional alliances in support of educational programs for health sciences librarians.

    PubMed Central

    Smith, L C

    1996-01-01

    This project responds to the need to identify the knowledge, skills, and expertise required by health sciences librarians in the future and to devise mechanisms for providing this requisite training. The approach involves interdisciplinary multiinstitutional alliances with collaborators drawn from two graduate schools of library and information science (University of Illinois at Urbana-Champaign and Indiana University) and two medical schools (University of Illinois at Chicago and Washington University). The project encompasses six specific aims: (1) investigate the evolving role of the health sciences librarian; (2) analyze existing programs of study in library and information science at all levels at Illinois and Indiana; (3) develop opportunities for practicums, internships, and residencies; (4) explore the possibilities of computing and communication technologies to enhance instruction; (5) identify mechanisms to encourage faculty and graduate students to participate in medical informatics research projects; and (6) create recruitment strategies to achieve better representation of currently underrepresented groups. The project can serve as a model for other institutions interested in regional collaboration to enhance graduate education for health sciences librarianship. PMID:8913560

  19. Institutional Goals Analyses of a Health Science Subsystem in a Statewide Higher Education System.

    ERIC Educational Resources Information Center

    Ezell, Annette Schram

    An Institutional Goals Inventory (IGI) is used to assess a health science subsystem within a Western statewide higher educational system. Institutional goals are defined as ideal conditions the institution can continuously seek to maximize or perfect. Data were collected from each college and campus responsible for health science education and for…

  20. Institute for Science Education. Institut fur die Padagogik der Naturwissenschaften an der Universitat Kiel. IPN Report-in-Brief 11. 3rd Edition.

    ERIC Educational Resources Information Center

    Blansdorf, Klaus, Ed.

    The Institut fur die Padagogik der Naturwissenschaften (IPN) is the research institute for science education, with a national function in the Federal Republic of Germany. The IPN consists of biology education, chemistry education, physics education, educational science, research methodology/statistics, and administration/general services…

  1. Double photoionization of Be-like (Be-F5+) ions

    NASA Astrophysics Data System (ADS)

    Abdel Naby, Shahin; Pindzola, Michael; Colgan, James

    2015-04-01

    The time-dependent close-coupling method is used to study the single photon double ionization of Be-like (Be - F5+) ions. Energy and angle differential cross sections are calculated to fully investigate the correlated motion of the two photoelectrons. Symmetric and antisymmetric amplitudes are presented along the isoelectronic sequence for different energy sharing of the emitted electrons. Our total double photoionization cross sections are in good agreement with available theoretical results and experimental measurements along the Be-like ions. This work was supported in part by grants from NSF and US DoE. Computational work was carried out at NERSC in Oakland, California and the National Institute for Computational Sciences in Knoxville, Tennessee.

  2. METLIN-PC: An applications-program package for problems of mathematical programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.

    1994-05-01

    The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine on IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. The METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained: the applications modules employ a single approach based on the linearization method of B.N. Pshenichnyi. Hence the name "METLIN."

  3. Infrastructure for Training and Partnerships: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media in addition to a 622 Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) Portable Batch System (PBS) computational task scheduler, and 6) two Compaq/Digital Alpha XP1000 compute servers each with 1.5 GB of RAM along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  4. 78 FR 38997 - National Institute of General Medical Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... General Medical Sciences; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel; Clinical Trials of Pain Treatment. Date: July 19, 2013. Time: 1:00 p.m. to... Institute of General Medical Sciences, National Institutes of Health, 45 Center Drive, Room 3An.18K...

  5. 77 FR 33471 - National Institute of General Medical Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... General Medical Sciences; Notice of Closed Meeting Pursuant to section 10(d) of the Federal Advisory... Sciences Special Emphasis Panel MBRS SCORE Grant Applications. Date: June 27, 2012. Time: 8:00 a.m. to 5:00..., National Institute of General Medical Sciences, National Institutes of Health, 45 Center Drive, Room 3An.18...

  6. Van Allen Probes Science Gateway: Single-Point Access to Long-Term Radiation Belt Measurements and Space Weather Nowcasting

    NASA Astrophysics Data System (ADS)

    Romeo, G.; Barnes, R. J.; Ukhorskiy, A. Y.; Sotirelis, T.; Stephens, G.

    2017-12-01

    The Science Gateway gives single-point access to over 4.5 years of comprehensive wave and particle measurements from the Van Allen Probes NASA twin-spacecraft mission. The Gateway provides a set of visualization and data analysis tools including: HTML5-based interactive visualization of high-level data products from all instrument teams in the form of: line plots, orbital content plots, dynamical energy spectra, L-shell context plots (including two-spacecraft plotting), FFT spectra of wave data, solar wind and geomagnetic indices data, etc.; download custom multi-instrument CDF data files of selected data products; publication quality plots of digital data; combined orbit predicts for mission planning and coordination including: Van Allen Probes, MMS, THEMIS, Arase (ERG), Cluster, GOES, Geotail, FIREBIRD; magnetic footpoint calculator for coordination with LEO and ground-based assets; real-time computation and processing of empirical magnetic field models - computation of magnetic ephemeris, computation of adiabatic invariants. Van Allen Probes is the first spacecraft mission to provide a nowcast of the radiation environment in the heart of the radiation belts, where the radiation levels are the highest and most dangerous for spacecraft operations. For this purpose, all instruments continuously broadcast a subset of their science data in real time. Van Allen Probes partners with four foreign institutions who operate ground stations that receive the broadcast: Korea (KASI), the Czech Republic (CAS), Argentina (CONAE), and Brazil (INPE). The SpWx broadcast is then collected at APL and delivered to the community via the Science Gateway.

  7. Jaasc Cooperation League for Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Watanabe, Jun-Ichi; JAASC Committee

    The JAASC (Japanese Astronomy, Aeronautical Science, and Space Science Cooperation) league was established in 2000 among the related institutes for education and public outreach. The participating institutes are the National Astronomical Observatory of Japan, the Institute of Space and Astronautical Science, the National Space Development Agency of Japan, the National Aerospace Laboratory of Japan, the Young Astronomers Club, the Japan Science and Technology Corporation, and the Japan Space Forum. These institutes have started several joint efforts, such as making a web site for beginners in the general public and educational materials for junior high school. This is a challenging trial for Japanese institutes to cooperate beyond the barrier of the

  8. Integrated School of Ocean Sciences: Doctoral Education in Marine Sciences in Kiel

    NASA Astrophysics Data System (ADS)

    Bergmann, Nina; Basse, Wiebke; Prigge, Enno; Schelten, Christiane; Antia, Avan

    2016-04-01

    Marine research is a dynamic thematic focus in Kiel, Germany, uniting natural scientists, economists, lawyers, philosophers, artists and computing and medical scientists in frontier research on the scientific, economic and legal aspects of the seas. The contributing institutions are Kiel University, GEOMAR Helmholtz Centre for Ocean Research Kiel, Kiel Institute for the World Economy and Muthesius University in Kiel. Marine science education in Kiel trains young scientists to investigate the role of the oceans in global change, risks arising from ocean usage and sustainable management of living and non-living marine resources. Basic fundamental research is supplemented with applied science in an international framework including partners from industry and public life. The Integrated School of Ocean Sciences (ISOS) established through the Cluster of Excellence "The Future Ocean", funded within the German Excellence Initiative, provides PhD candidates in marine sciences with interdisciplinary education outside of curricular courses. It supports the doctoral candidates through supplementary training, a framework of supervision, mentoring and mobility, the advisors through transparency and support of doctoral training in their research proposals and the contributing institutions by ensuring quality, innovation and excellence in marine doctoral education. All PhD candidates financed by the Helmholtz Research School for Ocean System Science and Technology (HOSST) and the Collaborative Research Centre 754 "Climate-biogeochemical interactions in the tropical ocean" (SFB 754) are enrolled at the ISOS and are integrated into the larger peer community. Over 150 PhD candidate members from 6 faculties form a large interdisciplinary network. At the ISOS, they sharpen their scientific profile, are challenged to think beyond their discipline and equip themselves for life after a PhD through early exposure to topics beyond research (e.g. 
social responsibility, public communication, global sustainability, etc.). The primary advisor and at least one co-advisor form an advisory committee, committing to support the candidate in two mandatory meetings per year. In contrast to other PhD programmes, ISOS emphasises an open policy with voluntary participation in all other aspects of the programme, creating a unique environment that thrives on personal involvement and maximises tangible benefits for individual PhD candidates.

  9. The impact of institutional ethics on academic health sciences library leadership: a survey of academic health sciences library directors.

    PubMed

    Tooey, Mary Joan M J; Arnold, Gretchen N

    2014-10-01

    Ethical behavior in libraries goes beyond service to users. Academic health sciences library directors may need to adhere to the ethical guidelines and rules of their institutions. Does the unique environment of an academic health center imply different ethical considerations? Do the ethical policies of institutions affect these library leaders? Do their personal ethical considerations have an impact as well? In December 2013, a survey regarding the impact of institutional ethics was sent to the director members of the Association of Academic Health Sciences Libraries. The objective was to determine the impact of institutional ethics on these leaders, whether through personal conviction or institutional imperative.

  10. The impact of institutional ethics on academic health sciences library leadership: a survey of academic health sciences library directors

    PubMed Central

    Tooey, Mary Joan (M.J.); Arnold, Gretchen N.

    2014-01-01

    Ethical behavior in libraries goes beyond service to users. Academic health sciences library directors may need to adhere to the ethical guidelines and rules of their institutions. Does the unique environment of an academic health center imply different ethical considerations? Do the ethical policies of institutions affect these library leaders? Do their personal ethical considerations have an impact as well? In December 2013, a survey regarding the impact of institutional ethics was sent to the director members of the Association of Academic Health Sciences Libraries. The objective was to determine the impact of institutional ethics on these leaders, whether through personal conviction or institutional imperative. PMID:25349542

  11. Community Capacity Building as a vital mechanism for enhancing the growth and efficacy of a sustainable scientific software ecosystem: experiences running a real-time bi-coastal "Open Science for Synthesis" Training Institute for young Earth and Environmental scientists

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.; Jones, M. B.; Bolker, B.; Lenhardt, W. C.; Hampton, S. E.; Idaszak, R.; Rebich Hespanha, S.; Ahalt, S.; Christopherson, L.

    2014-12-01

    Continuing advances in computational capabilities, access to Big Data, and virtual collaboration technologies are creating exciting new opportunities for accomplishing Earth science research at finer resolutions, with much broader scope, using powerful modeling and analytical approaches that were unachievable just a few years ago. Yet, there is a perceptible lag in the abilities of the research community to capitalize on these new possibilities, due to lacking the relevant skill-sets, especially with regards to multi-disciplinary and integrative investigations that involve active collaboration. UC Santa Barbara's National Center for Ecological Analysis and Synthesis (NCEAS), and the University of North Carolina's Renaissance Computing Institute (RENCI), were recipients of NSF OCI S2I2 "Conceptualization awards", charged with helping define the needs of the research community relative to enabling science and education through "sustained software infrastructure". Over the course of our activities, a consistent request from Earth scientists was for "better training in software that enables more effective, reproducible research." This community-based feedback led to creation of an "Open Science for Synthesis" Institute: an innovative, three-week, bi-coastal training program for early career researchers. We provided a mix of lectures, hands-on exercises, and working group experience on topics including: data discovery and preservation; code creation, management, sharing, and versioning; scientific workflow documentation and reproducibility; statistical and machine modeling techniques; virtual collaboration mechanisms; and methods for communicating scientific results. All technologies and quantitative tools presented were suitable for advancing open, collaborative, and reproducible synthesis research. 
In this talk, we will report on the lessons learned from running this ambitious training program, which involved coordinating classrooms across two remote sites and included developing original synthesis research activities as part of the course. We also report on the feedback provided by participants as to the learning approaches and topical issues they found most engaging, and why.

  12. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. 
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability at using computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
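The multiple regression described above can be illustrated with a small synthetic example. All variable names, effect sizes, and data here are hypothetical, invented for illustration; this is not the study's data. Ordinary least squares recovers the contribution of a self-efficacy predictor to a simulated "level of computer use" outcome:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 92  # illustrative: matches the survey's 92 usable responses

# Hypothetical predictors and outcome; the true coefficients (1.5, 0.05) are invented.
self_efficacy = rng.normal(0.0, 1.0, n)
teaching_experience = rng.normal(15.0, 5.0, n)
computer_use = (2.0 + 1.5 * self_efficacy + 0.05 * teaching_experience
                + rng.normal(0.0, 0.5, n))

# Ordinary least squares: design matrix with an intercept column, solved by lstsq.
X = np.column_stack([np.ones(n), self_efficacy, teaching_experience])
coef, residuals, rank, _ = np.linalg.lstsq(X, computer_use, rcond=None)
print(coef)  # [intercept, self-efficacy slope, experience slope], near the true values
```

In the study, a significant coefficient on the self-efficacy predictor is what supports the interpretation that computer use depends on perceived ability with computers.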

  13. Institute for Materials Science

    Science.gov Websites

    Los Alamos National Laboratory Institute for Materials Science (Incubate - Innovate - Integrate): an educational center within the National Security Education Center (NSEC) focused on fostering the advancement of materials science at Los Alamos National Laboratory.

  14. The challenges of developing computational physics: the case of South Africa

    NASA Astrophysics Data System (ADS)

    Salagaram, T.; Chetty, N.

    2013-08-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this in a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative impacts on research, and in commerce and industry.

  15. Radar Model of Asteroid 216 Kleopatra

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These images show several views from a radar-based computer model of asteroid 216 Kleopatra. The object, located in the main asteroid belt between Mars and Jupiter, is about 217 kilometers (135 miles) long and about 94 kilometers (58 miles) wide, or about the size of New Jersey.

    This dog bone-shaped asteroid is an apparent leftover from an ancient, violent cosmic collision. Kleopatra is one of several dozen asteroids whose coloring suggests they contain metal.

    A team of astronomers observing Kleopatra used the 305-meter (1,000-foot) telescope of the Arecibo Observatory in Puerto Rico to bounce encoded radio signals off Kleopatra. Using sophisticated computer analysis techniques, they decoded the echoes, transformed them into images, and assembled a computer model of the asteroid's shape.

    The images were obtained when Kleopatra was about 171 million kilometers (106 million miles) from Earth. This model is accurate to within about 15 kilometers (9 miles).

    The Arecibo Observatory is part of the National Astronomy and Ionosphere Center, operated by Cornell University, Ithaca, N.Y., for the National Science Foundation. The Kleopatra radar observations were supported by NASA's Office of Space Science, Washington, DC. JPL is managed for NASA by the California Institute of Technology in Pasadena.

  16. Increasing Scientific Literacy at Minority Serving Institutions Nationwide through AMS Professional Development Diversity Workshops

    NASA Astrophysics Data System (ADS)

    Brey, J. A.; Geer, I. W.; Mills, E. W.; Nugnes, K. A.; Moses, M. N.

    2011-12-01

    Increasing students' Earth science literacy, especially those at Minority Serving Institutions (MSIs), is a primary goal of the American Meteorological Society (AMS). Through the NSF-supported AMS Weather Studies and AMS Ocean Studies Diversity workshops for Historically Black Colleges and Universities, Hispanic Serving Institutions, Tribal Colleges and Universities, Alaska Native, and Native Hawaiian Serving Institutions, AMS has brought meteorology and oceanography courses to more students. These workshops trained and mentored faculty implementing AMS Weather Studies and AMS Ocean Studies. Of the 145 institutions that have participated in the AMS Weather Studies Diversity Project, reaching over 13,000 students, it was the first meteorology course offered for more than two-thirds of the institutions. As a result of the AMS Ocean Studies Diversity Project, 75 institutions have offered the course to more than 3000 students. About 50 MSIs implemented both the Weather and Ocean courses, improving the Earth science curriculum on their campuses. With the support of NSF and NASA, and a partnership with Second Nature, the organizing entity behind the American College and University Presidents' Climate Commitment (ACUPCC), the newest professional development workshop, the AMS Climate Studies Diversity Project, will recruit MSI faculty members through the vast network of Second Nature's more than 670 signatories. These workshops will begin in early summer 2012. An innovative approach to studying climate science, AMS Climate Studies explores the fundamental science of Earth's climate system and addresses the societal impacts relevant to today's students and teachers. The course utilizes resources from respected organizations, such as the IPCC, the US Global Change Research Program, NASA, and NOAA. In addition, faculty and students learn about basic climate modeling through the AMS Conceptual Energy Model. 
Following the flow of energy in a clear, simplified model from space to Earth and back sets the stage for differentiating between climate, climate variability, and climate change. The AMS Climate Studies Diversity Project will follow the successful models of the Weather and Ocean Diversity Projects. Hands-on examples, computer-based experiments, round table discussions, lectures, and conversations with scientists in the field and other experienced professors are all important parts of previous workshops, and will be complemented by previous participants' feedback. This presentation will also focus on insight gained from the results of a self-study of the long-term, successful AMS DataStreme Project, a set of precollege teacher professional development courses. AMS is excited for this new opportunity of reaching even more MSI faculty and students. The ultimate goal of the AMS is to have a geoscience concentration at MSIs throughout the nation and to greatly increase the number of minority students entering geoscience careers, including science teaching.

  17. Local knowledge, science, and institutional change: the case of desertification control in Northern China.

    PubMed

    Yang, Lihua

    2015-03-01

    This article studies the influence of local knowledge on the impact of science on institutional change in ecological and environmental management. Based on an empirical study on desertification control in 12 counties in north China, the study found the following major results: (1) although there was a cubic relationship between the extent and effect of local knowledge, local knowledge significantly influenced the impact of science on institutional change; (2) local knowledge took effect mainly through affecting formal laws and regulations, major actors, and methods of desertification control in institutional change but had no significant impact on the types of property rights; and (3) local knowledge enhanced the impact of science on the results of desertification control through affecting the impact of science on institutional change. These findings provide a reference for researchers, policy makers, and practitioners, both in China and in other regions of the world, to further explore the influence of local knowledge on the impact of science on institutional change and the roles of local knowledge or knowledge in institutional change and governance.

  18. National Board for Education Sciences 2009 Annual Report, August 2008 through June 2009. NBES 2009-6020

    ERIC Educational Resources Information Center

    National Board for Education Sciences, 2009

    2009-01-01

    On November 5, 2002, Congress passed the Education Sciences Reform Act of 2002 (ESRA), establishing the Institute of Education Sciences (IES, or the Institute) and its board of directors, the National Board for Education Sciences (NBES, or the Board). The Institute reports to Congress yearly on the condition of education in the United States. The…

  19. Semiannual Report, October 1, 1989 through March 31, 1990 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-06-01

    synchronization. We consider the performance of various synchronization protocols by deriving upper and lower bounds on optimal performance, upper bounds on time ...from universities and from industry, who have resident appointments for limited periods of time, and by consultants. Members of NASA's research staff...convergence to steady state is also being studied together with D. Gottlieb. The idea is to generalize the concept of local-time stepping by minimizing the

  20. CERT TST November 2016 Visit Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Robert Currier; Bailey, Teresa S.; Kahler, III, Albert Comstock

    2017-04-27

    The dozen-plus presentations covered the span of the Center’s activities, including experimental progress, simulations of the experiments (both for calibration and validation), UQ analysis, nuclear data impacts, status of simulation codes, methods development, computational science progress, and plans for upcoming priorities. All three institutions comprising the Center (Texas A&M, University of Colorado Boulder, and Simon Fraser University) were represented. Center-supported students not only gave two of the oral presentations, but also highlighted their research in a number of excellent posters.

  1. A Primal DPG Method Without a First Order Reformulation

    DTIC Science & Technology

    2013-05-01

    PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): University of Texas at Austin, Institute for Computational Engineering and Sciences, Austin, TX, 78712. J. Gopalakrishnan. [Figure residue: "Square domain: h and p convergence" — relative error in H1 versus number of degrees of freedom.]

  2. Strain engineering of electronic and magnetic properties of Ga2S2 nanoribbons

    NASA Astrophysics Data System (ADS)

    Wang, Bao-Ji; Li, Xiao-Hua; Zhang, Li-Wei; Wang, Guo-Dong; Ke, San-Huang

    2017-05-01

    Project supported by the National Natural Science Foundation of China (Grant Nos. 11174220 and 11374226), the Key Scientific Research Project of the Henan Institutions of Higher Learning, China (Grant No. 16A140009), the Program for Innovative Research Team of Henan Polytechnic University, China (Grant Nos. T2015-3 and T2016-2), the Doctoral Foundation of Henan Polytechnic University, China (Grant No. B2015-46), and the High-performance Grid Computing Platform of Henan Polytechnic University, China.

  3. The Design and Implementation of a Relational to Network Query Translator for a Distributed Database Management System.

    DTIC Science & Technology

    1985-12-01

    RELATIONAL TO NETWORK QUERY TRANSLATOR FOR A DISTRIBUTED DATABASE MANAGEMENT SYSTEM. Thesis, Kevin H. Mahoney, Captain, USAF, AFIT/GCS/ENG/85D-7. Presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Computer Systems.

  4. The National Cancer Institute's Physical Sciences - Oncology Network

    NASA Astrophysics Data System (ADS)

    Espey, Michael Graham

    In 2009, the NCI launched the Physical Sciences - Oncology Centers (PS-OC) initiative with 12 Centers (U54) funded through 2014. The current phase of the Program includes U54 funded Centers with the added feature of soliciting new Physical Science - Oncology Projects (PS-OP) U01 grant applications through 2017; see NCI PAR-15-021. The PS-OPs, individually and along with other PS-OPs and the Physical Sciences-Oncology Centers (PS-OCs), comprise the Physical Sciences-Oncology Network (PS-ON). The foundation of the Physical Sciences-Oncology initiative is a high-risk, high-reward program that promotes a "physical sciences perspective" of cancer and fosters the convergence of physical science and cancer research by forming transdisciplinary teams of physical scientists (e.g., physicists, mathematicians, chemists, engineers, computer scientists) and cancer researchers (e.g., cancer biologists, oncologists, pathologists) who work closely together to advance our understanding of cancer. The collaborative PS-ON structure catalyzes transformative science through increased exchange of people, ideas, and approaches. PS-ON resources are leveraged to fund Trans-Network pilot projects to enable synergy and cross-testing of experimental and/or theoretical concepts. This session will include a brief PS-ON overview followed by a strategic discussion with the APS community to exchange perspectives on the progression of trans-disciplinary physical sciences in cancer research.

  5. Democratizing data science through data science training.

    PubMed

    Van Horn, John Darrell; Fierro, Lily; Kamdar, Jeana; Gordon, Jonathan; Stewart, Crystal; Bhattrai, Avnish; Abe, Sumiko; Lei, Xiaoxiao; O'Driscoll, Caroline; Sinha, Aakanchha; Jain, Priyambada; Burns, Gully; Lerman, Kristina; Ambite, José Luis

    2018-01-01

    The biomedical sciences have experienced an explosion of data which promises to overwhelm many current practitioners. Without easy access to data science training resources, biomedical researchers may find themselves unable to wrangle their own datasets. In 2014, to address the challenges posed by such a data onslaught, the National Institutes of Health (NIH) launched the Big Data to Knowledge (BD2K) initiative. To this end, the BD2K Training Coordinating Center (TCC; bigdatau.org) was funded to facilitate both in-person and online learning, and open up the concepts of data science to the widest possible audience. Here, we describe the activities of the BD2K TCC and its focus on the construction of the Educational Resource Discovery Index (ERuDIte), which identifies, collects, describes, and organizes online data science materials from BD2K awardees, open online courses, and videos from scientific lectures and tutorials. ERuDIte now indexes over 9,500 resources. Given the richness of online training materials and the constant evolution of biomedical data science, computational methods applying information retrieval, natural language processing, and machine learning techniques are required: in effect, using data science to inform training in data science. In so doing, the TCC seeks to democratize novel insights and discoveries brought forth via large-scale data science training.

  6. Democratizing data science through data science training

    PubMed Central

    Van Horn, John Darrell; Fierro, Lily; Kamdar, Jeana; Gordon, Jonathan; Stewart, Crystal; Bhattrai, Avnish; Abe, Sumiko; Lei, Xiaoxiao; O’Driscoll, Caroline; Sinha, Aakanchha; Jain, Priyambada; Burns, Gully; Lerman, Kristina; Ambite, José Luis

    2017-01-01

    The biomedical sciences have experienced an explosion of data which promises to overwhelm many current practitioners. Without easy access to data science training resources, biomedical researchers may find themselves unable to wrangle their own datasets. In 2014, to address the challenges posed by such a data onslaught, the National Institutes of Health (NIH) launched the Big Data to Knowledge (BD2K) initiative. To this end, the BD2K Training Coordinating Center (TCC; bigdatau.org) was funded to facilitate both in-person and online learning, and open up the concepts of data science to the widest possible audience. Here, we describe the activities of the BD2K TCC and its focus on the construction of the Educational Resource Discovery Index (ERuDIte), which identifies, collects, describes, and organizes online data science materials from BD2K awardees, open online courses, and videos from scientific lectures and tutorials. ERuDIte now indexes over 9,500 resources. Given the richness of online training materials and the constant evolution of biomedical data science, computational methods applying information retrieval, natural language processing, and machine learning techniques are required: in effect, using data science to inform training in data science. In so doing, the TCC seeks to democratize novel insights and discoveries brought forth via large-scale data science training. PMID:29218890

  7. U.S. Institutional Research Productivity in Major Science Education Research Journals: Top 30 for 2000's

    ERIC Educational Resources Information Center

    Barrow, Lloyd H.; Tang, Nai-en

    2013-01-01

    Van Aalst (2010) used Google Scholar to identify the top four science education research journals: "Journal of Research in Science Teaching," "Science Education," "International Journal of Science Education," and "Journal of Science Teacher Education." U.S. institutional productivity for 2000-2009 for the…

  8. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    NASA Astrophysics Data System (ADS)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

    By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons-learned for continuing development. 
Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and for storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file-systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment, where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services. In most of our work, we provisioned compute resources using a custom application, called Wrangler. 
Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
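The cost argument above (CPU-bound workflows favor commercial clouds, I/O-bound workflows favor clusters with fast parallel file systems) can be made concrete with a toy cost model. All rates and workload figures below are hypothetical, chosen only to show how the compute, transfer, and storage terms combine:

```python
# Toy cloud cost model: total cost = compute + data transfer + storage.
# All rates and workload sizes are hypothetical illustration values.
def cloud_cost(cpu_hours, gb_transferred, gb_stored_months,
               rate_cpu=0.10, rate_transfer=0.09, rate_storage=0.025):
    """Return the total dollar cost of one workflow run."""
    return (cpu_hours * rate_cpu
            + gb_transferred * rate_transfer
            + gb_stored_months * rate_storage)

# CPU-bound workflow (periodogram-style): lots of compute, little data movement.
cpu_bound = cloud_cost(cpu_hours=1000, gb_transferred=50, gb_stored_months=10)

# I/O-bound workflow (mosaicking-style): modest compute, heavy data movement.
io_bound = cloud_cost(cpu_hours=100, gb_transferred=5000, gb_stored_months=500)

print(f"CPU-bound total: ${cpu_bound:.2f}")  # dominated by the compute term
print(f"I/O-bound total: ${io_bound:.2f}")   # dominated by transfer + storage
```

With these illustrative rates, compute accounts for over 95% of the CPU-bound run's cost, while transfer and storage account for over 95% of the I/O-bound run's cost, which is why the latter is better served by a cluster where data movement is effectively free.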

  9. 75 FR 78719 - National Institute of Environmental Health Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... Committee: National Institute of Environmental Health Sciences Special Emphasis Panel. Method Development... Environmental Health Sciences; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory... clearly unwarranted invasion of personal privacy. Name of Committee: National Institute of Environmental...

  10. JPRS Report, Science & Technology, USSR: Science & Technology Policy

    DTIC Science & Technology

    1989-12-07

    technologies. —The restoration of the biosphere and its return to an ecologically clean, healthy state; the preservation and reproduction of soils and the...and Geochemistry of Combustible Materials Institute, Casting Problems Institute, Technical Thermal Physics Institute, Gas Institute, Social and...academician, honorary director of the Institute of Geochemistry imeni A.P. Vinogradov of the Siberian Department of the USSR Academy of Sciences

  11. Big Science, Team Science, and Open Science for Neuroscience.

    PubMed

    Koch, Christof; Jones, Allan

    2016-11-02

    The Allen Institute for Brain Science is a non-profit private institution dedicated to basic brain science, with an internal organization more commonly found in large physics projects: large teams generating complete, accurate, and permanent resources for the mouse and human brain. It can also be viewed as an experiment in the sociology of neuroscience. Here we describe some of the singular differences from more academic, PI-focused institutions. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Statistics, Computation, and Modeling in Cosmology

    NASA Astrophysics Data System (ADS)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground and space based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group: advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems, including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams. 
The group will use subsampling approaches and fractional factorial designs to statistically and computationally efficiently explore the Galacticus parameter space. The group will also use the Galacticus simulations to study the relationship between the topological and physical structure of the halo merger trees and the properties of the resulting galaxies.
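The advanced Markov Chain Monte Carlo algorithms mentioned in this record build on the same accept/reject kernel as the basic Metropolis algorithm. A minimal random-walk Metropolis sketch for a one-dimensional standard normal target is shown below; this is illustrative only, and far simpler than the field-level CMB samplers such a working group would study:

```python
import math
import random

def metropolis(log_target, n_samples, x0=0.0, step=1.0, seed=42):
    """Minimal random-walk Metropolis sampler for a 1-D target density."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x))
        delta = log_target(proposal) - log_target(x)
        if delta >= 0 or rng.random() < math.exp(delta):
            x = proposal
        samples.append(x)
    return samples

# Standard normal target: log p(x) = -x^2 / 2 (up to an additive constant)
samples = metropolis(lambda x: -0.5 * x * x, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean ≈ {mean:.2f}, variance ≈ {var:.2f}")  # near 0 and 1
```

Realistic CMB applications replace the scalar state with a map of millions of pixels and the random-walk proposal with structure-exploiting moves (e.g., Gibbs or gradient-based updates), but the acceptance logic is the same.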

  13. Real-Time Mapping Spectroscopy on the Ground, in the Air, and in Space

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Allwood, A.; Chien, S.; Green, R. O.; Wettergreen, D. S.

    2016-12-01

    Real-time data interpretation can benefit both remote in situ exploration and remote sensing. Basic analyses at the sensor can monitor instrument performance and reveal invisible science phenomena in real time. This promotes situational awareness for remote robotic explorers or campaign decision makers, enabling adaptive data collection, reduced downlink requirements, and coordinated multi-instrument observations. Fast analysis is ideal for mapping spectrometers providing unambiguous, quantitative geophysical measurements. This presentation surveys recent computational advances in real-time spectroscopic analysis for Earth science and planetary exploration. Spectral analysis at the sensor enables new operations concepts that significantly improve science yield. Applications include real-time detection of fugitive greenhouse emissions by airborne monitoring, real-time cloud screening and mineralogical mapping by orbital spectrometers, and adaptive measurement by the PIXL instrument on the Mars 2020 rover. Copyright 2016 California Institute of Technology. All Rights Reserved. We acknowledge support of the US Government, NASA, the Earth Science Division and Terrestrial Ecology program.

  14. Mission leverage education: NSU/NASA innovative undergraduate model

    NASA Technical Reports Server (NTRS)

    Chaudhury, S. Raj; Shaw, Paula R. D.

    2005-01-01

    The BEST Lab (Center for Excellence in Science Education), the Center for Materials Research (CMR), and the Chemistry, Mathematics, Physics, and Computer Science (CS) Departments at Norfolk State University (NSU) joined forces to implement MiLEN(2)IUM, an innovative approach to integrate current and emerging research into the undergraduate curricula and train students in NASA-related fields. An Earth Observing System (EOS) mission was simulated in which students are educated and trained in many aspects of remote sensing: detector physics and spectroscopy; signal processing; data conditioning, analysis, and visualization; and atmospheric science. This model and its continued impact are expected to significantly enhance the quality of the Mathematics, Science, Engineering and Technology (MSET or SMET) educational experience and to inspire students from historically underrepresented groups to pursue careers in NASA-related fields. MiLEN(2)IUM will be applicable to other higher education institutions that are willing to make the commitment to this endeavor in terms of faculty interest and space.

  15. Building the biomedical data science workforce.

    PubMed

    Dunn, Michelle C; Bourne, Philip E

    2017-07-01

    This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers.

  16. Building the biomedical data science workforce

    PubMed Central

    Dunn, Michelle C.; Bourne, Philip E.

    2017-01-01

    This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH’s internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers. PMID:28715407

  17. Science Production in Germany, France, Belgium, and Luxembourg: Comparing the Contributions of Research Universities and Institutes to Science, Technology, Engineering, Mathematics, and Health

    ERIC Educational Resources Information Center

    Powell, Justin J. W.; Dusdal, Jennifer

    2017-01-01

    Charting significant growth in science production over the 20th century in four European Union member states, this neo-institutional analysis describes the development and current state of universities and research institutes that bolster Europe's position as a key region in global science. On-going internationalization and Europeanization of…

  18. Learning the Hidden Structure of Speech.

    DTIC Science & Technology

    1987-02-01

    STRUCTURE OF SPEECH. J. L. Elman and D. Zipser, February 1987, ICS Report 8701. COGNITIVE SCIENCE... INSTITUTE FOR COGNITIVE SCIENCE... Jeffrey L. Elman, David Zipser, Department of Linguistics, Institute for Cognitive Science... any purpose of the United States Government. Requests for reprints should be sent to the Institute for Cognitive Science, C-015; University of

  19. Federal role in science will grow, NSF Director predicts

    NASA Astrophysics Data System (ADS)

    Simarski, Lynn Teo

    1992-01-01

    Walter Massey, director of the National Science Foundation, recently called for a fundamental reassessment of the relationship between the federal government and research institutions. On January 15, Massey, now in his ninth month at NSF, described great changes in the government-university “partnership” since the “golden age” of the 1960s. Speaking in Washington, D.C. at a seminar of George Washington University's Center for International Science and Technology Policy, he predicted that his own term at the foundation would not be “business as usual.”Science and technology have shifted from being a peripheral concern of the government to a central policy issue, Massey said. The United States now sees science as too important to leave its agenda for scientists to set themselves. In response, the federal government is launching the initiatives of the Federal Coordinating Council for Science, Engineering, and Technology. Some of last year's FCCSET budget initiatives, spanning a number of federal agencies, dealt with math and science education, global change, and high-performance computing. Such programs “are research agenda put forth from the federal side—they are not things put forth from the [research] community,” Massey pointed out.

  20. Speaking Up For Science

    NASA Astrophysics Data System (ADS)

    Spilhaus, Fred

    2005-06-01

    The Smithsonian Institution's National Museum of Natural History in Washington D.C. is planning to show a film, "A Privileged Planet," that promotes creationism in the form of "intelligent design." The film is based on the book by Guillermo Gonzalez and Jay Wesley Richards, both affiliated with the Discovery Institute, which advocates teaching "intelligent design" as science in U.S. public schools. By associating with the Discovery Institute, the Smithsonian Institution will associate science with creationism and damage its credibility. The film is slated for airing on 23 June, unless the Smithsonian comes to its senses. Why is this important? Because the film promotes a long-term strategy of the Discovery Institute (//www.discovery.org/csc/) to replace "materialistic science" with "intelligent design." The film fosters the idea that science should include the supernatural. This is unacceptable. AGU's position is clear: creationism is not science, and AGU opposes all efforts to promote creationism as science. (The full text of the AGU position statement can be found at: //www.agu.org/sci_soc/policy/positions/evolution.shtml).

Top