Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas for K-12 computer science education, emphasizing the cognitive level of competence. The responses of 120 computer science professors to a questionnaire serve as the database. By using multidimensional scaling and cluster analysis, four competence areas for computer science education…
ERIC Educational Resources Information Center
Batey, Anne
Computers are integrated into science education when they are used as the most appropriate tool or delivery system to support the goals of science education. The goals of science education can be condensed into two general areas. One area concerns the preparation of a science-literate citizenry; the second area concerns understanding the…
Computer Science Research at Langley
NASA Technical Reports Server (NTRS)
Voigt, S. J. (Editor)
1982-01-01
A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
On Real-Time Systems Using Local Area Networks.
1987-07-01
87-35, July 1987, CS-TR-1892. On Real-Time Systems Using Local Area Networks. Shem-Tov Levi and Satish K. Tripathi, Department of Computer Science. …constraints and the clock systems that feed the time to real-time systems. A model for real-time systems based on LAN communication is presented in…
Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings
NASA Technical Reports Server (NTRS)
1992-01-01
The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, Space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2017-01-01
Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…
Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.
Process-Based Development of Competence Models to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter
2016-01-01
A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…
Studies in Mathematics, Volume 22. Studies in Computer Science.
ERIC Educational Resources Information Center
Pollack, Seymour V., Ed.
The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saffer, Shelley
2014-12-01
This is the final report of DOE award DE-SC0001132, Advanced Artificial Science, which supported the development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application in interdisciplinary areas of scientific investigation. This document describes the achievement of the project's goals and the resulting research made possible by this award.
Strategic research in the social sciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bainbridge, W.S.
1995-12-31
The federal government has identified a number of multi-agency funding initiatives for science in strategic areas, such as the initiatives on global environmental change and high performance computing, that give some role to the social sciences. Seven strategic areas for social science research with potential for federal funding are given: (1) Democratization; (2) Human Capital; (3) Administrative Science; (4) Cognitive Science; (5) High Performance Computing and Digital Libraries; (6) Human Dimensions of Environmental Change; and (7) Human Genetic Diversity. The first two are addressed in detail and the remainder as a group. 10 refs.
A Novel Coupling Pattern in Computational Science and Engineering Software
Computational science and engineering (CSE) software is written by experts in certain area(s). Due to this specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges…
Entrepreneurial Health Informatics for Computer Science and Information Systems Students
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Narula, Stuti
2014-01-01
Corporate entrepreneurship is a critical area of curricula for computer science and information systems students. However, few institutions of computer science and information systems include entrepreneurship in their curricula. This paper presents entrepreneurial health informatics as a course in a concentration of Technology Entrepreneurship at a…
Representing, Running, and Revising Mental Models: A Computational Model
ERIC Educational Resources Information Center
Friedman, Scott; Forbus, Kenneth; Sherin, Bruce
2018-01-01
People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will…
Computers in Undergraduate Science Education. Conference Proceedings.
ERIC Educational Resources Information Center
Blum, Ronald, Ed.
Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goal of contributing to the Science, Technology, Engineering, and Mathematics (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific/research paper. Then, after learning about the NASA application and the related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development.
This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.
Computer Science and Technology Publications. NBS Publications List 84.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
Correlation Educational Model in Primary Education Curriculum of Mathematics and Computer Science
ERIC Educational Resources Information Center
Macinko Kovac, Maja; Eret, Lidija
2012-01-01
This article gives insight into methodical correlation model of teaching mathematics and computer science. The model shows the way in which the related areas of computer science and mathematics can be supplemented, if it transforms the way of teaching and creates a "joint" lessons. Various didactic materials are designed, in which all…
NASA Astrophysics Data System (ADS)
Thackeray, Lynn Roy
The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in the study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens pursue careers in technologies, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical, and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences, seem to affect women's decisions in choosing an area of study and a career. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided insight into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention, and support of women in computer science departments of U.S. colleges and universities, and thereby increase the number of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.
Designing Educational Games for Computer Programming: A Holistic Framework
ERIC Educational Resources Information Center
Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios
2014-01-01
Computer science has been evolving continuously over the past decades. This evolution has brought forth new knowledge that should be incorporated, and new learning strategies that must be adopted, for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Computational Intelligence and Its Impact on Future High-Performance Engineering Systems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
1996-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computations. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the Computational intelligence area and to provide guidelines for future research.
Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter
2013-01-01
Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
ERIC Educational Resources Information Center
Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen
2007-01-01
This paper describes the Reinventing Computer Science Curriculum Project at the University of Nebraska-Lincoln. Motivated by rapid and significant changes in the information technology and computing areas, high diversity in student aptitudes, and high dropout rates, the project designed and implemented an integrated instructional/research…
ERIC Educational Resources Information Center
Science News, 1988
1988-01-01
Reviews major science news stories of 1988 as reported in the pages of Science News. Covers the areas of anthropology, astronomy, behavior, biology, biomedicine, chemistry, earth sciences, environment, food science, mathematics and computers, paleobiology, physics, science and society, space sciences, and technology. (YP)
An Annotated Partial List of Science-Related Computer Bulletin Board Systems.
ERIC Educational Resources Information Center
Journal of Student Research, 1990
1990-01-01
A list of science-related computer bulletin board systems is presented. Entries include geographic area, phone number, and a short explanation of services. Also included are the addresses and phone numbers of selected commercial services. (KR)
Institute for Computer Sciences and Technology. Annual Report FY 1986.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
Activities of the Institute for Computer Sciences and Technology (ICST) within the U.S. Department of Commerce during fiscal year 1986 are described in this annual report, which summarizes research and publications by ICST in the following areas: (1) standards and guidelines for computer security, including encryption and message authentication…
Computers in Life Science Education. Volume 5, 1988.
ERIC Educational Resources Information Center
Computers in Life Science Education, 1988
1988-01-01
Designed to serve as a means of communication among life science educators who anticipate or are currently using microcomputers as an educational tool, this volume of newsletters provides background information and practical suggestions on computer use. Over 80 articles are included. Topic areas include: (1) using a personal computer in a plant…
Cloud Computing in the Curricula of Schools of Computer Science and Information Systems
ERIC Educational Resources Information Center
Lawler, James P.
2011-01-01
The cloud continues to be a developing area of information systems. Evangelistic literature in the practitioner field indicates benefit for business firms but disruption for technology departments of the firms. Though the cloud currently is immature in methodology, this study defines a model program by which computer science and information…
1983-10-28
Computing. By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans… occurred in many separate areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now… and expert knowledge. Advances in Artificial Intelligence: mechanization of speech recognition, vision, and natural language understanding.
Topics in Computational Learning Theory and Graph Algorithms.
ERIC Educational Resources Information Center
Board, Raymond Acton
This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…
Computation, Mathematics and Logistics Department Report for Fiscal Year 1978.
1980-03-01
…storage technology. A reference library on these and related areas is now composed of two thousand documents. The most comprehensive tool available… at DTNSRDC on the CDC 6000 Computer System for a variety of applications including Navy Logistics, Library Science, Ocean Science, Contract Manage… (Library Science) Track technical documents on advanced ship design; Univ. of Virginia at Charlottesville (Ocean Science) Monitor research projects for…
Increasing the Interest of Elementary Age Students in Computer Science through Day Camps
ERIC Educational Resources Information Center
Cliburn, Dan; Weisheit, Tracey; Griffith, Jason; Jones, Matt; Rackley, Hunter; Richey, Eric; Stormer, Kevin
2004-01-01
Computer Science and related majors have seen a decrease in enrollment across the country in recent years. While there are several theories behind why this may be the case, as educators in many areas of computing and information technology, this is a trend we should attempt to reverse. While it is true that many children are "computer literate",…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational science plays a big role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCU) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one to a master's degree program and two to doctoral degree programs).
ERIC Educational Resources Information Center
Science News, 1990
1990-01-01
This is a review of important science news stories of 1990 as reported in the pages of this journal. Areas covered include anthropology, astronomy, behavior, biology, biomedicine, chemistry, computers and math, earth sciences, environment, food science, materials science, paleobiology, physics, science and society, and space sciences. (CW)
States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17
ERIC Educational Resources Information Center
Tilley-Coulson, Eve
2016-01-01
While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…
ERIC Educational Resources Information Center
Guo, Chorng-Jee, Ed.
1998-01-01
This proceedings covers the domain and content areas of learning and learners; curriculum and materials; instruction (including computer-assisted instruction); assessment and evaluation; history and philosophy of science; teacher preparation and professional development; and related areas of interest including environmental, special, health,…
Bioinformatics in high school biology curricula: a study of state science standards.
Wefer, Stephen H; Sheppard, Keith
2008-01-01
The proliferation of bioinformatics in modern biology marks a revolution in science that promises to influence science education at all levels. This study analyzed the secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%) and evolution the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students. PMID:18316818
ERIC Educational Resources Information Center
Mikulecky, Larry
Interactive computer programs, developed at Indiana University's Learning Skills Center, were designed to model effective strategies for reading biology and psychology textbooks. For each subject area, computer programs and textbook passages were used to instruct and model for students how to identify key concepts, compare and contrast concepts,…
von Arnim, Albrecht G.; Missra, Anamika
2017-01-01
Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of Genome Science and Technology in partnership with the nearby Oak Ridge National Laboratory. Here, we report outcome data that attest to the program’s effectiveness in graduating computationally enabled biologists for diverse careers. Among 77 PhD graduates since 2003, the majority came with traditional degrees in the biological sciences, yet two-thirds moved into computational or hybrid (computational–experimental) positions. We describe the curriculum of the program and how it has changed. We also summarize how the program seeks to establish cohesion between computational and experimental biologists. This type of program can respond flexibly and dynamically to unmet training needs. In conclusion, this study from a flagship, state-supported university may serve as a reference point for creating a stable, degree-granting, interdepartmental graduate program in computational biology and allied areas. PMID:29167223
Semiannual report, 1 April - 30 September 1991
NASA Technical Reports Server (NTRS)
1991-01-01
The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software for parallel computers. Research in these areas is discussed.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.
This hearing before the Senate Subcommittee on Science, Technology, and Space focuses on S. 272, the High-Performance Computing and Communications Act of 1991, a bill that provides for a coordinated federal research and development program to ensure continued U.S. leadership in this area. High-performance computing is defined as representing the…
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in a computer, computing in the Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration stimulated further reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF
Research Area 3: Mathematical Sciences: 3.4, Discrete Mathematics and Computer Science
2015-06-10
C. K. Chui and H. N. Mhaskar, "MRA contextual-recovery extension of smooth functions on manifolds," Applied and Computational Harmonic Analysis; also in 753507, International Society for Optics and Photonics, 2010.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quirk, W.J.; Canada, J.; de Vore, L.
1994-04-01
This issue highlights the Lawrence Livermore National Laboratory's 1993 accomplishments in our mission areas and core programs: economic competitiveness, national security, energy, the environment, lasers, biology and biotechnology, engineering, physics, chemistry, materials science, computers and computing, and science and math education. Secondary topics include: nonproliferation, arms control, international security, environmental remediation, and waste management.
Visualising "Junk" DNA through Bioinformatics
ERIC Educational Resources Information Center
Elwess, Nancy L.; Latourelle, Sandra M.; Cauthorn, Olivia
2005-01-01
One of the hottest areas of science today is the field in which biology, information technology, and computer science are merged into a single discipline called bioinformatics. This field enables the discovery and analysis of biological data, including nucleotide and amino acid sequences that are easily accessed through the use of computers. As…
MaRIE theory, modeling and computation roadmap executive summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lookman, Turab
The confluence of MaRIE (Matter-Radiation Interactions in Extreme) and extreme (exascale) computing timelines offers a unique opportunity for co-designing the elements of materials discovery, combining theory, high performance computing (itself co-designed by constrained optimization of hardware and software), and experiments. MaRIE's theory, modeling, and computation (TMC) roadmap efforts have paralleled 'MaRIE First Experiments' science activities in the areas of materials dynamics, irradiated materials, and complex functional materials in extreme conditions. The documents that follow this executive summary describe in detail, for each of these areas, the current state of the art, the gaps that exist, and the road map to MaRIE and beyond. Here we integrate the various elements to articulate an overarching theme related to the role and consequences of heterogeneities, which manifest as competing states in a complex energy landscape. MaRIE experiments will locate, measure, and follow the dynamical evolution of these heterogeneities. Our TMC vision spans the various pillar sciences and highlights the key theoretical and experimental challenges. We also present a theory, modeling, and computation roadmap of the path to and beyond MaRIE in each of the science areas.
Generic Software for Emulating Multiprocessor Architectures.
1985-05-01
[Report documentation page, partially recoverable: Generic Software for Emulating Multiprocessor Architectures. MIT Laboratory for Computer Science, 545 Technology Square, Cambridge, MA 02139. Keywords: computer architecture, emulation, simulation, dataflow.]
Math and science technology access and use in South Dakota public schools grades three through five
NASA Astrophysics Data System (ADS)
Schwietert, Debra L.
The development of K-12 technology standards, soon to be added to state testing of technology proficiency, and the increasing presence of computers in homes and classrooms reflect the growing importance of technology in current society. This study examined math and science teachers' responses on a survey of technology use in grades three through five in South Dakota. A researcher-developed survey instrument was used to collect data from a random sample of 100 public schools throughout South Dakota. Forced-choice and open-ended responses were recorded. Most teachers have access to computers, but they lack resources to purchase software for their content areas, especially in science. Three-fourths of teachers in this study reported multiple computers in their classrooms, and 67% reported access to labs in other areas of the school building. These numbers are lower than the national averages of 84% of teachers with computers in their classrooms and 95% with access to computers elsewhere in the building (USDOE, 2000). Almost eight out of 10 teachers cited time as a barrier to learning more about educational software. Additional barriers included lack of school funds (38%), access to relevant training (32%), personal funds (30%), and poor quality of training (7%). Teachers most often use math and science software as a supplement, with practice tutorials cited as another common use. The most common software interest for both boys and girls was math. The second most common choice for boys was science and for girls, language arts. Teachers reported no preference for either individual or group work on computers among girls or boys. Most teachers do not systematically evaluate software for gender preferences, but instead review software subjectively.
Ideas for Integrating the Microcomputer with High School Science.
ERIC Educational Resources Information Center
Podany, Zita
This report discusses how computers are being used in high school science classrooms. For this report, four high school science teachers were interviewed. The approach to science instruction described in these four interviews deals with the areas of scientific and technological literacy, making science learning fun and attractive, and stimulating…
Remote Science Operation Center research
NASA Technical Reports Server (NTRS)
Banks, P. M.
1986-01-01
Progress in the following areas is discussed: the design, planning and operation of a remote science payload operations control center; design and planning of a data link via satellite; and the design and prototyping of an advanced workstation environment for multi-media (3-D computer aided design/computer aided engineering, voice, video, text) communications and operations.
Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.
ERIC Educational Resources Information Center
Newman, Denis
This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…
Computers in Life Science Education. Volumes 1 through 4, 1984-1987.
ERIC Educational Resources Information Center
Modell, Harold, Ed.
1987-01-01
Designed to serve as a means of communication among life science educators who anticipate or are currently using microcomputers as an educational tool, these four volumes of newsletters provide background information and practical suggestions on computer use in over 80 articles. Topic areas include: (1) teaching physiology and other life sciences…
Multiscale Computation. Needs and Opportunities for BER Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheibe, Timothy D.; Smith, Jeremy C.
2015-01-01
The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.
Demystifying computer science for molecular ecologists.
Belcaid, Mahdi; Toonen, Robert J
2015-06-01
In this age of data-driven science and high-throughput biology, computational thinking is becoming an increasingly important skill for tackling both new and long-standing biological questions. However, despite its obvious importance and conspicuous integration into many areas of biology, computer science is still viewed as an obscure field that has, thus far, permeated into only a few of the biology curricula across the nation. A national survey has shown that lack of computational literacy in environmental sciences is the norm rather than the exception [Valle & Berdanier (2012) Bulletin of the Ecological Society of America, 93, 373-389]. In this article, we introduce a few important concepts in computer science, with the aim of providing a context-specific introduction for research biologists. Our goal was to help biologists understand some of the most important mainstream computational concepts to better appreciate bioinformatics methods and trade-offs that are not obvious to the uninitiated. © 2015 John Wiley & Sons Ltd.
The Computational and Neural Basis of Cognitive Control: Charted Territory and New Frontiers
ERIC Educational Resources Information Center
Botvinick, Matthew M.; Cohen, Jonathan D.
2014-01-01
Cognitive control has long been one of the most active areas of computational modeling work in cognitive science. The focus on computational models as a medium for specifying and developing theory predates the PDP books, and cognitive control was not one of the areas on which they focused. However, the framework they provided has injected work on…
NASA Technical Reports Server (NTRS)
Schulbach, Catherine H. (Editor)
2000-01-01
The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.
Computational Social Creativity.
Saunders, Rob; Bown, Oliver
2015-01-01
This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.
Frances: A Tool for Understanding Computer Architecture and Assembly Language
ERIC Educational Resources Information Center
Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh
2012-01-01
Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…
NASA Information Sciences and Human Factors Program
NASA Technical Reports Server (NTRS)
Holcomb, Lee (Editor); Hood, Ray (Editor); Montemerlo, Melvin (Editor); Sokoloski, Martin M. (Editor); Jenkins, James P. (Editor); Smith, Paul H. (Editor); Dibattista, John D. (Editor)
1988-01-01
The FY 1987 descriptions of technical accomplishments are contained for seven areas: automation and robotics, communications systems, computer sciences, controls and guidance, data systems, human factors, and sensor technology.
ERIC Educational Resources Information Center
Giannakos, Michail N.
2014-01-01
Computer Science (CS) courses comprise both Programming and Information and Communication Technology (ICT) issues; however these two areas have substantial differences, inter alia the attitudes and beliefs of the students regarding the intended learning content. In this research, factors from the Social Cognitive Theory and Unified Theory of…
Analyzing the Use of Concept Maps in Computer Science: A Systematic Mapping Study
ERIC Educational Resources Information Center
dos Santos, Vinicius; de Souza, Érica F.; Felizardo, Katia R; Vijaykumar, Nandamudi L.
2017-01-01
Context: Concept Maps (CMs) enable the creation of a schematic representation of domain knowledge. For this reason, CMs have been applied in different research areas, including Computer Science. Objective: The objective of this paper is to present the results of a systematic mapping study conducted to collect and evaluate existing research on…
Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges
ERIC Educational Resources Information Center
Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil
2017-01-01
The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…
Scientific Research in British Universities and Colleges 1969-70, Volume I, Physical Sciences.
ERIC Educational Resources Information Center
Department of Education and Science, London (England).
This annual publication (1969-1970) contains brief statements about current research in the physical sciences being conducted at British universities and colleges. Areas included are chemistry, physics, engineering, biochemistry, biometry, biophysics, physical geography, mathematics, computing science, and history and philosophy of science. (CP)
Quantum Sensors at the Intersections of Fundamental Science, Quantum Information Science & Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chattopadhyay, Swapan; Falcone, Roger; Walsworth, Ronald
Over the last twenty years, there has been a boom in quantum science - i.e., the development and exploitation of quantum systems to enable qualitatively and quantitatively new capabilities, with high-impact applications and fundamental insights that can range across all areas of science and technology.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Software Reuse Methods to Improve Technological Infrastructure for e-Science
NASA Technical Reports Server (NTRS)
Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.
2011-01-01
Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1985-01-01
Synopses are given for NASA supported work in computer science at the University of Virginia. Some areas of research include: error seeding as a testing method; knowledge representation for engineering design; analysis of faults in a multi-version software experiment; implementation of a parallel programming environment; two computer graphics systems for visualization of pressure distribution and convective density particles; task decomposition for multiple robot arms; vectorized incomplete conjugate gradient; and iterative methods for solving linear equations on the Flex/32.
Snowmass Computing Frontier: Computing for the Cosmic Frontier, Astrophysics, and Cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connolly, A.; Habib, S.; Szalay, A.
2013-11-12
This document presents (off-line) computing requirements and challenges for Cosmic Frontier science, covering the areas of data management, analysis, and simulations. We invite contributions to extend the range of covered topics and to enhance the current descriptions.
ERIC Educational Resources Information Center
Tuttle, Francis
Twenty-three instructors participated in an 8-week summer institute to develop their technical competency to teach the second year of a 2-year Technical Education Computer Science Program. Instructional material covered the following areas: (1) compiler languages and systems design, (2) cost studies, (3) business organization, (4) advanced…
ERIC Educational Resources Information Center
Florida State Community Coll. Coordinating Board, Tallahassee.
In 1987-88, the Florida State Board of Community Colleges and the Division of Vocational, Adult, and Community Education jointly conducted a review of instructional programs in computer science and data processing in order to determine needs for state policy changes and funding priorities. The process involved a review of printed resources on…
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
Activities of the Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1994-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.
Computational Exposure Science: An Emerging Discipline to ...
Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and the generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory's (NERL's) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source
NASA Technical Reports Server (NTRS)
1993-01-01
This bibliography contains 1237 annotated references to reports and journal articles of Commonwealth of Independent States (CIS) intellectual origin entered into the NASA Scientific and Technical Information System during 1992. Representative subject areas include the following: aeronautics, astronautics, chemistry and materials, engineering, geosciences, life sciences, mathematical and computer sciences, physics, social sciences, and space sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.
FY 1999 Laboratory Directed Research and Development annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
PJ Hughes
2000-06-13
A short synopsis of each project is given covering the following main areas of research and development: Atmospheric sciences; Biotechnology; Chemical and instrumentation analysis; Computer and information science; Design and manufacture engineering; Ecological science; Electronics and sensors; Experimental technology; Health protection and dosimetry; Hydrologic and geologic science; Marine sciences; Materials science; Nuclear science and engineering; Process science and engineering; Sociotechnical systems analysis; Statistics and applied mathematics; and Thermal and energy systems.
10 CFR 605.5 - The Office of Energy Research Financial Assistance Program.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Scientific Computing Staff (7) Superconducting Super Collider (8) University and Science Education Programs... appendix A of this part. (b) The Program areas are: (1) Basic Energy Sciences (2) Field Operations...
Computer Science in K-12 School Curricula of the 2lst Century: Why, What and When?
ERIC Educational Resources Information Center
Webb, Mary; Davis, Niki; Bell, Tim; Katz, Yaacov J.; Reynolds, Nicholas; Chambers, Dianne P.; Syslo, Maciej M.
2017-01-01
In this paper we have examined the position and roles of Computer Science in curricula in the light of recent calls for curriculum change and we have proposed principles and issues to consider in curriculum design as well as identifying priority areas for further research. The paper is based on discussions within and beyond the International…
ERIC Educational Resources Information Center
Kankaanpää, Irja; Isomäki, Hannakaisa
2013-01-01
This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…
A Research Program in Computer Technology. 1982 Annual Technical Report
1983-03-01
for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer
Jumpstarting Jill: Strategies to Nurture Talented Girls in Your Science Classroom
ERIC Educational Resources Information Center
Heilbronner, Nancy N.
2008-01-01
Women are making progress in many areas of science, but a gender gap still remains, especially in physics, computer science, and engineering, and at advanced levels of academic and career achievement. Today's teachers can help narrow this gap by instilling a love for science in their female students and by helping them to understand and develop…
Tracking the PhD Students' Daily Computer Use
ERIC Educational Resources Information Center
Sim, Kwong Nui; van der Meer, Jacques
2015-01-01
This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at their early, mid and final stage in doing their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…
Toward Using Games to Teach Fundamental Computer Science Concepts
ERIC Educational Resources Information Center
Edgington, Jeffrey Michael
2010-01-01
Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. …
Division of Computer Research Summary of Awards. Fiscal Year 1984.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Directorate for Mathematical and Physical Sciences.
Provided in this report are summaries of grants awarded by the National Science Foundation Division of Computer Research in fiscal year 1984. Similar areas of research are grouped (for the purposes of this report only) into these major categories: (1) computational mathematics; (2) computer systems design; (3) intelligent systems; (4) software…
ERIC Educational Resources Information Center
Snyder, Robin M.
2017-01-01
The author has attended and presented at most ASCUE meetings since 1994, and has worked professionally in research and development, industry, military, government, business, and private and public academia--moving between computer science, software engineering, and business fields at both the undergraduate and graduate level, and even running…
Texas Agricultural Science Teachers' Attitudes toward Information Technology
ERIC Educational Resources Information Center
Anderson, Ryan; Williams, Robert
2012-01-01
The researchers sought to find the Agricultural Science teachers' attitude toward five innovations (Computer-Aided Design, Record Books, E-Mail Career Development Event Registration, and World Wide Web) of information technology. The population for this study consisted of all 333 secondary Agricultural science teachers from Texas FFA Areas V and…
Sandia National Laboratories: Locations: Kauai Test Facility
ERIC Educational Resources Information Center
School Science Review, 1986
1986-01-01
Describes 26 different activities, experiments, demonstrations, and computer simulations in various topics in science. Includes instructional activities dealing with mural ecology, surface area/volume ratios, energy transfer in ecosystems, electrochemical simulations, alternating and direct current, terminal velocity, measuring the size of the…
Approaching gender parity: Women in computer science at Afghanistan's Kabul University
NASA Astrophysics Data System (ADS)
Plane, Jandelyn
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for the underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder degree completion to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations, while undergraduate computer science programs graduate on average only 25% women. Representation of women in computer science in the US is thus 50% below the university rate, but at KU it is 50% above the university rate. This mixed-methods study of KU was conducted in three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth-year CS students. The analysis of the collected data and its comparison to the literature on university and department retention and gender representation in Science, Technology, Engineering, and Mathematics, and on women's education in underdeveloped Islamic countries, illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from the countries often studied in the available literature. Few Afghan students have computers in their homes, and few have training beyond secretarial applications before considering studying CS at university.
University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and financially supported throughout their academic career, resulting in a low attrition rate from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.
ERIC Educational Resources Information Center
Bourret, Annie, Ed.; L'Homme, Marie-Claude, Ed.
A collection of essays addresses aspects of the "Language Utilities," the general term for the area of the conjunction of computer science and linguistics. The following are English translations of the titles of the articles in the collections: "Industrialization of the French Language and Its Maintenance as an Important Language of…
ERIC Educational Resources Information Center
von Arnim, Albrecht G.; Missra, Anamika
2017-01-01
Leading voices in the biological sciences have called for a transformation in graduate education leading to the PhD degree. One area commonly singled out for growth and innovation is cross-training in computational science. In 1998, the University of Tennessee (UT) founded an intercollegiate graduate program called the UT-ORNL Graduate School of…
Implications of Windowing Techniques for CAI.
ERIC Educational Resources Information Center
Heines, Jesse M.; Grinstein, Georges G.
This paper discusses the use of a technique called windowing in computer assisted instruction to allow independent control of functional areas in complex CAI displays and simultaneous display of output from a running computer program and coordinated instructional material. Two obstacles to widespread use of CAI in computer science courses are…
Community Information Centers and the Computer.
ERIC Educational Resources Information Center
Carroll, John M.; Tague, Jean M.
Two computer data bases have been developed by the Computer Science Department at the University of Western Ontario for "Information London," the local community information center. One system, called LONDON, permits Boolean searches of a file of 5,000 records describing human service agencies in the London area. The second system,…
Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy
ERIC Educational Resources Information Center
Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean
2007-01-01
In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software.
This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness. The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. 
Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. 
Similarly challenging is the coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
ERIC Educational Resources Information Center
Prange, W. Werner; Bellinghausen, Carol R.
A directory of college television courseware lists offerings in curriculum areas such as: social sciences, biology, black studies, business, mathematics, sciences, computer science, consumer protection, creative arts, drug education, ecology, engineering, humanities, physics, nursing, nutrition, religion, and vocational education, etc. Each course…
Cumulative reports and publications
NASA Technical Reports Server (NTRS)
1993-01-01
A complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports are listed. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. The major categories of the current ICASE research program are: applied and numerical mathematics, including numerical analysis and algorithm development; theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and computer science.
Operation of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.
Biotechnology Computing: Information Science for the Era of Molecular Medicine.
ERIC Educational Resources Information Center
Masys, Daniel R.
1989-01-01
The evolution from classical genetics to biotechnology, an area of research involving key macromolecules in living cells, is chronicled and the current state of biotechnology is described, noting related advances in computing and clinical medicine. (MSE)
NASA Technical Reports Server (NTRS)
Rummel, J. D.
1986-01-01
Questions and areas of study that need to be pursued in order to develop a Controlled Ecological Life Support System are posed. Research topics needing attention are grouped under various headings: ecology, genetics, plant pathology, cybernetics, chemistry, computer science, fluid dynamics, optics, and solid-state physics.
ERIC Educational Resources Information Center
School Science Review, 1986
1986-01-01
Describes activities, games, experiments, demonstrations, and computer-oriented exercises in all science areas. Topics include energy flow through a marine ecosystem, using 2,4-dichlorophenoxyethanoic acid to demonstrate translocation in plants, use of the dichotomous key, use of leaf yeasts to monitor atmospheric pollution, and others. (JN)
ERIC Educational Resources Information Center
Erickson, Judith B.; And Others
1980-01-01
Discusses patterns resulting from the monitor of science education proposals which may reflect problems or differing perceptions of NSF. Discusses these areas: proposal submissions from two-year institutions and social and behavioral scientists, trends in project content at the academic-industrial interface and in computer technology, and…
Toward using games to teach fundamental computer science concepts
NASA Astrophysics Data System (ADS)
Edgington, Jeffrey Michael
Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.
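The dissertation above does not publish its game code, but the kind of boolean-expression exercise such a game drills can be sketched in a few lines. The function name, the expression format, and the XOR example below are illustrative assumptions, not taken from the study:

```python
# Hypothetical sketch: build the truth table of a boolean expression, the
# kind of task a boolean-expression teaching game asks students to predict.
import itertools

def truth_table(expr, variables):
    """Evaluate a boolean expression string for every assignment of its variables."""
    table = {}
    for values in itertools.product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        # expr is restricted to and/or/not over the given variable names
        table[values] = eval(expr, {"__builtins__": {}}, env)
    return table

# Exclusive-or expressed with and/or/not:
table = truth_table("(a or b) and not (a and b)", ["a", "b"])
print([table[v] for v in sorted(table)])  # → [False, True, True, False]
```

A game built on this idea would generate an expression, ask the student for the output row by row, and compare the answers against the computed table.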
Enhancing Tele-robotics with Immersive Virtual Reality
2017-11-03
graduate and undergraduate students within the Digital Gaming and Simulation, Computer Science, and psychology programs have actively collaborated...investigates the use of artificial intelligence and visual computing. Numerous fields across the human-computer interaction and gaming research areas...invested in digital gaming and simulation to cognitively stimulate humans by computers, forming a $10.5B industry [1]. On the other hand, cognitive
The Effects of Gender on the Attitudes towards the Computer Assisted Instruction: A Meta-Analysis
ERIC Educational Resources Information Center
Cam, Sefika Sumeyye; Yarar, Gokhan; Toraman, Cetin; Erdamar, Gurcu Koc
2016-01-01
The idea that gender creates a difference in computer usage and computer-assisted instruction dates back to earlier years. At that time, it was thought that fields such as engineering, science, and mathematics were for males, and that this created a difference in computer usage. Nevertheless, developing technology and females becoming more…
Cloud Computing as a Core Discipline in a Technology Entrepreneurship Program
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony
2012-01-01
Education in entrepreneurship continues to be a developing area of curricula for computer science and information systems students. Entrepreneurship is enabled frequently by cloud computing methods that furnish benefits to especially medium and small-sized firms. Expanding upon an earlier foundation paper, the authors of this paper present an…
JPL basic research review. [research and advanced development
NASA Technical Reports Server (NTRS)
1977-01-01
Current status, projected goals, and results of 49 research and advanced development programs at the Jet Propulsion Laboratory are reported in abstract form. Areas of investigation include: aerodynamics and fluid mechanics, applied mathematics and computer sciences, environment protection, materials science, propulsion, electric and solar power, guidance and navigation, communication and information sciences, general physics, and chemistry.
European aerospace science and technology, 1992: A bibliography with indexes
NASA Technical Reports Server (NTRS)
1993-01-01
This bibliography contains 1916 annotated references to reports and journal articles of European intellectual origin entered into the NASA Scientific and Technical Information System during 1992. Representative subject areas include: spacecraft and aircraft design, propulsion technology, chemistry and materials, engineering and mechanics, earth and life sciences, communications, computers and mathematics, and the natural space sciences.
Biomedical wellness challenges and opportunities
NASA Astrophysics Data System (ADS)
Tangney, John F.
2012-06-01
The mission of ONR's Human and Bioengineered Systems Division is to direct, plan, foster, and encourage Science and Technology in cognitive science, computational neuroscience, bioscience and bio-mimetic technology, social/organizational science, training, human factors, and decision making as related to future Naval needs. This paper highlights current programs that contribute to future biomedical wellness needs in context of humanitarian assistance and disaster relief. ONR supports fundamental research and related technology demonstrations in several related areas, including biometrics and human activity recognition; cognitive sciences; computational neurosciences and bio-robotics; human factors, organizational design and decision research; social, cultural and behavioral modeling; and training, education and human performance. In context of a possible future with automated casualty evacuation, elements of current science and technology programs are illustrated.
Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science
NASA Astrophysics Data System (ADS)
Baru, C.
2014-12-01
Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of. In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations, potentially leading to new science insights as well as the development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists and technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to the 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors.
(4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six schools, including the predominantly African-American elementary, middle, and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle, and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between the five groups of students. ANOVA by question and by total self-efficacy score provided further statistically significant results. Additionally, factor analysis and review of the qualitative data provided further insight. Overall, the data suggest a 'clog' may exist at the middle school level, and that students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending a great deal of time on social networking websites and email, but are unaware of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agreed that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
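The group comparisons above rest on one-way ANOVA. As an illustrative sketch only (the scores and group sizes below are invented, not data from the study), the F statistic for comparing self-efficacy scores across groups can be computed as:

```python
# One-way ANOVA F statistic: between-group variance over within-group
# variance. All scores below are made-up illustrative values.
def one_way_anova_f(groups):
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total observations
    grand = sum(sum(g) for g in groups) / n           # grand mean
    means = [sum(g) / len(g) for g in groups]         # per-group means
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical self-efficacy scores for three groups:
f = one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])  # -> 3.0
```

A large F relative to the F distribution's critical value is what would justify rejecting a null hypothesis of no group difference.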
Computational provenance in hydrologic science: a snow mapping example.
Dozier, Jeff; Frew, James
2009-03-13
Computational provenance--a record of the antecedents and processing history of digital information--is key to properly documenting computer-based scientific research. To support investigations in hydrologic science, we produce the daily fractional snow-covered area from NASA's moderate-resolution imaging spectroradiometer (MODIS). From the MODIS reflectance data in seven wavelengths, we estimate the fraction of each 500 m pixel that snow covers. The daily products have data gaps and errors because of cloud cover and sensor viewing geometry, so we interpolate and smooth to produce our best estimate of the daily snow cover. To manage the data, we have developed the Earth System Science Server (ES3), a software environment for data-intensive Earth science, with unique capabilities for automatically and transparently capturing and managing the provenance of arbitrary computations. Transparent acquisition avoids the scientists having to express their computations in specific languages or schemas in order for provenance to be acquired and maintained. ES3 models provenance as relationships between processes and their input and output files. It is particularly suited to capturing the provenance of an evolving algorithm whose components span multiple languages and execution environments.
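As a minimal sketch of the idea the abstract describes (this is not ES3's actual interface; the class, function, and file names here are invented for illustration), provenance can be modeled as relationships between processes and their input and output files, so that a product's full processing history can be traced backward:

```python
# Toy provenance model: each processing step records its inputs and outputs,
# and lineage() walks back through history to find a file's antecedents.
class Process:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, list(inputs), list(outputs)

def lineage(target, processes):
    """Return the processes (most recent first) that a target file depends on."""
    ancestors = []
    for p in reversed(processes):          # walk back through the run history
        if target in p.outputs:
            ancestors.append(p.name)
            for f in p.inputs:
                ancestors.extend(lineage(f, processes))
    return ancestors

# Hypothetical snow-mapping pipeline, loosely echoing the abstract:
steps = [
    Process("estimate_fraction", ["mod09.hdf"], ["fsca_raw.tif"]),
    Process("smooth_gaps", ["fsca_raw.tif"], ["fsca_daily.tif"]),
]
history = lineage("fsca_daily.tif", steps)  # -> ["smooth_gaps", "estimate_fraction"]
```

The point of "transparent acquisition," as the abstract notes, is that a system like ES3 records these relationships automatically rather than requiring scientists to declare them in a special language or schema.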
Trends in Social Science: The Impact of Computational and Simulative Models
NASA Astrophysics Data System (ADS)
Conte, Rosaria; Paolucci, Mario; Cecconi, Federico
This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.
Abstracts of Research, July 1975-June 1976.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in computer and information science are given for 62 papers in the areas of information storage and retrieval; computer facilities; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems programming;…
Teacher's Guide to Secondary Mathematics.
ERIC Educational Resources Information Center
Duval County Schools, Jacksonville, FL.
This teacher's guide to secondary school mathematics was developed for use in the Duval County Public Schools, Jacksonville, Florida. Areas of mathematics covered are algebra, analysis, calculus, computer literacy, computer science, geometry, analytic geometry, general mathematics, consumer mathematics, pre-algebra, probability and statistics,…
Computational communities: African-American cultural capital in computer science education
NASA Astrophysics Data System (ADS)
Lachney, Michael
2017-10-01
Enrolling the cultural capital of underrepresented communities in PK-12 technology and curriculum design has been a primary strategy for broadening the participation of students of color in U.S. computer science (CS) fields. This article examines two ways that African-American cultural capital and computing can be bridged in CS education. The first is community representation, using cultural capital to highlight students' social identities and networks through computational thinking. The second, computational integration, locates computation in cultural capital itself. I survey two risks - the appearance of shallow computing and the reproduction of assimilationist logics - that may arise when constructing one bridge without the other. To avoid these risks, I introduce the concept of computational communities by exploring areas in CS education that employ both strategies. This concept is then grounded in qualitative data from an after school program that connected CS to African-American cosmetology.
Lefor, Alan T
2011-08-01
Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. Applying physics and mathematics to oncologic problems will yield new insights into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers on the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data, including mammography and chest imaging, to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communication. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
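As a toy illustration of the modeling approach the abstract mentions (this is not a model from the article), logistic growth is a common starting point for tumor-growth equations; a simple forward-Euler integration looks like:

```python
# Logistic growth dN/dt = r * N * (1 - N/K), integrated with Euler steps.
# Parameters below are arbitrary illustrative values, not clinical data.
def logistic_growth(n0, rate, capacity, dt, steps):
    n = n0
    for _ in range(steps):
        n += dt * rate * n * (1 - n / capacity)   # one Euler step
    return n

# Starting from one unit, the population approaches the carrying capacity:
final = logistic_growth(1.0, 0.5, 100.0, 0.1, 2000)
```

Refining such a baseline with patient-specific clinical data is the kind of step the abstract describes when it says models are "further refined with clinically available information."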
Science Application of Area and Ratio Concepts
ERIC Educational Resources Information Center
Horak, Virginia M.
2006-01-01
This article describes using area and ratio concepts to examine why some animals, or people wearing different types of shoes, sink into the surface on which they are standing. Students compute "sinking values" to explain these differences. (Contains 2 figures.)
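A hedged sketch of the computation the article describes (the weight and contact areas below are invented figures, and "sinking value" is modeled here simply as pressure, weight divided by contact area):

```python
# "Sinking value" as pressure: the same weight over a smaller contact
# area produces a larger value, hence deeper sinking. Units: N / cm^2.
def sinking_value(weight_newtons, contact_area_cm2):
    return weight_newtons / contact_area_cm2

# Hypothetical 700 N person on two very different footprints:
snowshoe = sinking_value(700, 2000)   # large area -> low pressure
stiletto = sinking_value(700, 2)      # tiny heel -> very high pressure
```

The ratio of the two results shows why the heel sinks into soft ground while the snowshoe does not, which is the comparison the article has students reason through.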
Emerging Science And Technologies: Securing The Nation Through Discovery and Innovation
2013-04-01
potential material for use in quantum computing and spintronics. R&D in the area of advanced carbon-based materials has the potential to revolutionize... seem to involve a dual-approach strategy. First, the vast majority of our sensory input information does not reach the level of consciousness... Relevant technology areas that support Protection of the Intelligence Enterprise include: Quantum Computing and Associated
An Ethernet Java Applet for a Course for Non-Majors.
ERIC Educational Resources Information Center
Holliday, Mark A.
1997-01-01
Details the topics of a new course that introduces computing and communication technology to students not majoring in computer science. Discusses the process of developing a Java applet (a program that can be invoked through a World Wide Web browser) that illustrates the protocol used by Ethernet local area networks to determine which computer can…
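The arbitration rule such an applet illustrates is CSMA/CD with truncated binary exponential backoff: after a collision, each station waits a random number of slot times before retrying. A minimal sketch (simplified, in slot-time units rather than seconds; not the applet's actual code):

```python
import random

# After the n-th consecutive collision, an Ethernet station waits a random
# number of slot times drawn uniformly from [0, 2**min(n, 10) - 1].
def backoff_slots(collision_count, rng=random):
    exponent = min(collision_count, 10)   # backoff is capped at 2**10 slots
    return rng.randrange(2 ** exponent)

# Two colliding stations usually pick different delays, so one transmits first:
delays = [backoff_slots(1), backoff_slots(1)]
```

Repeated collisions widen the random window, which is what lets many stations share one cable without central coordination.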
An Undergraduate Computer Engineering Option for Electrical Engineering.
ERIC Educational Resources Information Center
National Academy of Engineering, Washington, DC. Commission on Education.
This report is the result of a study, funded by the National Science Foundation, of a group constituted as the COSINE Task Force on Undergraduate Education in Computer Engineering in 1969. The group was formed in response to the growing demand for education in computer engineering and the limited opportunities for study in this area. Computer…
A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example
ERIC Educational Resources Information Center
Elnagar, Ashraf; Lulu, Leena
2007-01-01
We introduce an effective computer aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or implement practical systems in different areas of computer science such as graphics, computational…
ERIC Educational Resources Information Center
Velez-Rubio, Miguel
2013-01-01
Teaching computer programming to freshman students in Computer Science and other Information Technology areas has been identified as a complex activity. Different approaches have been studied in search of the one that best improves this teaching process. A proposed approach was implemented, based on the language immersion…
Computational thinking in life science education.
Rubinstein, Amir; Chor, Benny
2014-11-01
We join the increasing call to take computational education of life science students a step further, beyond teaching mere programming and employing existing software tools. We describe a new course, focusing on enriching the curriculum of life science students with abstract, algorithmic, and logical thinking, and exposing them to the computational "culture." The design, structure, and content of our course are influenced by recent efforts in this area, collaborations with life scientists, and our own instructional experience. Specifically, we suggest that an effective course of this nature should: (1) devote time to explicitly reflect upon computational thinking processes, resisting the temptation to drift to purely practical instruction, (2) focus on discrete notions, rather than on continuous ones, and (3) have basic programming as a prerequisite, so students need not be preoccupied with elementary programming issues. We strongly recommend that the mere use of existing bioinformatics tools and packages should not replace hands-on programming. Yet, we suggest that programming will mostly serve as a means to practice computational thinking processes. This paper deals with the challenges and considerations of such computational education for life science students. It also describes a concrete implementation of the course and encourages its use by others.
The impact of supercomputers on experimentation: A view from a national laboratory
NASA Technical Reports Server (NTRS)
Peterson, V. L.; Arnold, J. O.
1985-01-01
The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windus, Theresa; Banda, Michael; Devereaux, Thomas
Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today's fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered, while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore's law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.
Business aspects and sustainability for healthgrids - an expert survey.
Scholz, Stefan; Semler, Sebastian C; Breitner, Michael H
2009-01-01
Grid computing initiatives in medicine and life sciences are under pressure to prove their sustainability. While some first business model frameworks were outlined, few practical experiences were considered. This gap has been narrowed by an international survey of 33 grid computing experts with biomedical and non-biomedical backgrounds on business aspects. The experts surveyed were cautiously optimistic about a sustainable implementation of grid computing within a mid-term timeline. They identified marketable application areas, stated the underlying value proposition, outlined trends, and specified critical success factors. Overall, their answers provide a stable basis for a road map of sustainable grid computing solutions for medicine and the life sciences.
Abstracts of Research, July 1973 through June 1974.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in the fields of computer and information science are given; 72 papers are abstracted in the areas of information storage and retrieval, information processing, linguistic analysis, artificial intelligence, mathematical techniques, systems programming, and computer networks. In addition, the Ohio State University…
NASA Technical Reports Server (NTRS)
Davis, M. H. (Editor); Singy, A. (Editor)
1994-01-01
The Universities Space Research Association (USRA) was incorporated 25 years ago in the District of Columbia as a private nonprofit corporation under the auspices of the National Academy of Sciences. Institutional membership in the association has grown from 49 colleges and universities, when it was founded, to 76 in 1993. USRA provides a mechanism through which universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology and to promote education in these areas. Its mission is carried out through the institutes, centers, divisions, and programs that are described in detail in this booklet. These include the Lunar and Planetary Institute, the Institute for Computer Applications in Science and Engineering (ICASE), the Research Institute for Advanced Computer Science (RIACS), and the Center of Excellence in Space Data and Information Sciences (CESDIS).
Let's Use Cognitive Science to Create Collaborative Workstations.
Reicher, Murray A; Wolfe, Jeremy M
2016-05-01
When informed by an understanding of cognitive science, radiologists' workstations could become collaborative to improve radiologists' performance and job satisfaction. The authors review relevant literature and present several promising areas of research, including image toggling, eye tracking, cognitive computing, intelligently restricted messaging, work habit tracking, and innovative input devices. The authors call for more research in "perceptual design," a promising field that can complement advances in computer-aided detection. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
This document contains the transcript of three hearings on the High Performance Computing and High Speed Networking Applications Act of 1993 (H.R. 1757). The hearings were designed to obtain specific suggestions for improvements to the legislation and alternative or additional application areas that should be pursued. Testimony and prepared…
BioSIGHT: Interactive Visualization Modules for Science Education
NASA Technical Reports Server (NTRS)
Wong, Wee Ling
1998-01-01
Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches towards the goal of enabling human space exploration that requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application for space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations.
The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, educational software designer, computer programmers as well as IMSC graduate and undergraduate students. Our collaborators include TERC, a research and education organization with extensive k-12 math and science curricula development from Cambridge, MA.; SRI International of Menlo Park, CA.; teachers and students from local area high schools (Newbury Park High School, USC's Family of Five schools, Chadwick School, and Pasadena Polytechnic High School).
XXV IUPAP Conference on Computational Physics (CCP2013): Preface
NASA Astrophysics Data System (ADS)
2014-05-01
XXV IUPAP Conference on Computational Physics (CCP2013) was held 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by bringing together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America, and Asia-Pacific. The conferences are traditionally supported by the European Physical Society and the American Physical Society. This year the Conference host was the Landau Institute for Theoretical Physics. The Conference contained 142 presentations, including 11 plenary talks with comprehensive reviews on topics from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, and the RSC Group company. Further conference information and images from the conference are available in the PDF.
NASA Advanced Supercomputing Facility Expansion
NASA Technical Reports Server (NTRS)
Thigpen, William W.
2017-01-01
The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.
NASA Astrophysics Data System (ADS)
Joseph, Dolly Rebecca Doran
The playing of computer games is one of the most popular non-school activities of children, particularly boys, and is often the entry point to greater facility with and use of other computer applications. Children are learning skills as they play, but what they learn often does not generalize beyond application to that and other similar games. Nevertheless, games have the potential to develop in students the knowledge and skills described by national and state educational standards. This study focuses upon middle-school aged children, and how they react to and respond to computer games designed for entertainment and educational purposes, within the context of science learning. Through qualitative, case study methodology, the game play, evaluation, and modification experiences of four diverse middle-school-aged students in summer camps are analyzed. The inquiry focused on determining the attributes of computer games that appeal to middle school students, the aspects of science that appeal to middle school children, and ultimately, how science games might be designed to appeal to middle school children. Qualitative data analysis led to the development of a method for describing players' activity modes during game play, rather than the conventional methods that describe game characteristics. These activity modes are used to describe the game design preferences of the participants. Recommendations are also made in the areas of functional, aesthetic, and character design and for the design of educational games. Middle school students may find the topical areas of forensics, medicine, and the environment to be of most interest; designing games in and across these topic areas has the potential for encouraging voluntary science-related play. Finally, when including children in game evaluation and game design activities, results suggest the value of providing multiple types of activities in order to encourage the full participation of all children.
"Have Them Read a Good Book": Enriching the Scientific and Technical Writing Curriculum.
ERIC Educational Resources Information Center
Miles, Thomas H.
1989-01-01
Lists approximately 200 recent science and technology book titles (some with annotations). Notes that this literature acquaints students with the history of science and technology and helps them understand debated philosophical issues. Includes the following subject areas: anthropology; chemistry; computers and artificial intelligence; ecology;…
ERIC Educational Resources Information Center
Twaddell, Freeman
A brief analysis and definition of general linguistics focuses on distinct areas of study within the science, including descriptive, historical, comparative, and computational linguistics. Other branches of the science discussed are psycholinguistics, sociolinguistics, and dialectology. Technical concepts encountered in the literature are also…
Research briefing on contemporary problems in plasma science
NASA Technical Reports Server (NTRS)
1991-01-01
An overview is presented of the broad perspective of all plasma science, with detailed discussions of scientific opportunities in its various subdisciplines. The first subdiscipline discussed is low temperature plasma science, the area where contemporary applications of plasma science are most widespread. Opportunities for new research and technology development that have emerged as byproducts of research in magnetic and inertial fusion are then highlighted, followed by new opportunities in ultrafast plasma science opened up by recent developments in laser and particle beam technology. Next, research that uses smaller-scale facilities is considered: first non-neutral plasmas, and then basic plasma experiments. Discussions of analytic theory and computational plasma physics and of space and astrophysical plasma physics are then presented.
ERIC Educational Resources Information Center
Tataw, Oben Moses
2013-01-01
Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…
Commentary: Ubiquitous Computing Revisited--A New Perspective
ERIC Educational Resources Information Center
Bull, Glen; Garofalo, Joe
2006-01-01
In 2002, representatives from the teacher educator associations representing the core content areas (science, mathematics, language arts, and social studies) and educational technology met at the National Technology Leadership Retreat (NTLR) to discuss potential implications of ubiquitous computing for K-12 schools. This paper re-examines some of…
Computational Understanding: Analysis of Sentences and Context
1974-05-01
Computer Science Department, Stanford, California. ...these is the need for programs that can respond in useful ways to information expressed in a natural language. However a computational understanding...buying structure because "Mary" appears where it does. But the time for analysis was rarely over five seconds of computer time, when the Lisp program
Enlist micros: Training science teachers to use microcomputers
NASA Astrophysics Data System (ADS)
Baird, William E.; Ellis, James D.; Kuerbis, Paul J.
A National Science Foundation grant to the Biological Sciences Curriculum Study (BSCS) at The Colorado College supported the design and production of training materials to encourage literacy of science teachers in the use of microcomputers. ENLIST Micros is based on results of a national needs assessment that identified 22 competencies needed by K-12 science teachers to use microcomputers for instruction. A writing team developed the 16-hour training program in the summer of 1985, and field-test coordinators tested it with 18 preservice or in-service groups during the 1985-86 academic year at 15 sites within the United States. The training materials consist of video programs, interactive computer disks for the Apple II series microcomputer, a training manual for participants, and a guide for the group leader. The experimental materials address major areas of educational computing: awareness, applications, implementation, evaluation, and resources. Each chapter contains activities developed for this program, such as viewing video segments of science teachers who are using computers effectively and running commercial science and training courseware. Role playing and small-group interaction help the teachers overcome their reluctance to use computers and plan for effective implementation of microcomputers in the school. This study examines the implementation of educational computing among 47 science teachers who completed the ENLIST Micros training at a southern university. We present results of formative evaluation for that site. Results indicate that both elementary and secondary teachers benefit from the training program and demonstrate gains in attitudes toward computer use. Participating teachers said that the program met its stated objectives and helped them obtain needed skills. Only 33 percent of these teachers, however, reported using computers one year after the training.
In June 1986, the BSCS initiated a follow-up to the ENLIST Micros curriculum to develop, evaluate, and disseminate a complete model of teacher enhancement for educational computing in the sciences. In that project, we use the ENLIST Micros curriculum as the first step in a training process. The project includes seminars that introduce additional skills; it contains provisions for sharing among participants, monitors use of computers in participants' classrooms, provides structured coaching of participants' use of computers in their classrooms, and offers planned observations of peers using computers in their science teaching.
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Xiaoqing; Deng, Z. T.
2009-11-10
This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU has successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational science related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Lab (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for the electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU made a great impact on minority students.
As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, respectively, and were presented at those conferences. It is critical to continue these research and education activities.
Structural biology computing: Lessons for the biomedical research sciences.
Morin, Andrew; Sliz, Piotr
2013-11-01
The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.
A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Davis, M. H.
1989-01-01
A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.
Bioinformatics for Exploration
NASA Technical Reports Server (NTRS)
Johnson, Kathy A.
2006-01-01
For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.
Life sciences and environmental sciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-02-01
The DOE laboratories play a unique role in bringing multidisciplinary talents -- in biology, physics, chemistry, computer sciences, and engineering -- to bear on major problems in the life and environmental sciences. Specifically, the laboratories utilize these talents to fulfill OHER's mission of exploring and mitigating the health and environmental effects of energy use, and of developing health and medical applications of nuclear energy-related phenomena. At Lawrence Berkeley Laboratory (LBL) support of this mission is evident across the spectrum of OHER-sponsored research, especially in the broad areas of genomics, structural biology, basic cell and molecular biology, carcinogenesis, energy and environment, applications to biotechnology, and molecular, nuclear and radiation medicine. These research areas are briefly described.
NASA Technical Reports Server (NTRS)
1994-01-01
CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.
The CAN Microcluster: Parallel Processing over the Controller Area Network
ERIC Educational Resources Information Center
Kuban, Paul A.; Ragade, Rammohan K.
2005-01-01
Most electrical engineering and computer science undergraduate programs include at least one course on microcontrollers and assembly language programming. Some departments offer legacy courses in C programming, but few include C programming from an embedded systems perspective, where it is still regularly used. Distributed computing and parallel…
The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.
ERIC Educational Resources Information Center
Morton, Herbert C.; Price, Anne Jamieson
1986-01-01
Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)
On the design of computer-based models for integrated environmental science.
McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick
2005-06-01
The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.
An urban area minority outreach program for K-6 children in space science
NASA Astrophysics Data System (ADS)
Morris, P.; Garza, O.; Lindstrom, M.; Allen, J.; Wooten, J.; Sumners, C.; Obot, V.
The Houston area has minority populations with significant school dropout rates. This is similar to other major cities in the United States and elsewhere in the world where there are significant minority populations from rural areas. The student dropout rates are associated in many instances with the absence of educational support opportunities either from the school and/or from the family. This is exacerbated if the student has poor English language skills. To address this issue, a NASA minority university initiative enabled us to develop a broad-based outreach program that includes younger children and their parents at a primarily Hispanic inner city charter school. The program at the charter school was initiated by teaching computer skills to the older children, who in turn taught parents. The older children were subsequently asked to help teach a computer literacy class for mothers with 4-5 year old children. The computers initially intimidated the mothers as most had limited educational backgrounds and English language skills. To practice their newly acquired computer skills and learn about space science, the mothers and their children were asked to pick a space project and investigate it using their computer skills. The mothers and their children decided to learn about black holes. The project included designing space suits for their children so that they could travel through space and observe black holes from a closer proximity. The children and their mothers learned about computers and how to use them for educational purposes. In addition, they learned about black holes and the importance of space suits in protecting astronauts as they investigated space. The parents are proud of their children and their achievements. By including the parents in the program, they have a greater understanding of the importance of their children staying in school and the opportunities for careers in space science and technology.
For more information on our overall program, the charter school and their other space science related activities, visit their web site, http://www.tccc-ryss.org/solarsys/solarmingrant.htm
Expanding the Scope of High-Performance Computing Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uram, Thomas D.; Papka, Michael E.
The high-performance computing centers of the future will expand their roles as service providers, and as the machines scale up, so should the sizes of the communities they serve. National facilities must cultivate their users as much as they focus on operating machines reliably. The authors present five interrelated topic areas that are essential to expanding the value provided to those performing computational science.
ERIC Educational Resources Information Center
Spence, Michelle; Mawhinney, Tara; Barsky, Eugene
2012-01-01
Science and engineering libraries have an important role to play in preserving the intellectual content in research areas of the departments they serve. This study employs bibliographic data from the Web of Science database to examine how much research material is required to cover 90% of faculty citations in civil engineering and computer…
NASA Technical Reports Server (NTRS)
Hussaini, M. Y. (Editor); Kumar, A. (Editor); Salas, M. D. (Editor)
1993-01-01
The purpose here is to assess the state of the art in the areas of numerical analysis that are particularly relevant to computational fluid dynamics (CFD), to identify promising new developments in various areas of numerical analysis that will impact CFD, and to establish a long-term perspective focusing on opportunities and needs. Overviews are given of discretization schemes, computational fluid dynamics, algorithmic trends in CFD for aerospace flow field calculations, simulation of compressible viscous flow, and massively parallel computation. Also discussed are acceleration methods, spectral and high-order methods, multi-resolution and subcell resolution schemes, and inherently multidimensional schemes.
Large-scale visualization projects for teaching software engineering.
Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel
2012-01-01
The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, Reinhold C.
This is the first formal progress report issued by the ORNL Life Sciences Division. It covers the period from February 1997 through December 1998, which has been critical in the formation of our new division. The legacy of 50 years of excellence in biological research at ORNL has been an important driver for everyone in the division to do their part so that this new research division can realize the potential it has to make seminal contributions to the life sciences for years to come. This reporting period is characterized by intense assessment and planning efforts. They included thorough scrutiny of our strengths and weaknesses, analyses of our situation with respect to comparative research organizations, and identification of major thrust areas leading to core research efforts that take advantage of our special facilities and expertise. Our goal is to develop significant research and development (R&D) programs in selected important areas to which we can make significant contributions by combining our distinctive expertise and resources in the biological sciences with those in the physical, engineering, and computational sciences. Significant facilities in mouse genomics, mass spectrometry, neutron science, bioanalytical technologies, and high performance computing are critical to the success of our programs. Research and development efforts in the division are organized in six sections. These cluster into two broad areas of R&D: systems biology and technology applications. The systems biology part of the division encompasses our core biological research programs. It includes the Mammalian Genetics and Development Section, the Biochemistry and Biophysics Section, and the Computational Biosciences Section. The technology applications part of the division encompasses the Assessment Technology Section, the Environmental Technology Section, and the Toxicology and Risk Analysis Section. These sections are the stewards of the division's core competencies.
The common mission of the division is to advance science and technology to understand complex biological systems and their relationship with human health and the environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-03-01
Abstracts of papers published during the previous calendar year, arranged in accordance with the project titles used in the USDOE Schedule 189 Budget Proposals, are presented. The collection of abstracts supplements the listing of papers published in the Schedule 189. The following subject areas are represented: high-energy physics; nuclear physics; basic energy sciences (nuclear science, materials sciences, solid state physics, materials chemistry); molecular, mathematical, and earth sciences (fundamental interactions, processes and techniques, mathematical and computer sciences); environmental research and development; physical and technological studies (characterization, measurement and monitoring); and nuclear research and applications.
CSBB: synthetic biology research at Newcastle University.
Goñi-Moreno, Angel; Wipat, Anil; Krasnogor, Natalio
2017-06-15
The Centre for Synthetic Biology and the Bioeconomy (CSBB) brings together a far-reaching multidisciplinary community across all Newcastle University's faculties - Medical Sciences, Science, Agriculture and Engineering, and Humanities, Arts and Social Sciences. The CSBB focuses on many different areas of Synthetic Biology, including bioprocessing, computational design and in vivo computation, as well as improving understanding of basic molecular machinery. Such breadth is supported by major national and international research funding, a range of industrial partners in the North East of England and beyond, as well as a large number of doctoral and post-doctoral researchers. The CSBB trains the next generation of scientists through a 1-year MSc in Synthetic Biology. © 2017 The Author(s).
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2006-01-01
High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62-teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.
ERIC Educational Resources Information Center
Govindasamy, Malliga K.
2014-01-01
Agent technology has become one of the dynamic and most interesting areas of computer science in recent years. The dynamism of this technology has resulted in computer generated characters, known as pedagogical agent, entering the digital learning environments in increasing numbers. Commonly deployed in implementing tutoring strategies, these…
NASA Astrophysics Data System (ADS)
Kuzenov, V. V.
2017-12-01
The paper is devoted to the theoretical and computational study of compression and energy release for magneto-inertial plasma confinement. This approach makes it possible to create new high-density plasma sources, apply them in materials science experiments, and use them in promising areas of power engineering.
Computers in Bilingual Education: Project CIBE. Evaluation Section Report. OREA Reports.
ERIC Educational Resources Information Center
Berney, Tomi D.; Alvarez, Rosalyn
This project provided 360 students at South Bronx High School (New York) with instruction in English as a Second Language (ESL); Native Language Arts (NLA); the bilingual content area subjects of mathematics, science, and social studies; and computer literacy. The goal of the project was to provide instructional and support services to…
Development of Web-Based Examination System Using Open Source Programming Model
ERIC Educational Resources Information Center
Abass, Olalere A.; Olajide, Samuel A.; Samuel, Babafemi O.
2017-01-01
The traditional method of assessment (examination) is often characterized by examination question leakages and human errors during the marking of scripts and recording of scores. Technological advancement in the field of computer science has created the need for computer usage in nearly all areas of human life and endeavor, the education sector…
ERIC Educational Resources Information Center
Tsompanoudi, Despina; Satratzemi, Maya; Xinogalos, Stelios
2016-01-01
The results presented in this paper contribute to research on two different areas of teaching methods: distributed pair programming (DPP) and computer-supported collaborative learning (CSCL). An evaluation study of a DPP system that supports collaboration scripts was conducted over one semester of a computer science course. Seventy-four students…
ERIC Educational Resources Information Center
Patterson, Brian F.; Packman, Sheryl; Kobrin, Jennifer L.
2011-01-01
The purpose of this study was to examine the effects of Advanced Placement[R] (AP[R]) exam participation and performance on college grades for courses taken in the same subject area as students' AP Exam(s). Students' first-year college subject area grade point averages (SGPAs) were examined in nine subject areas: mathematics, computer science,…
NASA Technical Reports Server (NTRS)
1982-01-01
The state-of-the-art of multispectral sensing is reviewed and recommendations for future research and development are proposed. Specifically, two generic sensor concepts are discussed. One is the multispectral pushbroom sensor utilizing linear array technology, which operates in six spectral bands including two in the SWIR region and incorporates capabilities for stereo and crosstrack pointing. The second concept is the imaging spectrometer (IS), which incorporates a dispersive element and area arrays to provide both spectral and spatial information simultaneously. Other key technology areas included very large scale integration and the computer aided design of these devices.
Center for Building Science: Annual report, FY 1986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cairns, E.J.; Rosenfeld, A.H.
1987-05-01
The Center for Building Science consists of four programs in the Applied Science Division: energy analysis, buildings energy systems, windows and lighting, and indoor environment. It was established to provide an umbrella so that groups in different programs but with similar interests could combine to perform joint research, develop new research areas, share resources, and produce joint publications. As detailed below, potential savings for U.S. society from energy-efficient buildings are enormous. But these savings can only be realized through an expanding federal R&D program that develops expertise in this new area. The Center for Building Science develops efficient new building components, computer models, data and information systems, and trains needed building scientists. 135 refs., 72 figs., 18 tabs.
Command and data handling of science signals on Spacelab
NASA Technical Reports Server (NTRS)
Mccain, H. G.
1975-01-01
The Orbiter Avionics and the Spacelab Command and Data Management System (CDMS) combine to provide a relatively complete command, control, and data handling service to the instrument complement during a Shuttle Sortie Mission. The Spacelab CDMS services the instruments and the Orbiter in turn services the Spacelab. The CDMS computer system includes three computers, two I/O units, a mass memory, and a variable number of remote acquisition units. Attention is given to the CDMS high rate multiplexer, CDMS tape recorders, closed circuit television for the visual monitoring of payload bay and cabin area activities, methods of science data acquisition, questions of transmission and recording, CDMS experiment computer usage, and experiment electronics.
Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.
2015-01-01
With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454
Modeling biological problems in computer science: a case study in genome assembly.
Medvedev, Paul
2018-01-30
As computer scientists working in bioinformatics/computational biology, we often face the challenge of coming up with an algorithm to answer a biological question. This occurs in many areas, such as variant calling, alignment and assembly. In this tutorial, we use the example of the genome assembly problem to demonstrate how to go from a question in the biological realm to a solution in the computer science realm. We show the modeling process step-by-step, including all the intermediate failed attempts. Please note this is not an introduction to how genome assembly algorithms work and, if treated as such, would be incomplete and unnecessarily long-winded.
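Medvedev's step-by-step modeling is not reproduced in this record, but the flavor of the problem can be sketched: one standard computer-science formulation of assembly builds a de Bruijn graph whose nodes are (k-1)-mers and whose edges are the k-mers observed in reads. A minimal Python sketch, not the paper's own algorithm; the reads and k are illustrative:

```python
from collections import defaultdict

def de_bruijn_graph(reads, k):
    """Build a de Bruijn graph: nodes are (k-1)-mers, edges are k-mers."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            # Edge from the k-mer's prefix to its suffix
            graph[kmer[:-1]].append(kmer[1:])
    return graph

# Two overlapping reads from a toy circular genome
reads = ["ACGTA", "CGTAC"]
graph = de_bruijn_graph(reads, k=3)
print(sorted(graph.items()))
```

An assembler would then seek an Eulerian path through such a graph; the tutorial's point is precisely that arriving at a clean formulation like this takes several failed modeling attempts.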
PREFACE: International Conference on Applied Sciences 2015 (ICAS2015)
NASA Astrophysics Data System (ADS)
Lemle, Ludovic Dan; Jiang, Yiwen
2016-02-01
The International Conference on Applied Sciences ICAS2015 took place in Wuhan, China, on June 3-5, 2015, at the Military Economics Academy of Wuhan. The conference is organized regularly, alternately in Romania and in P.R. China, by Politehnica University of Timişoara, Romania, and the Military Economics Academy of Wuhan, P.R. China, with the joint aims of serving as a platform for the exchange of information among various areas of the applied sciences and of promoting communication between scientists of different nations, countries, and continents. The topics of the conference cover a comprehensive spectrum of issues: (1) Economic Sciences and Defense: Management Sciences, Business Management, Financial Management, Logistics, Human Resources, Crisis Management, Risk Management, Quality Control, Analysis and Prediction, Government Expenditure, Computational Methods in Economics, Military Sciences, National Security, and others; (2) Fundamental Sciences and Engineering: Interdisciplinary applications of physics, Numerical approximation and analysis, Computational Methods in Engineering, Metallic Materials, Composite Materials, Metal Alloys, Metallurgy, Heat Transfer, Mechanical Engineering, Mechatronics, Reliability, Electrical Engineering, Circuits and Systems, Signal Processing, Software Engineering, Databases, Modeling and Simulation, and others. The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge with applicability potential in Engineering, Economics, Defense, etc. The number of participants was 120, from 11 countries (China, Romania, Taiwan, Korea, Denmark, France, Italy, Spain, USA, Jamaica, and Bosnia and Herzegovina). During the three days of the conference, four invited and 67 oral talks were delivered. Based on the work presented at the conference, 38 selected papers have been included in this volume of IOP Conference Series: Materials Science and Engineering.
These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computers Engineering, and Electrical Engineering. It's our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields.
ERIC Educational Resources Information Center
Mobray, Deborah, Ed.
Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…
NASA Technical Reports Server (NTRS)
Atkinson, R. C.
1974-01-01
The results are presented of a project of research and development on strategies for optimizing the instructional process, and dissemination of information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.
Problem Solving with General Semantics.
ERIC Educational Resources Information Center
Hewson, David
1996-01-01
Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)
ERIC Educational Resources Information Center
Nogami, Glenda Y., Ed.; And Others
The 21 summaries of research programs, funded by the United States Army Research Institute (ARI) for the Behavioral and Social Sciences which are presented are grouped in five broad topic areas: computer-based systems; information processing; learning, memory and transfer; human relations; and related issues and trends. Papers presented include:…
The Technical Information Library: TIB
NASA Technical Reports Server (NTRS)
Rosemann, Uwe
1994-01-01
The Technische Informationsbibliothek Hannover (TIB) is the German national central library for all areas of technology and related sciences, especially chemistry, computer science, mathematics, and physics. The TIB acquires and makes available a comprehensive collection of conventional and non-conventional literature, especially foreign material, with particular emphasis on specialized new publications which are difficult to obtain or in difficult languages.
Dr. Caleb Phillips is a data scientist with the Computational Science Center at NREL. His work applies GIS-based methods and lidar techniques to the problem of large-area coverage mapping for wireless networks (Statistical Analysis and Data Mining: The ASA Data Science Journal, 2017).
Code of Federal Regulations, 2010 CFR
2010-04-01
... course of study in which the major area of concentration was actuarial science, or (2) Received a... mathematics, statistics, or computer science, and shall have successfully completed at least 6 semester hours or 9 quarter hours of courses in life contingencies at an accredited college or university. (e...
Code of Federal Regulations, 2011 CFR
2011-04-01
... course of study in which the major area of concentration was actuarial science, or (2) Received a... mathematics, statistics, or computer science, and shall have successfully completed at least 6 semester hours or 9 quarter hours of courses in life contingencies at an accredited college or university. (e...
NASA Astrophysics Data System (ADS)
Stolyarov, I. V.
2017-01-01
The author of this article manages the project and research activity of students in the areas of computer science, physics, engineering and biology, drawing on experience acquired in these fields. Students regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013 A. Makarychev received the "Small Nobel prize" in the Computer Science section and a special award from the sponsoring company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have resulted in patents for inventions and registration certificates from ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering and medicine.
New & Special Grad School Programs.
ERIC Educational Resources Information Center
Ross, Steven S.
1988-01-01
Discusses some special Master of Science in engineering (MS) programs including manufacturing and quality control, safety engineering, transportation engineering, and computer related areas. Gives a table showing MS degrees, institutions, and faculty. (YP)
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
An Overview of High Performance Computing and Challenges for the Future
Google Tech Talks
2017-12-09
In this talk we examine how high performance computing has changed over the last ten years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms are needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, programming methodology, and tools for parallel computers.
His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.
EMSL Geochemistry, Biogeochemistry and Subsurface Science-Science Theme Advisory Panel Meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Gordon E.; Chaka, Anne; Shuh, David K.
2011-08-01
This report covers the topics of discussion and the recommendations of the panel members. On December 8 and 9, 2010, the Geochemistry, Biogeochemistry, and Subsurface Science (GBSS) Science Theme Advisory Panel (STAP) convened for a more in-depth exploration of the five Science Theme focus areas developed at a similar meeting held in 2009. The goal for the fiscal year (FY) 2011 meeting was to identify potential topical areas for science campaigns, necessary experimental development needs, and scientific members for potential research teams. After a review of the current science in each of the five focus areas, the 2010 STAP discussions successfully led to the identification of one well focused campaign idea in pore-scale modeling and five longer-term potential research campaign ideas that would likely require additional workshops to identify specific research thrusts. These five campaign areas can be grouped into two categories: (1) the application of advanced high-resolution, high mass accuracy experimental techniques to elucidate the interplay between geochemistry and microbial communities in terrestrial ecosystems and (2) coupled computational/experimental investigations of the electron transfer reactions either between mineral surfaces and outer membranes of microbial cells or between the outer and inner membranes of microbial cells.
Introduction to Library Public Services. Sixth Edition. Library and Information Science Text Series.
ERIC Educational Resources Information Center
Evans, G. Edward; Amodeo, Anthony J.; Carter, Thomas L.
This book covers the role, purpose, and philosophy related to each of the major functional areas of library public service. This sixth edition, on the presumption that most people know the basic facts about computer hardware, does not include the chapter (in the previous edition) on computer basics, and instead integrates specific technological…
A Case Study of Educational Computer Game Design by Middle School Students
ERIC Educational Resources Information Center
An, Yun-Jo
2016-01-01
Only a limited number of research studies have investigated how students design educational computer games and its impact on student learning. In addition, most studies on educational game design by students were conducted in the areas of mathematics and science. Using the qualitative case study approach, this study explored how seventh graders…
Applications of Computer Graphics in Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
Various applications of interactive computer graphics to the following areas of science and engineering were described: design and analysis of structures, configuration geometry, animation, flutter analysis, design and manufacturing, aircraft design and integration, wind tunnel data analysis, architecture and construction, flight simulation, hydrodynamics, curve and surface fitting, gas turbine engine design, analysis, and manufacturing, packaging of printed circuit boards, spacecraft design.
Nature-Computer Camp. Final Evaluation Report 1984-1985. E.C.I.A. Chapter 2.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.
This report presents a description and evaluation of the Nature-Computer Camp (NCC), a science- and technology-oriented program for sixth-grade students from the District of Columbia Public Schools. The NCC experience is designed to offer students opportunities in such environmentally-related areas as woodland ecology, stream ecology, geology, as…
A Digital Simulation Program for Health Science Students to Follow Drug Levels in the Body
ERIC Educational Resources Information Center
Stavchansky, Salomon; And Others
1977-01-01
The Raytheon Scientific Simulation Language (RSSL) program, an easily used simulation on the CDC/6600 computer at the University of Texas at Austin, offers a simple method of solving differential equations on a digital computer. It is used by undergraduate biopharmaceutics-pharmacokinetics students and graduate students in all areas. (Author/LBH)
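RSSL itself is not shown in this record; as an illustration of the underlying idea (numerically following a drug level governed by a differential equation), here is a sketch of Euler integration of first-order elimination, dC/dt = -kC, in Python. All constants and names are hypothetical, not drawn from the course materials:

```python
import math

def simulate_drug_level(c0, k_elim, dt, n_steps):
    """Euler integration of first-order elimination: dC/dt = -k*C."""
    c = c0
    levels = [c]
    for _ in range(n_steps):
        c += dt * (-k_elim * c)  # one Euler step
        levels.append(c)
    return levels

# 100 units initial dose, elimination rate 0.1/hr, 0.01 hr steps for 10 hr
levels = simulate_drug_level(c0=100.0, k_elim=0.1, dt=0.01, n_steps=1000)
c_final = levels[-1]
exact = 100.0 * math.exp(-0.1 * 10.0)  # closed-form solution for comparison
print(round(c_final, 2), round(exact, 2))
```

With a small step size the numerical trajectory tracks the closed-form exponential closely, which is the behavior such simulation languages let students observe without solving the equations analytically.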
PREFACE: High Performance Computing Symposium 2011
NASA Astrophysics Data System (ADS)
Talon, Suzanne; Mousseau, Normand; Peslherbe, Gilles; Bertrand, François; Gauthier, Pierre; Kadem, Lyes; Moitessier, Nicolas; Rouleau, Guy; Wittig, Rod
2012-02-01
HPCS (High Performance Computing Symposium) is a multidisciplinary conference that focuses on research involving High Performance Computing and its application. Attended by Canadian and international experts and renowned researchers in the sciences, all areas of engineering, the applied sciences, medicine and life sciences, mathematics, the humanities and social sciences, it is Canada's pre-eminent forum for HPC. The 25th edition was held in Montréal, at the Université du Québec à Montréal, from 15-17 June and focused on HPC in Medical Science. The conference was preceded by tutorials held at Concordia University, where 56 participants learned about HPC best practices, GPU computing, parallel computing, debugging and a number of high-level languages. 274 participants from six countries attended the main conference, which involved 11 invited and 37 contributed oral presentations, 33 posters, and an exhibit hall with 16 booths from our sponsors. The work that follows is a collection of papers presented at the conference covering HPC topics ranging from computer science to bioinformatics. They are divided here into four sections: HPC in Engineering, Physics and Materials Science; HPC in Medical Science; HPC Enabling to Explore our World; and New Algorithms for HPC. We would once more like to thank the participants and invited speakers, the members of the Scientific Committee, the referees who spent time reviewing the papers, and our invaluable sponsors. To hear the invited talks and learn about 25 years of HPC development in Canada visit the Symposium website: http://2011.hpcs.ca/lang/en/conference/keynote-speakers/ Enjoy the excellent papers that follow, and we look forward to seeing you in Vancouver for HPCS 2012! Gilles Peslherbe, Chair of the Scientific Committee; Normand Mousseau, Co-Chair of HPCS 2011; Suzanne Talon, Chair of the Organizing Committee, UQAM. The PDF also contains photographs from the conference banquet.
NASA Astrophysics Data System (ADS)
Bell, Peter M.
Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation, a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982). The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km².
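Prospector's actual inference networks are far richer than any short example, but the core mechanism reported in the expert-systems literature is Bayesian odds updating: each piece of evidence multiplies the hypothesis's odds by a sufficiency factor (LS) when present or a necessity factor (LN) when absent. A toy sketch under that assumption; the rules and numbers below are invented for illustration, not Prospector's:

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1.0 + o)

def update(prior_p, evidence):
    """Multiply prior odds by LS for each observed piece of evidence,
    by LN for each piece found absent."""
    o = odds(prior_p)
    for present, ls, ln in evidence:
        o *= ls if present else ln
    return prob(o)

# Hypothetical rules for "porphyry deposit likely":
# (evidence observed?, LS if present, LN if absent)
evidence = [(True, 5.0, 0.2),    # favorable host rock mapped
            (True, 3.0, 0.5),    # geochemical anomaly detected
            (False, 2.0, 0.8)]   # no alteration halo observed
posterior = update(prior_p=0.1, evidence=evidence)
print(round(posterior, 3))
```

Chaining such rule nodes, with one rule's conclusion serving as evidence for the next, yields the kind of inference network the abstract describes.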
Integrating Health Information Systems into a Database Course: A Case Study
ERIC Educational Resources Information Center
Anderson, Nicole; Zhang, Mingrui; McMaster, Kirby
2011-01-01
Computer Science is a rich field with many growing application areas, such as Health Information Systems. What we suggest here is that multi-disciplinary threads can be introduced to supplement, enhance, and strengthen the primary area of study in a course. We call these supplementary materials "threads," because they are executed…
1996 Laboratory directed research and development annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, C.E.; Harvey, C.L.; Lopez-Andreas, L.M.
This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 1996. In addition to a programmatic and financial overview, the report includes progress reports from 259 individual R&D projects in seventeen categories. The general areas of research include: engineered processes and materials; computational and information sciences; microelectronics and photonics; engineering sciences; pulsed power; advanced manufacturing technologies; biomedical engineering; energy and environmental science and technology; advanced information technologies; counterproliferation; advanced transportation; national security technology; electronics technologies; idea exploration and exploitation; production; and science at the interfaces - engineering with atoms.
Multimission image processing and science data visualization
NASA Technical Reports Server (NTRS)
Green, William B.
1993-01-01
The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area and the current computer system used to generate these data products. The objectives of a system upgrade now in progress are described. The design approach to development of the new system is reviewed, including the use of the Unix operating system and X Window display standards to provide platform independence, portability, and modularity within the new system. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.
Exploratory Research and Development Fund, FY 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-05-01
The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.
Knowledge Discovery from Climate Data using Graph-Based Methods
NASA Astrophysics Data System (ADS)
Steinhaeuser, K.
2012-04-01
Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three paradigms being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is in the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
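The project's specific graph-based methods are not detailed in the abstract; one common construction in this literature links grid locations whose climate time series are strongly correlated, then analyzes the resulting network. A small Python sketch, with thresholds and data invented for illustration:

```python
from itertools import combinations

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def correlation_graph(series, threshold):
    """Edge between two locations when |Pearson r| meets the threshold."""
    edges = []
    for a, b in combinations(sorted(series), 2):
        if abs(pearson(series[a], series[b])) >= threshold:
            edges.append((a, b))
    return edges

# Toy monthly anomalies at three hypothetical grid cells
series = {"cell_A": [1.0, 2.0, 3.0, 4.0],
          "cell_B": [2.1, 3.9, 6.2, 7.8],   # tracks cell_A
          "cell_C": [5.0, 1.0, 4.0, 2.0]}   # unrelated
print(correlation_graph(series, threshold=0.9))
```

Graph properties such as communities or hubs in networks built this way have been used to identify teleconnections between distant regions of the climate system.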
NASA Technical Reports Server (NTRS)
Moore, Robert C.
1998-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. RIACS is chartered to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) automated reasoning, (2) human-centered computing, and (3) high performance computing and networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.
A crisis in the NASA space and earth sciences programme
NASA Technical Reports Server (NTRS)
Lanzerotti, Louis, J.; Rosendhal, Jeffrey D.; Black, David C.; Baker, D. James; Banks, Peter M.; Bretherton, Francis; Brown, Robert A.; Burke, Kevin C.; Burns, Joseph A.; Canizares, Claude R.
1987-01-01
Problems in the space and earth science programs are examined. Changes in the research environment and requirements for the space and earth sciences, for example from small Explorer missions to multispacecraft missions, have been observed. The need to expand the computational capabilities for space and earth sciences is discussed. The effects of fluctuations in funding, program delays, the limited number of space flights, and the development of the Space Station on research in the areas of astronomy and astrophysics, planetary exploration, solar and space physics, and earth science are analyzed. The recommendations of the Space and Earth Science Advisory Committee on the development and maintenance of effective space and earth sciences programs are described.
Information Sciences Assessment for Asia and Australasia
2009-10-16
entertainment and home services - Machine Translation for international cooperation - NLU + Affective Computing for education - Intelligent Optimization for...into an emotion. ETTS, embedded Mandarin, music retrieval. Also, research in areas of computer graphics, digital media processing Intelligent...many from outside China, 40% in phase 2 Sales volume in 2007 130 * 100 million RMB SAP (1st), CITI, AIG, EDS, Capgemini, ILOG, Infosys, HCL, Sony
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
The 98 papers in this collection examine a wide variety of topics related to the latest technological developments as they apply to the educational process. Papers are grouped to reflect common, broad areas of interest, representing the instructional, administrative, and computer science divisions of the Association for Educational Data Systems…
A 21st Century Science, Technology, and Innovation Strategy for America's National Security
2016-05-01
areas. Advanced Computing and Communications The exponential growth of the digital economy, driven by ubiquitous computing and communication...weapons- focused R&D, many of the capabilities being developed have significant dual-use potential. Digital connectivity, for instance, brings...scale than traditional recombinant DNA techniques, and to share these designs digitally . Nanotechnology promises the ability to engineer entirely
González-Nilo, Fernando; Pérez-Acle, Tomás; Guínez-Molinos, Sergio; Geraldo, Daniela A; Sandoval, Claudia; Yévenes, Alejandro; Santos, Leonardo S; Laurie, V Felipe; Mendoza, Hegaly; Cachau, Raúl E
2011-01-01
After the progress made during the genomics era, bioinformatics was tasked with supporting the flow of information generated by nanobiotechnology efforts. This challenge requires adapting classical bioinformatic and computational chemistry tools to store, standardize, analyze, and visualize nanobiotechnological information. Thus, old and new bioinformatic and computational chemistry tools have been merged into a new sub-discipline: nanoinformatics. This review takes a second look at the development of this new and exciting area as seen from the perspective of the evolution of nanobiotechnology applied to the life sciences. The knowledge obtained at the nano-scale level implies answers to new questions and the development of new concepts in different fields. The rapid convergence of technologies around nanobiotechnologies has spun off collaborative networks and web platforms created for sharing and discussing the knowledge generated in nanobiotechnology. The implementation of new database schemes suitable for storage, processing and integrating physical, chemical, and biological properties of nanoparticles will be a key element in achieving the promises in this convergent field. In this work, we will review some applications of nanobiotechnology to life sciences in generating new requirements for diverse scientific fields, such as bioinformatics and computational chemistry.
Map-Based Querying for Multimedia Database
2014-09-01
existing assets in a custom multimedia database based on an area of interest. It also describes the augmentation of an Android Tactical Assault Kit (ATAK)...for Multimedia Database Somiya Metu Computational and Information Sciences Directorate, ARL
Atmospheric Science Data Center
2013-04-18
... on the right. This quantity is retrieved using an automated computer algorithm that takes advantage of MISR's multi-angle capability. Areas ... NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Science Mission Directorate, Washington, D.C. The Terra spacecraft is managed ...
Evolution, learning, and cognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Y.C.
1988-01-01
The book comprises more than fifteen articles in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithms, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.
1988-03-01
Mechanism; Computer Security. ...denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and...recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1992-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and the ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.
A Research Agenda and Vision for Data Science
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2014-12-01
Big Data has emerged as a first-class citizen in the research community, spanning disciplines in the domain sciences. Astronomy is pushing velocity with new ground-based instruments such as the Square Kilometre Array (SKA) and its unprecedented data rates (700 TB/sec!); Earth science is pushing the boundaries of volume, with experiments for the Intergovernmental Panel on Climate Change (IPCC) and the climate modeling and remote sensing communities increasing total archive sizes into the Exabyte scale; airborne missions from NASA such as the JPL Airborne Snow Observatory (ASO) are increasing velocity and decreasing the overall turnaround time required to receive products and make them available to water managers and decision makers. Proteomics and the computational biology community are sequencing genomes and providing near-real-time answers to clinicians, researchers, and ultimately to patients, helping to process, understand, and create diagnoses. Data complexity is on the rise, and the norm is no longer hundreds of metadata attributes but thousands to hundreds of thousands, including complex interrelationships between data, metadata, and knowledge. I published a vision for data science in Nature in 2013 that encapsulates four thrust areas and foci that I believe the computer science, Big Data, and data science communities need to attack over the next decade to make fundamental progress on the data volume, velocity, and complexity challenges arising from the domain sciences such as those described above. These areas include: (1) rapid and unobtrusive algorithm integration; (2) intelligent and automatic data movement; (3) automated and rapid extraction of text, metadata, and language from heterogeneous file formats; and (4) participation and people power via open source communities.
In this talk I will revisit these four areas and describe current progress; future work and challenges ahead as we move forward in this exciting age of Data Science.
Applications of multigrid software in the atmospheric sciences
NASA Technical Reports Server (NTRS)
Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.
1992-01-01
Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
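MUDPACK itself is a Fortran library, but the idea it implements can be sketched compactly. The following is a hedged, minimal recursive V-cycle for the 1D model problem -u'' = f with zero Dirichlet boundaries (weighted-Jacobi smoothing, full-weighting restriction, linear-interpolation prolongation); it illustrates the multigrid principle and is not MUDPACK's actual interface.

```python
import numpy as np

def apply_A(u, h):
    # Matrix-free 1D Poisson operator: (-u[i-1] + 2u[i] - u[i+1]) / h^2,
    # with zero Dirichlet values just outside the interior points.
    up = np.pad(u, 1)
    return (-up[:-2] + 2 * up[1:-1] - up[2:]) / h**2

def weighted_jacobi(u, f, h, sweeps=3, w=2/3):
    # Damped Jacobi smoothing; the diagonal of A is 2/h^2.
    for _ in range(sweeps):
        u = u + w * (f - apply_A(u, h)) * h**2 / 2.0
    return u

def restrict(r):
    # Full weighting onto the coarse grid (fine size 2m+1 -> coarse size m).
    return (r[:-2:2] + 2 * r[1:-1:2] + r[2::2]) / 4.0

def prolong(e):
    # Linear interpolation of the coarse correction back to the fine grid.
    ef = np.zeros(2 * len(e) + 1)
    ef[1::2] = e
    ef[2:-1:2] = (e[:-1] + e[1:]) / 2.0
    ef[0], ef[-1] = e[0] / 2.0, e[-1] / 2.0
    return ef

def v_cycle(u, f, h):
    if len(u) <= 3:
        return weighted_jacobi(u, f, h, sweeps=50)  # near-exact coarse solve
    u = weighted_jacobi(u, f, h)                    # pre-smooth
    e = v_cycle(np.zeros((len(u) - 1) // 2),
                restrict(f - apply_A(u, h)), 2 * h)  # coarse-grid correction
    return weighted_jacobi(u + prolong(e), f, h)     # correct, post-smooth
```

Each V-cycle costs O(n) work yet cuts the residual by a roughly grid-independent factor, which is why the advantage over elimination and single-grid relaxation grows quickly with grid size, as the abstract notes.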
ERIC Educational Resources Information Center
Riskin, Steve R.
This paper discusses the results of an experimental, non-traditional university class in sociology in which students produced an interactive multimedia module in a social science subject area using a computer system that allowed instant access to film, sound, television, images, and text. There were no constraints on the selection of media, or the…
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is majoring in computer science and chemistry at Rocky Mountain College in Billings, Montana. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
Scientific Discovery through Advanced Computing in Plasma Science
NASA Astrophysics Data System (ADS)
Tang, William
2005-03-01
Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st century. For example, the Department of Energy's "Scientific Discovery through Advanced Computing" (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by combining rapid advances in supercomputing technology with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high-temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations, with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources and innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range of time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs).
A good example is the effective use of the full power of multi-teraflop (multi-trillion floating-point computations per second) MPPs to produce three-dimensional, general-geometry, nonlinear particle simulations that have accelerated progress in understanding the nature of plasma turbulence in magnetically confined high-temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present-generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.
Computational complexity of ecological and evolutionary spatial dynamics
Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.
2015-01-01
There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
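For orientation, the simplest version of the invader question - a single mutant of relative fitness r in a well-mixed population of size N (the Moran process) - has the closed-form fixation probability rho = (1 - 1/r)/(1 - r^-N); the complexity results above concern what happens once spatial structure is added. A hedged sketch comparing that formula with a direct simulation (the function names are illustrative):

```python
import random

def moran_fixation_exact(N, r):
    # Closed-form fixation probability of one mutant in the well-mixed
    # Moran process: rho = (1 - 1/r) / (1 - r**-N).
    return (1 - 1 / r) / (1 - r**-N)

def moran_fixation_sim(N, r, trials=4000, seed=0):
    # Monte Carlo estimate: each step picks a reproducer proportional to
    # fitness and, independently, a uniformly random individual to die.
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        m = 1  # current number of mutants
        while 0 < m < N:
            birth_mutant = rng.random() < (r * m) / (r * m + (N - m))
            death_mutant = rng.random() < m / N
            m += int(birth_mutant) - int(death_mutant)
        fixed += (m == N)
    return fixed / trials
```

For N = 10 and r = 2 the exact value is about 0.50, and the simulated estimate converges to it as the number of trials grows.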
Cloudbursting - Solving the 3-body problem
NASA Astrophysics Data System (ADS)
Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.
2014-12-01
Many science projects in the future will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By sharing computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be applied more efficiently and technical work could be conducted more effectively, with less time spent moving data or waiting for computing resources to free up. Based on work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in assessing the feasibility of, and identifying the obstacles (both technical and managerial) to, performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center, and demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.
Terminal-oriented computer-communication networks.
NASA Technical Reports Server (NTRS)
Schwartz, M.; Boorstyn, R. R.; Pickholtz, R. L.
1972-01-01
Four examples of currently operating computer-communication networks are described in this tutorial paper. They include the TYMNET network, the GE Information Services network, the NASDAQ over-the-counter stock-quotation system, and the Computer Sciences Infonet. These networks all use programmable concentrators for combining a multiplicity of terminals. Included in the discussion for each network is a description of the overall network structure, the handling and transmission of messages, communication requirements, routing and reliability considerations where applicable, operating data and design specifications where available, and unique design features in the area of computer communications.
Support Expressed in Congress for U.S. High-Performance Computing
NASA Astrophysics Data System (ADS)
Showstack, Randy
2004-06-01
Advocates for a stronger U.S. position in high-performance computing - which could help with a number of grand challenges in the Earth sciences and other disciplines - hope that legislation recently introduced in the House of Representatives will help to revitalize U.S. efforts. The High-Performance Computing Revitalization Act of 2004 would amend the earlier High-Performance Computing Act of 1991 (Public Law 102-194), which is partially credited with helping to strengthen U.S. capabilities in this area. The bill has the support of the Bush administration.
Dan Goldin Presentation: Pathway to the Future
NASA Technical Reports Server (NTRS)
1999-01-01
In the "Path to the Future" presentation held at NASA's Langley Center on March 31, 1999, NASA's Administrator Daniel S. Goldin outlined the future direction and strategies of NASA in relation to the general space exploration enterprise. NASA's Vision, Future System Characteristics, Evolution of Engineering, and Revolutionary Changes are the four main topics of the presentation. In part one, the Administrator talks in detail about NASA's vision in relation to the NASA Strategic Activities, which are Space Science, Earth Science, Human Exploration, and Aeronautics & Space Transportation. Topics discussed in this section include: space science for the 21st century, flying in the Mars atmosphere (Mars plane), exploring new worlds, interplanetary internets, Earth observation and measurements, a distributed information-system-in-the-sky, science enabling understanding and application, the space station, microgravity, science and exploration strategies, the human Mars mission, the advanced space transportation program, general aviation revitalization, and reusable launch vehicles. In part two, he briefly talks about future system characteristics. He discusses major system characteristics like resiliency, self-sufficiency, high distribution, ultra-efficiency, and autonomy, and the necessity of overcoming distance, time, and extreme-environment barriers. Part three of Mr. Goldin's talk deals with engineering evolution, mainly evolution in Computer Aided Design (CAD)/Computer Aided Engineering (CAE) systems. These systems include computer-aided drafting, computerized solid models, virtual product development (VPD) systems, networked VPD systems, and knowledge-enriched networked VPD systems. In part four, the last part, the Administrator talks about the need for revolutionary changes in the communication and networking areas of a system.
According to the administrator, the four major areas that need cultural changes in the creativity process are human-centered computing, an infrastructure for distributed collaboration, rapid synthesis and simulation tools, and life-cycle integration and validation. Mr. Goldin concludes his presentation with the following maxim "Collaborate, Integrate, Innovate or Stagnate and Evaporate." He also answers some questions after the presentation.
2011-07-22
This computer-generated view depicts part of Mars at the boundary between darkness and daylight, with an area including Gale Crater beginning to catch morning light. NASA has selected Gale as the landing site for the Mars Science Laboratory mission.
Research and technology, 1984 report
NASA Technical Reports Server (NTRS)
1984-01-01
Research and technology projects in the following areas are described: cryogenic engineering, hypergolic engineering, hazardous warning instrumentation, structures and mechanics, sensors and controls, computer sciences, communications, material analysis, biomedicine, meteorology, engineering management, logistics, training and maintenance aids, and technology applications.
NASA Astrophysics Data System (ADS)
Thapa, Ranjit; Kawazoe, Yoshiyuki
2017-10-01
The main objective of this meeting was to provide a platform for theoreticians and experimentalists working in the area of materials to come together and carry out cutting-edge research in the field of energy by showcasing their ideas and innovations. The theme meeting was successful in attracting young researchers from both fields sharing common research interests. The participation of more than 250 researchers in ACCMS-TM 2016 successfully paved the way toward the exchange of mutual research insights and the establishment of promising research collaborations. To encourage the young participants' research efforts, three best posters in the theoretical category, each named the "KAWAZOE PRIZE", and two best posters for experimental contributions, named the "ACCMS-TM 2016 POSTER AWARD", were selected. A new award, the "ACCMS MID-CAREER AWARD", for outstanding scientific contribution in the area of computational materials science was constituted.
Life sciences Spacelab Mission Development test 3 (SMD 3) data management report
NASA Technical Reports Server (NTRS)
Moseley, E. C.
1977-01-01
Development of a permanent data system for SMD tests was studied that would simulate all elements of the shuttle onboard, telemetry, and ground data systems that are involved with Spacelab operations. The onboard data system (ODS) and the ground data system (GDS) were utilized. The air-to-ground link was simulated by a hardwired computer-to-computer interface. A patch-board system was used on board to select experiment inputs, and the downlink configuration from the ODS was changed by a crew keyboard entry to support each experiment. The ODS provided a CRT display of experiment parameters to enable the crew to monitor experiment performance. An onboard analog system, with recording capability, was installed to handle high-rate data and to provide a backup to the digital system. The GDS accomplished engineering-unit conversion and limit sensing, and provided real-time parameter display on CRTs in the science monitoring area and the test control area.
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bland, Arthur S Buddy; Hack, James J; Baker, Ann E
Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing: supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science.
This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.
NASA Astrophysics Data System (ADS)
Gosha, Kinnis
This dissertation presents the design, development, and short-term evaluation of an embodied conversational agent designed to mentor human users. An embodied conversational agent (ECA) was created and programmed to mentor African American computer science majors on their decision to pursue graduate study in computing. Before constructing the ECA, previous research was reviewed in the fields of embodied conversational agents, relational agents, mentorship, telementorship, and successful mentoring programs and practices for African American graduate students. A survey was used to find areas of interest of the sample population. Experts were then interviewed to collect information on those areas of interest, and a dialogue for the ECA was constructed based on the interview transcripts. A between-groups, mixed-methods experiment was conducted with 37 African American male undergraduate computer science majors in which one group used the ECA mentor while the other group pursued mentoring advice from a human mentor. Results showed no significant difference between the ECA and human mentor for career mentoring functions. However, the human mentor was significantly better than the ECA mentor at addressing psychosocial mentoring functions.
Towards Reproducibility in Computational Hydrology
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-04-01
Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrology community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology; the lessons are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU-funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]).
While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
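A deliberately minimal illustration of the workflow-documentation idea in point (2) above: alongside each model run, a script can emit a machine-readable provenance sidecar recording the random seed, interpreter version, and a fingerprint of the input data. The field names and file layout below are illustrative assumptions, not a standard from the works cited.

```python
import hashlib
import json
import platform
import sys

def record_provenance(data: bytes, seed: int, out_path: str) -> dict:
    # Capture the minimum needed to re-run a computational experiment
    # under comparable conditions: seed, interpreter, and input hash.
    record = {
        "seed": seed,
        "python": sys.version.split()[0],
        "platform": platform.system(),
        "input_sha256": hashlib.sha256(data).hexdigest(),
    }
    with open(out_path, "w") as fh:
        json.dump(record, fh, indent=2, sort_keys=True)
    return record
```

A reviewer re-running the study can then check the sidecar against their own environment before attributing a difference in results to the science rather than the setup.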
Exploratory Research and Development Fund, FY 1990. Report on Lawrence Berkeley Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-05-01
The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.
NASA aerospace database subject scope: An overview
NASA Technical Reports Server (NTRS)
1993-01-01
Outlined here is the subject scope of the NASA Aerospace Database, a publicly available subset of the NASA Scientific and Technical Information (STI) Database. Topics of interest to NASA are outlined and placed within the framework of the following broad aerospace subject categories: aeronautics, astronautics, chemistry and materials, engineering, geosciences, life sciences, mathematical and computer sciences, physics, social sciences, space sciences, and general. A brief discussion of the subject scope is given for each broad area, followed by a similar explanation of each of its narrower subject fields. The subject category code is listed for each entry.
Discriminative Learning with Markov Logic Networks
2009-10-01
Tuyen N. Huynh, Department of Computer Sciences, University of Texas at Austin, Austin, TX 78712. ... an emerging area of research that addresses the problem of learning from noisy structured/relational data. Markov logic networks (MLNs), sets of weighted ...
Safdari, Reza; Shahmoradi, Leila; Hosseini-Beheshti, Molouk-Sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad
2015-10-01
Encyclopedias and their compilation have become prevalent as a valid cultural medium throughout the world. The daily development of the computer industry and the expansion of the various sciences have made the compilation of electronic, specialized encyclopedias, especially web-based ones, indispensable. This is an applied-developmental study conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined; then the tree diagram of the terms was drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the gathered terms were placed in related domains. In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of the sub-branches of informatics is biomedical and health informatics, which itself consists of the three sub-divisions of bioinformatics, clinical informatics, and health informatics. Medical informatics, a subdivision of clinical informatics, has developed from three fields: medical sciences; management and social sciences; and computational sciences and mathematics. Medical informatics thus arises from the confluence, fusion, and application of three major scientific branches (health and biological sciences, social and management sciences, and computing and mathematical sciences), and accordingly the structure of MeSH is weak as a basis for the future development of an Encyclopedia of Medical Informatics.
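The parent/child term relationships described in this abstract can be organized programmatically. The sketch below builds and prints a small indented term tree; the hierarchy mirrors the sub-divisions named in the abstract, but the code and term strings are illustrative, not actual MeSH records or a MeSH API.

```python
from collections import defaultdict


def build_tree(pairs):
    """Group (parent, child) term pairs into an adjacency map."""
    children = defaultdict(list)
    for parent, child in pairs:
        children[parent].append(child)
    return children


def render_tree(children, root, depth=0, lines=None):
    """Produce an indented outline, one term per line, by depth-first traversal."""
    if lines is None:
        lines = []
    lines.append("  " * depth + root)
    for child in children.get(root, []):
        render_tree(children, child, depth + 1, lines)
    return lines


# Illustrative pairs based on the hierarchy described in the abstract.
pairs = [
    ("Biomedical Informatics", "Bioinformatics"),
    ("Biomedical Informatics", "Clinical Informatics"),
    ("Biomedical Informatics", "Health Informatics"),
    ("Clinical Informatics", "Medical Informatics"),
]

tree = build_tree(pairs)
print("\n".join(render_tree(tree, "Biomedical Informatics")))
```

An adjacency map like this is enough to drive both the tree diagram described in the study and the placement of gathered terms into their related domains.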
1981-03-01
Lyle A. Cox, Roger R. Schell, and Sonja L. Perdue. Released by William M. Tolles, Dean of Research. Keywords: computer networks, operating systems, computer security.
Graduate student theses supported by DOE's Environmental Sciences Division
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cushman, Robert M.; Parra, Bobbi M.
1995-07-01
This report provides complete bibliographic citations, abstracts, and keywords for 212 doctoral and master's theses supported fully or partly by the U.S. Department of Energy's Environmental Sciences Division (and its predecessors) in the following areas: Atmospheric Sciences; Marine Transport; Terrestrial Transport; Ecosystems Function and Response; Carbon, Climate, and Vegetation; Information; Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP); Atmospheric Radiation Measurement (ARM); Oceans; National Institute for Global Environmental Change (NIGEC); Unmanned Aerial Vehicles (UAV); Integrated Assessment; Graduate Fellowships for Global Change; and Quantitative Links. Information on the major professor, department, principal investigator, and program area is given for each abstract. Indexes are provided for major professor, university, principal investigator, program area, and keywords. This bibliography is also available in various machine-readable formats (ASCII text file, WordPerfect® files, and PAPYRUS™ files).
NASA Astrophysics Data System (ADS)
Cirac, J. Ignacio; Kimble, H. Jeff
2017-01-01
Quantum optics is a well-established field that spans from fundamental physics to quantum information science. In the coming decade, areas including computation, communication and metrology are all likely to experience scientific and technological advances supported by this far-reaching research field.
European Science Notes. Volume 41, Number 10,
1987-10-01
Covers the following topics: laminar/turbulent transition in boundary layers; coherent structures in the modeling of turbulent boundary layers, wakes, and jets; the labeling of a model protein, human immunoglobulin (hIgG), with acridinium ester, with the amount of oxygen produced serving as an easily measured indicator; and computer science, where research in the controls area includes model reduction of large-scale systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spangler, Lee; Cunningham, Alfred; Lageson, David
2011-03-31
ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO2; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical, and hydrological investigations of CO2 storage; and investigation of innovative, bio-based mitigation strategies.
The Argonne Leadership Computing Facility 2010 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations ("start-up" awards) for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we have also been laying a solid foundation for high-performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change.
Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.
Virtual Observatories, Data Mining, and Astroinformatics
NASA Astrophysics Data System (ADS)
Borne, Kirk
The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best-of-breed methodologies from multiple disciplines.
In the era of large sky surveys and numerous large telescopes, the potential for astronomical discovery is equally large, and so the data-oriented research methods, algorithms, and techniques that are presented here will enable the greatest discovery potential from the ever-growing data and information resources in astronomy.
Center of Excellence for Geospatial Information Science research plan 2013-18
Usery, E. Lynn
2013-01-01
The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.
NASA Technical Reports Server (NTRS)
Moore, Robert C.
1998-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates, and visiting university faculty. The primary mission of RIACS, as chartered, is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold missions in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.
Turmezei, Tom D; Poole, Ken E S
2011-01-01
Bone is a fundamental component of the disordered joint homeostasis seen in osteoarthritis, a disease that has been primarily characterized by the breakdown of articular cartilage accompanied by local bone changes and a limited degree of joint inflammation. In this review we consider the role of computed tomography imaging and computational analysis in osteoarthritis research, focusing on subchondral bone and osteophytes in the hip. We relate what is already known in this area to what could be explored through this approach in the future in relation to both clinical research trials and the underlying cellular and molecular science of osteoarthritis. We also consider how this area of research could impact on our understanding of the genetics of osteoarthritis.
NASA Astrophysics Data System (ADS)
Fu, Qiang; Schaaf, Peter
2018-07-01
This special issue of the high-impact, international, peer-reviewed journal Applied Surface Science represents the proceedings of the 2nd International Conference on Applied Surface Science (ICASS), held 12-16 June 2017 in Dalian, China. The conference provided a forum for researchers in all areas of applied surface science to present their work. The main topics of the conference are in line with the most popular areas of research reported in Applied Surface Science. Thus, this issue includes current research on the role and use of surfaces in chemical and physical processes related to catalysis, electrochemistry, surface engineering and functionalization, biointerfaces, semiconductors, 2D layered materials, surface nanotechnology, energy, new/functional materials, and nanotechnology. The various techniques and characterization methods are also discussed. Hence, scientific research at the atomic and molecular level on material properties, investigated with specific surface-analytical techniques and/or computational methods, is essential for any further progress in these fields.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and computer science... following disciplines: demography, economics, geography, psychology, statistics, survey methodology, social... expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology...
ERIC Educational Resources Information Center
Online-Offline, 1998
1998-01-01
Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…
Abstracts of Research. July 1974-June 1975.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in computer and information science are given for 68 papers in the areas of information storage and retrieval; human information processing; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems…
GET21: Geoinformatics Training and Education for the 21st Century Geoscience Workforce
NASA Astrophysics Data System (ADS)
Baru, C.; Allison, L.; Fox, P.; Keane, C.; Keller, R.; Richard, S.
2012-04-01
The integration of advanced information technologies (referred to as cyberinfrastructure) into scientific research and education creates a synergistic situation. On the one hand, science begins to move at the speed of information technology, with science applications having to move rapidly to keep pace with the latest innovations in hardware and software. On the other hand, information technology moves at the pace of science, requiring rapid prototyping and rapid development of software and systems to serve the immediate needs of the application. The 21st century geoscience workforce must be adept at both sides of this equation to be able to make the best use of the available cyber-tools for their science and education endeavors. To reach different segments of the broad geosciences community, an education program in geoinformatics must be multi-faceted, ranging from areas dealing with modeling, computational science, and high performance computing, to those dealing with data collection, data science, and data-intensive computing. Based on our experience in geoinformatics and data science education, we propose a multi-pronged approach with a number of different components, including summer institutes typically aimed at graduate students, postdocs, and researchers; graduate and undergraduate curriculum development in geoinformatics; development of online course materials to facilitate asynchronous learning, especially for geoscience professionals in the field; provision of internships at geoinformatics-related facilities for graduate students, so that they can observe and participate in geoinformatics "in action"; and creation of online communities and networks to facilitate planned as well as serendipitous collaborations and to link users with experts in the different areas of geoscience and geoinformatics.
We will describe some of our experiences and the lessons learned over the years from the Cyberinfrastructure Summer Institute for Geoscientists (CSIG), which is a 1-week institute that has been held each summer (August) at the San Diego Supercomputer Center, University of California, San Diego, since 2005. We will also discuss these opportunities for GET21 and geoinformatics education in the context of the newly launched EarthCube initiative at the US National Science Foundation.
ISMB 2016 offers outstanding science, networking, and celebration
Fogg, Christiana
2016-01-01
The annual international conference on Intelligent Systems for Molecular Biology (ISMB) is the major meeting of the International Society for Computational Biology (ISCB). Over the past 23 years the ISMB conference has grown to become the world's largest bioinformatics/computational biology conference. ISMB 2016 will be the year's most important computational biology event globally. The conference provides a multidisciplinary forum for disseminating the latest developments in bioinformatics/computational biology. ISMB brings together scientists from computer science, molecular biology, mathematics, statistics, and related fields. Its principal focus is on the development and application of advanced computational methods for biological problems. ISMB 2016 offers the strongest scientific program and the broadest scope of any international bioinformatics/computational biology conference. Building on past successes, the conference is designed to cater to a variety of disciplines within the bioinformatics/computational biology community. ISMB 2016 takes place July 8-12 at the Swan and Dolphin Hotel in Orlando, Florida, United States. For the two days preceding the conference, additional opportunities, including Satellite Meetings, the Student Council Symposium, and a selection of Special Interest Group Meetings and Applied Knowledge Exchange Sessions (AKES), are offered to enable registered participants to learn more about the latest methods and tools within specialty research areas. PMID:27347392
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
Based upon the premise that manufacturing, communications, and computers are the key to productivity, this hearing before the Technology Policy Task Force was held to examine how the federal government interacts with universities, engineering research centers, professional associations, and private businesses in these areas. This document contains…
Advances in Machine Learning and Data Mining for Astronomy
NASA Astrophysics Data System (ADS)
Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.
2012-03-01
Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.
Palm, Günther
2016-01-01
Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding, calling for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience, and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this newly emerging field are listed at the end of this article. PMID:26858632
Some Thoughts Regarding Practical Quantum Computing
NASA Astrophysics Data System (ADS)
Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey
2006-03-01
Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability to perform parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific, and financial communities. However, while much has been written in the popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems on quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction, and fault-tolerant computing protocols.
Publications in biomedical and environmental sciences programs, 1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, J.B.
1982-07-01
This bibliography contains 698 references to articles in journals, books, and reports published in the subject area of biomedical and environmental sciences during 1981. There are 520 references to articles published in journals and books and 178 references to reports. Staff members in the Biomedical and Environmental Sciences divisions have other publications not included in this bibliography; for example, theses, book reviews, abstracts published in journals or symposia proceedings, pending journal publications, and reports such as monthly, bimonthly, and quarterly progress reports, contractor reports, and reports for internal distribution. This document is sorted by division, and then alphabetically by author. The sorting by divisions separates the references by subject area in a simple way. The divisions represented, in the order that they appear in the bibliography, are Analytical Chemistry, Biology, Chemical Technology, Information R and D, Health and Safety Research, Instrumentation and Controls, Computer Sciences, Energy, Engineering Technology, Solid State, Central Management, Operations, and Environmental Sciences. Indexes are provided by author, title, and journal reference.
Regime, phase and paradigm shifts: making community ecology the basic science for fisheries
Mangel, Marc; Levin, Phillip S.
2005-01-01
Modern fishery science, which began in 1957 with Beverton and Holt, is ca. 50 years old. At its inception, fishery science was limited by a nineteenth century mechanistic worldview and by computational technology; thus, the relatively simple equations of population ecology became the fundamental ecological science underlying fisheries. The time has come for this to change and for community ecology to become the fundamental ecological science underlying fisheries. This point will be illustrated with two examples. First, when viewed from a community perspective, excess production must be considered in the context of biomass left for predators. We argue that this is a better measure of the effects of fisheries than spawning biomass per recruit. Second, we shall analyse a simple, but still multi-species, model for fishery management that considers the alternatives of harvest regulations, inshore marine protected areas and offshore marine protected areas. Population or community perspectives lead to very different predictions about the efficacy of reserves. PMID:15713590
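The community-level argument in the second example above, that harvest on one species should be judged partly by the biomass left for its predators, can be sketched with a generic two-species prey/predator model in which only the prey is fished. This is a standard Lotka-Volterra-style toy with hypothetical parameter values, not the model analysed by Mangel and Levin; it merely illustrates how fishing pressure on one species propagates to another.

```python
def step(prey, pred, harvest=0.0, r=0.5, K=1.0, a=0.8, e=0.3, m=0.2, dt=0.01):
    """One forward-Euler step of a generic prey-predator model with fishing on the prey.

    Illustrative parameters only: r/K are logistic prey growth, a is the attack
    rate, e the conversion efficiency, m predator mortality, harvest the fishing
    mortality rate applied to the prey.
    """
    dprey = r * prey * (1 - prey / K) - a * prey * pred - harvest * prey
    dpred = e * a * prey * pred - m * pred
    return prey + dt * dprey, pred + dt * dpred


def simulate(harvest, steps=20000):
    """Run the toy model to (near) steady state and return final biomasses."""
    prey, pred = 0.5, 0.2
    for _ in range(steps):
        prey, pred = step(prey, pred, harvest)
    return prey, pred


if __name__ == "__main__":
    # With these illustrative parameters, heavier fishing on the prey leaves
    # less production for the predator, which can collapse even though only
    # the prey is harvested: the single-species view misses this.
    print("no fishing:   prey, pred =", simulate(0.0))
    print("heavy fishing: prey, pred =", simulate(0.4))
```

Even this toy makes the abstract's point concrete: a harvest level that looks sustainable for the target stock alone can starve its predator, so "biomass left for predators" is the more informative quantity.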
NASA Astrophysics Data System (ADS)
Baytak, Ahmet
Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class in which 5th graders learned about environmental issues by designing games that involved environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness, and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5th grade elementary science classroom. Fifth graders designed computer games about environmental issues to present to 2nd graders using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game designs. Also included in the analyses were data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games that represented their understanding of the environment, even though the gains between the pre- and post-test of environmental knowledge and the environmental awareness survey were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. The analyses of the interview transcriptions and games show that students improved their programming skills and wanted to do similar projects in other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction, and collaboration among students.
This, in turn, encouraged students to test and improve their designs. Sharing the games, it was found, has both positive and negative effects on the students' game design process and the representation of students' understandings of the domain subject.
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1993-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. 
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
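The kind of multiple regression analysis described above can be sketched on synthetic data. Everything below (predictor names, coefficient values, noise level) is illustrative, not the study's actual survey data; the point is only how level of computer use can be regressed on predictors such as self-efficacy, age, and gender via ordinary least squares.

```python
import numpy as np

# Synthetic stand-in data: rows are teachers, columns are predictors.
# All values are hypothetical; the study used survey responses from
# 92 award-winning teachers, so we match that sample size.
rng = np.random.default_rng(0)
n = 92
self_efficacy = rng.uniform(1, 5, n)   # e.g., a 1-5 Likert-style scale
age = rng.uniform(25, 65, n)
gender = rng.integers(0, 2, n)         # 0/1 coding

# Assume (for illustration) computer use rises mainly with self-efficacy.
computer_use = (0.8 * self_efficacy + 0.01 * age + 0.2 * gender
                + rng.normal(0, 0.3, n))

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), self_efficacy, age, gender])
coef, _, _, _ = np.linalg.lstsq(X, computer_use, rcond=None)

print("intercept and coefficients:", np.round(coef, 2))
```

With this setup, the fitted coefficient on self-efficacy recovers the assumed 0.8 to within sampling noise, mirroring the study's finding that self-efficacy predicts level of computer use.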
Cappella, Joseph N
2017-10-01
Simultaneous developments in big data, social media, and computational social science have set the stage for how we think about and understand interpersonal and mass communication. This article explores some of the ways that these developments generate 4 hypothetical "vectors" - directions - into the next generation of communication research. These vectors include developments in network analysis, modeling interpersonal and social influence, recommendation systems, and the blurring of distinctions between interpersonal and mass audiences through narrowcasting and broadcasting. The methods and research in these arenas are occurring in areas outside the typical boundaries of the communication discipline but engage classic, substantive questions in mass and interpersonal communication.
NASA Astrophysics Data System (ADS)
Levit, Creon; Gazis, P.
2006-06-01
The graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.
FY 1992 Budget committed to R&D
NASA Astrophysics Data System (ADS)
Bush, Susan
President Bush's Fiscal Year 1992 budget for research and development is clear proof of his commitment to R&D as a long-term investment for the next American century, according to D. Allan Bromley, Assistant to the President for Science and Technology and Director, Office of Science and Technology Policy. The FY 92 budget proposes to allocate $75.6 billion for research and development, an increase of $8.4 billion, or 13%, over the amount appropriated for FY 91. Calling it a “good budget,” Bromley revealed the specifics of research and development in the President's budget on February 4. Bromley believes that as a nation we are underinvesting in research and development, but sees the 1992 budget increases as concrete steps to address this problem. The newly organized and revitalized Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), an interagency forum of Cabinet secretaries, deputy secretaries, and the heads of independent agencies that reviews, coordinates, and helps implement federal science and technology policy, named three high-priority cross-cutting areas of R&D and organized special interagency programs in these areas. The areas are high-performance computing and communications, global change, and mathematics and science education.
Transparency in Distributed File Systems
1989-01-01
Computer Science Department, 734 Computer Studies Bldg., University of... Addresses the areas of naming, replication, consistency control, file and directory placement, and file and directory migration in a way that provides full network transparency.
Safdari, Reza; Shahmoradi, Leila; Hosseini-beheshti, Molouk-sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad
2015-01-01
Introduction: Encyclopedias have become prevalent as a valid cultural medium throughout the world. The rapid development of the computer industry and the expansion of various sciences have made the compilation of specialized electronic encyclopedias, especially web-based ones, indispensable. Materials and Methods: This is an applied-developmental study conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined, and then the tree diagram of the terms was drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the terms gathered were put in related domains. Findings: In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of the sub-branches of informatics is biomedical and health informatics, which itself consists of three sub-divisions: bioinformatics, clinical informatics, and health informatics. Medical informatics, which is a subdivision of clinical informatics, has developed from three fields: medical sciences; management and social sciences; and computational sciences and mathematics. Results and Discussion: Medical informatics arises from the confluence, fusion, and application of three major scientific branches: health and biological sciences; social and management sciences; and computing and mathematical sciences. Accordingly, the structure of MeSH is weak for the future development of an Encyclopedia of Medical Informatics. PMID:26635440
Information Architecture: Notes toward a New Curriculum.
ERIC Educational Resources Information Center
Latham, Don
2002-01-01
Considers the evolution of information architectures as a field of professional education. Topics include the need for an interdisciplinary approach; balancing practical skills with theoretical concepts; and key content areas, including information organization, graphic design, computer science, user and usability studies, and communication.…
New Editions for the Apple II of the Chelsea Science Simulations.
ERIC Educational Resources Information Center
Pipeline, 1983
1983-01-01
Ten computer simulations for the Apple II are described. Subject areas of programs include: population dynamics, plant competition, enzyme kinetics, evolution and natural selection, genetic mapping, ammonia synthesis, reaction kinetics, wave interference/diffraction, satellite orbits, and particle scattering. (JN)
Economic agents and markets as emergent phenomena
Tesfatsion, Leigh
2002-01-01
An overview of recent work in agent-based computational economics is provided, with a stress on the research areas highlighted in the National Academy of Sciences Sackler Colloquium session “Economic Agents and Markets as Emergent Phenomena” held in October 2001. PMID:12011395
Summer Institute for Physical Science Teachers
NASA Astrophysics Data System (ADS)
Maheswaranathan, Ponn; Calloway, Cliff
2007-04-01
A summer institute for physical science teachers was conducted at Winthrop University, June 19-29, 2006. Ninth grade physical science teachers at schools within a 50-mile radius of Winthrop were targeted. We developed a graduate-level physics professional development course covering selected topics from both the physics and chemistry content areas of the South Carolina Science Standards. Delivery of the material included traditional lectures and the following new approaches in science teaching: hands-on experiments, group activities, computer-based data collection, computer modeling, and group discussions & presentations. Two experienced master teachers assisted us during the delivery of the course. The institute was funded by the South Carolina Department of Education. The requested funds were used for the following: faculty salaries, the University contract course fee, some of the participants' room and board, startup equipment for each teacher, and indirect costs to Winthrop University. Startup equipment included a Pasco stand-alone, portable Xplorer GLX interface with sensors (temperature, voltage, pH, pressure, motion, and sound) and modeling software (Wavefunction's Spartan Student and Odyssey). What we learned and ideas for future K-12 teacher preparation initiatives will be presented.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) cybersecurity fundamental basic research and development challenges, strategies, and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. 
The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
Analytical Cost Metrics : Days of Future Past
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prajapati, Nirmal; Rajopadhye, Sanjay; Djidjev, Hristo Nikolov
As we move towards the exascale era, new architectures must be capable of running massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, and image/signal processing to computational science and bioinformatics. With Moore's law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore, the major challenge that we face in computing systems research is: "how to solve massive-scale computational problems in the most time/power/energy efficient manner?"
Generating finite cyclic and dihedral groups using sequential insertion systems with interactions
NASA Astrophysics Data System (ADS)
Fong, Wan Heng; Sarmin, Nor Haniza; Turaev, Sherzod; Yosman, Ahmad Firdaus
2017-04-01
The operation of insertion has been studied extensively throughout the years for its impact in many areas of theoretical computer science such as DNA computing. First introduced as a generalization of the concatenation operation, many variants of insertion have been introduced, each with their own computational properties. In this paper, we introduce a new variant that enables the generation of some special types of groups called sequential insertion systems with interactions. We show that these new systems are able to generate all finite cyclic and dihedral groups.
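A toy sketch of the flavor of such systems (not the paper's formal definition of sequential insertion systems with interactions, which is an assumption simplified away here): model the cyclic group Z_n as words over the one-letter alphabet {'a'} reduced by the relation a^n = identity. Inserting one word into another then simply adds their exponents, so repeatedly applying insertion to a single seed word generates every element of the group.

```python
# Toy illustration: Z_n as words over {'a'} with the relation a^n = identity.
def reduce_word(word: str, n: int) -> str:
    """Reduce a word modulo the relation a^n = identity."""
    return 'a' * (len(word) % n)

def generate_cyclic(n: int, seed: str = 'a') -> set:
    """Close the seed word under insertion (over a one-letter alphabet,
    every insertion position yields the same reduced word)."""
    generated = {reduce_word(seed, n)}
    frontier = set(generated)
    while frontier:
        new = set()
        for w in frontier:
            for v in generated:
                new.add(reduce_word(w + v, n))
        frontier = new - generated
        generated |= new
    return generated

print(sorted(generate_cyclic(4), key=len))  # ['', 'a', 'aa', 'aaa']
```

Because 'a' generates Z_n, the closure always contains all n elements; dihedral groups require a richer alphabet and the interaction mechanism the paper introduces, which this sketch does not attempt.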
Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales
NASA Astrophysics Data System (ADS)
Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.
2017-12-01
When predicting temperature, there are specific places and times when high-accuracy predictions are harder. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require fewer computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources because several parameters, in addition to orography, play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generate short-term temperature predictions that automatically determines the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using a numerical weather prediction model, the Analog Ensemble (AnEn), and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouple the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how the Ensemble Toolkit allows generating high-resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States over a period of 2 years. The AnEn results show that the temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
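The analog ensemble idea can be sketched in a few lines: for each new deterministic forecast, find the k most similar historical forecasts and use their verifying observations as the ensemble members. The single-predictor setup and synthetic data below are simplifying assumptions; operational AnEn compares multiple weighted predictors over a time window around each lead time.

```python
import numpy as np

def analog_ensemble(hist_forecasts, hist_observations, new_forecast, k=5):
    """Return the k observations whose paired historical forecasts
    are closest to the new forecast (the analog ensemble)."""
    distances = np.abs(hist_forecasts - new_forecast)
    analog_idx = np.argsort(distances)[:k]
    return hist_observations[analog_idx]

# Synthetic archive: past forecast temperatures (C) and the observations
# that verified against them. Values are illustrative only.
rng = np.random.default_rng(1)
hist_fc = rng.uniform(-10, 30, 1000)
hist_obs = hist_fc + rng.normal(0, 1.5, 1000)

members = analog_ensemble(hist_fc, hist_obs, new_forecast=12.0, k=10)
print("ensemble mean: %.1f C, spread: %.1f C" % (members.mean(), members.std()))
```

The ensemble spread gives the per-point uncertainty that, in the project described above, drives where extra computing resources and finer geographic scales are allocated.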
Mathematical modeling based on ordinary differential equations: A promising approach to vaccinology
Bonin, Carla Rezende Barbosa; Fernandes, Guilherme Cortes; dos Santos, Rodrigo Weber; Lobosco, Marcelo
2017-01-01
ABSTRACT New contributions that aim to accelerate the development or to improve the efficacy and safety of vaccines arise from many different areas of research and technology. One of these areas is computational science, which traditionally participates in the initial steps, such as the pre-screening of active substances that have the potential to become a vaccine antigen. In this work, we present another promising way to use computational science in vaccinology: mathematical and computational models of important cell and protein dynamics of the immune system. A system of Ordinary Differential Equations represents different immune system populations, such as B cells and T cells, antigen presenting cells and antibodies. In this way, it is possible to simulate, in silico, the immune response to vaccines under development or under study. Distinct scenarios can be simulated by varying parameters of the mathematical model. As a proof of concept, we developed a model of the immune response to vaccination against the yellow fever. Our simulations have shown consistent results when compared with experimental data available in the literature. The model is generic enough to represent the action of other diseases or vaccines in the human immune system, such as dengue and Zika virus. PMID:28027002
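A minimal sketch of this modeling style, assuming a hypothetical two-population system (antigen and antibody) with illustrative parameters; the authors' actual yellow fever model tracks more populations (B cells, T cells, antigen-presenting cells) and calibrated rates:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy ODE system: antigen V from an attenuated vaccine replicates at rate r
# and is cleared by antibodies A; antibodies are produced in proportion to
# antigen load and decay at rate d. All parameter values are assumptions.
def immune_ode(t, y, r=0.5, k=0.05, p=0.3, d=0.1):
    V, A = y
    dV = r * V - k * V * A   # antigen growth minus antibody-mediated clearance
    dA = p * V - d * A       # antigen-driven antibody production minus decay
    return [dV, dA]

# Simulate 60 days after vaccination, starting from a small antigen dose
# and no pre-existing antibodies.
sol = solve_ivp(immune_ode, t_span=(0, 60), y0=[1.0, 0.0], max_step=0.1)

V_end, A_end = sol.y[:, -1]
print("final antigen: %.2f, final antibody: %.2f" % (V_end, A_end))
```

Varying the parameters (a different r for a different pathogen, a different p for a weaker responder) plays the role of the "distinct scenarios" the abstract mentions.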
DOE Office of Scientific and Technical Information (OSTI.GOV)
Box, D.; Boyd, J.; Di Benedetto, V.
2016-01-01
The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access, and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Building a Semantic Framework for eScience
NASA Astrophysics Data System (ADS)
Movva, S.; Ramachandran, R.; Maskey, M.; Li, X.
2009-12-01
The e-Science vision focuses on the use of advanced computing technologies to support scientists. Recent research efforts in this area have focused primarily on "enabling" use of infrastructure resources for both data and computational access, especially in the geosciences. One gap in existing e-Science efforts has been the failure to incorporate stable semantic technologies within the design process itself. In this presentation, we describe our effort in designing a framework for e-Science built using a Service Oriented Architecture. Our framework provides users capabilities to create science workflows and mine distributed data. Our e-Science framework is being designed around a mass-market tool to promote reusability across many projects. Semantics is an integral part of this framework, and our design goal is to leverage the latest stable semantic technologies. These stable semantic technologies will provide users of our framework with useful features: they can allow search engines to find their content with RDFa tags, create an RDF triple store for their content, create RDF endpoints to share with others, and semantically mash their content with other online content available as RDF endpoints.
[Forensic evidence-based medicine in computer communication networks].
Qiu, Yun-Liang; Peng, Ming-Qi
2013-12-01
As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase of information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effects, ways, methods, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
Large-Scale NASA Science Applications on the Columbia Supercluster
NASA Technical Reports Server (NTRS)
Brooks, Walter
2005-01-01
Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.
Interdisciplinary Relationships in Technical Education: The CORD Perspective.
ERIC Educational Resources Information Center
Hull, Daniel M.
1990-01-01
The director of the Center for Occupational Research and Development (CORD) suggests areas in a technical curriculum that could be improved using an interdisciplinary approach: (1) systems; (2) the electromechanical core; (3) the mathematics/science base; (4) computers; and (5) interpersonal/communication skills. (Author)
So You're Going to Get an Intern!
ERIC Educational Resources Information Center
Boardman, Edna M.
1990-01-01
Presents a checklist of skills for school librarians to use with library science student interns. Areas of skills addressed include selection of materials, cataloging, vertical files, preparation of materials for shelving, circulation, audiovisual and computer equipment and procedures, administration, physical arrangement, relations with staff and…
Technology in Science and Mathematics Education.
ERIC Educational Resources Information Center
Buccino, Alphonse
Provided are several perspectives on technology, addressing changes in learners related to technology, changes in contemporary life related to technology, and changes in subject areas related to technology (indicating that technology has created such new tools for inquiry as computer programming, word processing, online database searches, and…
A Multidimensional Software Engineering Course
ERIC Educational Resources Information Center
Barzilay, O.; Hazzan, O.; Yehudai, A.
2009-01-01
Software engineering (SE) is a multidimensional field that involves activities in various areas and disciplines, such as computer science, project management, and system engineering. Though modern SE curricula include designated courses that address these various subjects, an advanced summary course that synthesizes them is still missing. Such a…
Laboratory Experiences in an Introduction to Natural Science Course.
ERIC Educational Resources Information Center
Barnard, Sister Marquita
1984-01-01
Describes a two-semester course designed to meet the needs of future elementary teachers, home economists, and occupational therapists. Laboratory work includes homemade calorimeters, inclined planes, and computing. Content areas of the course include measurement, physics, chemistry, astronomy, biology, geology, and meteorology. (JN)
Computer-aided drug discovery.
Bajorath, Jürgen
2015-01-01
Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.
2016-08-10
…thermal decomposition and mechanical damage of energetics. The program for the meeting included nine oral presentation sessions, each opened by a discussion leader (e.g., Vincent Baijot, Laboratory for Analysis and Architecture of Systems, CNRS)… The topics were synthesis of new materials, performance, advanced diagnostics, experimental techniques, theoretical approaches, and computational models for…
NASA Astrophysics Data System (ADS)
Strayer, Michael
2009-07-01
Welcome to San Diego and the 2009 SciDAC conference. Over the next four days, I would like to present an assessment of the SciDAC program. We will look at where we've been, how we got to where we are and where we are going in the future. Our vision is to be first in computational science, to be best in class in modeling and simulation. When Ray Orbach asked me what I would do, in my job interview for the SciDAC Director position, I said we would achieve that vision. And with our collective dedicated efforts, we have managed to achieve this vision. In the last year, we have acquired the most powerful supercomputer for open science, Jaguar, the Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF). We also have NERSC, probably the best-in-the-world program for productivity in science that the Office of Science so depends on. And the Argonne Leadership Computing Facility offers architectural diversity with its IBM Blue Gene/P system as a counterbalance to Oak Ridge. There is also ESnet, which is often understated: the 40 gigabit per second dual backbone ring that connects all the labs and many DOE sites. In the President's Recovery Act funding, there is exciting news that ESnet is going to build out to a 100 gigabit per second network using new optical technologies. This is very exciting news for simulations and large-scale scientific facilities. But as one noted SciDAC luminary said, it's not all about the computers, it's also about the science, and we are also achieving our vision in this area. Together with having the fastest supercomputer for science, at the SC08 conference, SciDAC researchers won two ACM Gordon Bell Prizes for the outstanding performance of their applications. The DCA++ code, which solves some very interesting problems in materials, achieved a sustained performance of 1.3 petaflops, an astounding result and a mark I suspect will last for some time.
The LS3DF application for studying nanomaterials also required the development of a novel algorithm to produce results up to 400 times faster than a similar application, and was recognized with a prize for algorithm innovation, a remarkable achievement. Day one of our conference will include examples of petascale science enabled at the OLCF. Although Jaguar has not been officially commissioned, it has gone through its acceptance tests, and during its shakedown phase pioneer applications have been running at scale. These include applications in the areas of astrophysics, biology, chemistry, combustion, fusion, geosciences, materials science, nuclear energy and nuclear physics. We also have a whole compendium of science we do at our facilities; these have been documented and reviewed at our last SciDAC conference. Many of these were highlighted in our Breakthroughs Report. One session at this week's conference will feature a cross-section of these breakthroughs. In the area of scalable electromagnetic simulations, the Auxiliary-space Maxwell Solver (AMS) uses specialized finite element discretizations and multigrid-based techniques, which decompose the original problem into easier-to-solve subproblems. Congratulations to the mathematicians on this. Another item on the list of breakthroughs was PETSc, which provides scalable solvers used in many DOE applications and has solved problems with over 3 billion unknowns and scaled to over 16,000 processors on DOE leadership-class computers. This is becoming a very versatile and useful toolkit to achieve performance at scale. With the announcement of SIAM's first class of Fellows, we are remarkably well represented. Of the group of 191, more than 40 of these Fellows are in the 'DOE space.' We are so delighted that SIAM has recognized them for their many achievements.
In the coming months, we will illustrate our leadership in applied math and computer science by looking at our contributions in the areas of programming models, development and performance tools, math libraries, system software, collaboration, and visualization and data analytics. This is a large and diverse list of areas. We have asked for two panels, one chaired by David Keyes and composed of many of the nation's leading mathematicians, to produce a report on the most significant accomplishments in applied mathematics over the last eight years, taking us back to the start of the SciDAC program. In addition, we have a similar panel in computer science to be chaired by Kathy Yelick. They are going to identify the computer science accomplishments of the past eight years. These accomplishments are difficult to get a handle on, and I'm looking forward to this report. We will also have a follow-on to our report on breakthroughs in computational science, and this will also go back eight years, looking at the many accomplishments under the SciDAC and INCITE programs. This will be chaired by Tony Mezzacappa. So, where are we going in the SciDAC program? It might help to take a look at computational science and how it got started. I go back to Ken Wilson, who created the model and has written on computational science and computational science education. His model was this: the computational scientist plays the role of the experimentalist, the math and CS researchers play the role of theorists, and the computers themselves are the experimental apparatus. In simulation science, we are carrying out numerical experiments on the nature of the physical and biological sciences. Peter Lax, in the same time frame, developed a report on large-scale computing in science and engineering.
Peter remarked, 'Perhaps the most important applications of scientific computing come not in the solution of old problems, but in the discovery of new phenomena through numerical experimentation.' And in the early years, I think the person who provided the most guidance, the most innovation and the most vision for where the future might lie was Ed Oliver. Ed Oliver died last year. Ed did a number of things in science. He had this personality where he knew exactly what to do, but he preferred to stay out of the limelight so that others could enjoy the fruits of his vision. We in the SciDAC program and ASCR Facilities are still enjoying the benefits of his vision. We will miss him. Twenty years after Ken Wilson, Ray Orbach laid out the fundamental premise for SciDAC in an interview that appeared in SciDAC Review: 'SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing.' As you look at the Lax report from 1982, it talks about how 'Future significant improvements may have to come from architectures embodying parallel processing elements—perhaps several thousands of processors.' And it continues, 'Research in languages, algorithms and numerical analysis will be crucial in learning to exploit these new architectures fully.' In the early '90s, Sterling, Messina and Smith developed a workshop report on petascale computing and concluded, 'A petaflops computer system will be feasible in two decades, or less, and rely in part on the continual advancement of the semiconductor industry both in speed enhancement and cost reduction through improved fabrication processes.'
So they were not wrong, and today we are embarking on a forward look that is at a different scale, the exascale, going to 10^18 flops. In 2007, Stevens, Simon and Zacharia chaired a series of town hall meetings looking at exascale computing, and in their report wrote, 'Exascale computer systems are expected to be technologically feasible within the next 15 years, or perhaps sooner. These systems will push the envelope in a number of important technologies: processor architecture, scale of multicore integration, power management and packaging.' The concept of computing on the Jaguar computer involves hundreds of thousands of cores, as do the IBM systems that are currently out there. So the scale of computing with systems with billions of processors is staggering to me, and I don't know how the software and math folks feel about it. We have now embarked on a road toward extreme scale computing. We have created a series of town hall meetings and we are now in the process of holding workshops that address what I call within the DOE speak 'the mission need,' or what is the scientific justification for computing at that scale. We are going to have a total of 13 workshops. The workshops on climate, high energy physics, nuclear physics, fusion, and nuclear energy have been held. The report from the workshop on climate is actually out and available, and the other reports are being completed. The upcoming workshops are on biology, materials, and chemistry; and workshops that engage science for nuclear security are a partnership between NNSA and ASCR. There are additional workshops on applied math, computer science, and architecture that are needed for computing at the exascale. These extreme scale workshops will provide the foundation in our office, the Office of Science, the NNSA and DOE, and we will engage the National Science Foundation and the Department of Defense as partners. We envision a 10-year program for an exascale initiative.
It will be an integrated R&D program initially—you can think about five years for research and development—that would be in hardware, operating systems, file systems, networking and so on, as well as software for applications. Application software and the operating system and the hardware all need to be bundled in this period so that at the end the system will execute the science applications at scale. We also believe that this process will have to have considerable investment from the manufacturers and vendors to be successful. We have formed laboratory, university and industry working groups to start this process and formed a panel to look at where SciDAC needs to go to compute at the extreme scale, and we have formed an executive committee within the Office of Science and the NNSA to focus on these activities. We will have outreach to DoD in the next few months. We are anticipating a solicitation within the next two years in which we will compete this bundled R&D process. We don't know how we will incorporate SciDAC into extreme scale computing, but we do know there will be many challenges. And as we have shown over the years, we have the expertise and determination to surmount these challenges.
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place at a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Neither female nor male students had any major issues in using computers. In computer programming, however, female students were much less involved than male students. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Experimental quantum computing to solve systems of linear equations.
Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei
2013-06-07
Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
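For scale, the 2×2 instance demonstrated in the experiment can be checked against a classical direct solver. The following numpy sketch uses a hypothetical Hermitian matrix and input vector (not the ones from the paper) and normalizes the solution, since a quantum computer encodes x only as a normalized amplitude vector:

```python
import numpy as np

# Hypothetical 2x2 Hermitian system A x = b, the problem size realized
# in the experiment (the actual A and b from the paper are not reproduced here).
A = np.array([[1.5, 0.5],
              [0.5, 1.5]])
b = np.array([1.0, 0.0])

# Classical direct solution; the quantum algorithm instead prepares a
# state proportional to x in time of order log(N).
x = np.linalg.solve(A, b)

# Compare against the normalized classical solution, since the quantum
# output is a unit-norm amplitude vector.
x_state = x / np.linalg.norm(x)
print(x_state)
```

Reading off x itself still costs measurements, which is one reason the exponential speedup applies to the state-preparation step rather than to full classical readout.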
NASA Technical Reports Server (NTRS)
Ross, Elizabeth G.
1997-01-01
This document presents findings based on a third-year evaluation of Trenholm State (AL) Technical College's National Aeronautics and Space Administration (NASA) - supported High School Science Enrichment Program (HSSEP). HSSEP is an external (to school) program for area students from groups that are underrepresented in the mathematics, science, engineering and technology (MSET) professions. In addition to gaining insight into scientific careers, HSSEP participants learn about and deliver presentations that focus on mathematics applications, scientific problem-solving and computer programming during a seven-week summer or 10-week Academic-Year Saturday session.
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
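The core of LSA described above, statistical computation over a large term-document matrix, reduces to a truncated singular value decomposition. The following Python sketch uses an invented toy corpus; it illustrates the technique and is not the authors' implementation:

```python
import numpy as np

# Toy term-document count matrix (rows = terms, columns = documents).
# The vocabulary and counts are invented for illustration.
terms = ["ship", "boat", "ocean", "tree", "leaf"]
X = np.array([
    [2, 1, 0, 0],   # ship
    [1, 2, 0, 0],   # boat
    [1, 1, 1, 0],   # ocean
    [0, 0, 1, 2],   # tree
    [0, 0, 0, 2],   # leaf
], dtype=float)

# LSA: keep the k largest singular values, giving each term a dense
# k-dimensional meaning vector.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Terms that co-occur in similar documents end up close in the LSA space.
sim_ship_boat = cosine(term_vecs[0], term_vecs[1])
sim_ship_leaf = cosine(term_vecs[0], term_vecs[4])
```

With realistic corpora the matrix is weighted (e.g., log-entropy) before the SVD, but the dimensionality-reduction step is the same.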
NASA Astrophysics Data System (ADS)
Elishakoff, I.; Sarlin, N.
2016-06-01
In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by a geometric figure (triangle, rectangle, ellipse, parallelogram, or super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account the forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with a view to identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Use of the term "pillar" in the title was inspired by the News Release (2013) on according the Honda Prize to J. Tinsley Oden, stating, among other things, that "Dr. Oden refers to computational science as the 'third pillar' of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in the x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
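The enclose-then-inflate step, bounding observed data by a minimum-area figure and then widening it via the Chebyshev inequality, can be sketched numerically. The Python fragment below is an illustrative assumption, not the authors' code: it uses the rectangle case, synthetic 2D samples, and the two-sided bound P(|X − μ| ≥ kσ) ≤ 1/k²:

```python
import numpy as np

# Synthetic 2D experimental data (invented for illustration).
rng = np.random.default_rng(0)
data = rng.normal(loc=[1.0, 2.0], scale=[0.3, 0.5], size=(50, 2))

# Minimal axis-aligned rectangle enclosing the observed samples.
lo, hi = data.min(axis=0), data.max(axis=0)

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2, so choosing k = sqrt(1/eps)
# gives per-axis intervals covering future data with probability >= 1 - eps,
# whatever the underlying distribution.
eps = 0.05
k = np.sqrt(1.0 / eps)
mu = data.mean(axis=0)
sigma = data.std(axis=0, ddof=1)
lo_infl = np.minimum(lo, mu - k * sigma)
hi_infl = np.maximum(hi, mu + k * sigma)
```

The inflated rectangle never shrinks below the observed bounding box, which matches the paper's requirement that the region remain a conservative enclosure of the data.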
Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola
2018-05-01
Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
2018-01-01
Abstract Background Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792
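One of the efficiency metrics described, how fully jobs use the resources they book, can be sketched as a used-to-booked core-hour ratio. The Python fragment below is a hypothetical illustration; the field names and numbers are invented and are not UPPMAX's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Job:
    booked_cores: int
    wall_hours: float
    avg_cores_used: float   # mean number of cores actually busy over the job

def core_hour_efficiency(jobs):
    """Used core-hours divided by booked core-hours across a set of jobs."""
    booked = sum(j.booked_cores * j.wall_hours for j in jobs)
    used = sum(j.avg_cores_used * j.wall_hours for j in jobs)
    return used / booked if booked else 0.0

# Hypothetical project: books 16 cores per job but mostly uses far fewer,
# the pattern reported for many NGS jobs.
jobs = [Job(16, 10.0, 4.0), Job(16, 5.0, 8.0)]
eff = core_hour_efficiency(jobs)
```

A low ratio flags projects where active monitoring, or smaller bookings, would free cluster capacity.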
Parallel Architectures for Planetary Exploration Requirements (PAPER)
NASA Astrophysics Data System (ADS)
Cezzar, Ruknet
1993-08-01
The project's main contributions have been in the area of student support. Throughout the project, at least one, in some cases two, undergraduate students have been supported. By working with the project, these students gained valuable knowledge involving the scientific research project, including the not-so-pleasant reporting requirements to the funding agencies. The other important contribution was towards the establishment of a graduate program in computer science at Hampton University. Primarily, the PAPER project has served as the main research basis in seeking funds from other agencies, such as the National Science Foundation, for establishing a research infrastructure in the department. In technical areas, especially in the first phase, we believe the trip to Jet Propulsion Laboratory, and gathering together all the pertinent information involving experimental computer architectures aimed for planetary explorations, was very helpful. Indeed, if this effort is to be revived in the future due to congressional funding for planetary explorations, say an unmanned mission to Mars, our interim report will be an important starting point. In other technical areas, our simulator has pinpointed and highlighted several important performance issues related to the design of operating system kernels for MIMD machines. In particular, the critical issue of how the kernel itself will run in parallel on a multiple-processor system has been addressed through the various ready list organization and access policies. In the area of neural computing, our main contribution was an introductory tutorial package to familiarize the researchers at NASA with this new and promising field. Finally, we have introduced the notion of reversibility in programming systems which may find applications in various areas of space research.
Parallel Architectures for Planetary Exploration Requirements (PAPER)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1993-01-01
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
Cloud computing approaches to accelerate drug discovery value chain.
Garg, Vibhav; Arora, Suchir; Gupta, Chitra
2011-12-01
Continued advancements in the area of technology have helped high-throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Integration of Cloud computing with parallel computing is also expanding its footprint in the life sciences community. Speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery Cloud would be best suited to managing drug discovery and clinical development data generated using advanced HTS techniques, thereby supporting the vision of personalized medicine.
Management Information Systems, Planning, and Public Community Colleges.
ERIC Educational Resources Information Center
Ritch, Stephen W.; Munro, Robert J.
Management Information Systems (MIS), originally developed in the areas of accounting, management science, and computer processing, are now being applied to decision-making in educational settings. Definitions of MIS are numerous and often vague, but management systems (as distinguished from other information systems) should promote real-time…
In Their Own Words: Dealing with Dyslexia | NIH MedlinePlus the Magazine
... occurs in people of all backgrounds and intellectual levels. People with dyslexia can be very bright. They are often capable or even gifted in areas such as art, computer science, design, drama, electronics, math, mechanics, music, physics, sales, and sports. Some of ...
Interactive Technologies and the Social Studies. Emerging Issues and Applications.
ERIC Educational Resources Information Center
Martorella, Peter H., Ed.
This book includes contributions from seven authors with diverse backgrounds, whose specializations include the areas of social studies education, software development, computer science, and visual design. The chapters are: (1) "Online Learning Communities: Implications for the Social Studies" (Lynn A. Fontana); (2) "Bringing Preservice Teachers…
Improving Family Forest Knowledge Transfer through Social Network Analysis
ERIC Educational Resources Information Center
Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.
2012-01-01
To better engage Maine's family forest landowners, our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…
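A minimal form of the stakeholder-identification step in social network analysis is degree centrality, the fraction of other actors each actor is directly tied to. The sketch below uses an invented toy knowledge-transfer network; the actor names are hypothetical and not from the study:

```python
from collections import defaultdict

# Hypothetical network: edges connect actors who exchange forestry
# information (names invented for illustration).
edges = [("extension", "landowner_a"),
         ("extension", "landowner_b"),
         ("extension", "researcher"),
         ("researcher", "landowner_b")]

# Build an undirected adjacency structure.
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: ties divided by the number of other actors.
n = len(adj)
centrality = {node: len(neigh) / (n - 1) for node, neigh in adj.items()}
most_central = max(centrality, key=centrality.get)
```

The highest-centrality actor is a candidate key stakeholder; richer measures (betweenness, eigenvector centrality) refine the same idea.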
Exploring TeleRobotics: A Radio-Controlled Robot
ERIC Educational Resources Information Center
Deal, Walter F., III; Hsiung, Steve C.
2007-01-01
Robotics is a rich and exciting multidisciplinary area to study and learn about electronics and control technology. The interest in robotic devices and systems provides the technology teacher with an excellent opportunity to make many concrete connections between electronics, control technology, and computers and science, engineering, and…
NESTOR: A Computer-Based Medical Diagnostic Aid That Integrates Causal and Probabilistic Knowledge.
1984-11-01
…individual conditional probabilities between one cause node and its effect node, but less common to know a joint conditional probability between a… Author: Gregory F. Cooper. Contract or grant number: ONR N00014-81-K-0004. Performing organization: Department of Computer Science, Stanford University, Stanford, CA 94305, USA.
2016-09-01
…Sciences Group, 6%; 1550s Computer Scientists Group, 5%; other 1500s (ORSA, Mathematics, & Statistics Group), 3%; 1600s Equipment & Facilities Group, 4%… Employee removal based on misconduct, delinquency, suitability, unsatisfactory performance, or failure to qualify for conversion to a career appointment… averaged 10.4% in many areas, but was over double the average for the 1550s (Computer Scientists) and other 1500s (ORSA, Mathematics, and Statistics). Also…
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world, scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, and social-science surveys, or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology, to telescope observations of distant galaxies in Astrophysics, to data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in the future? Currently the traditional forms of publication, articles for conferences and journals, still carry greater weight. But will this still be the case in the future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of rapid hardware, whereas now everything revolves around the topic of data and services. Finally, what about the researchers themselves? Not so long ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. Thus, to what extent will the profile of the modern geoscientist change?
This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments, both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it, and to consider steps that can achieve a positive development for the future.
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style which adversely affects their ability to succeed in CS-I.
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell.
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University, ACAT group. The PDF also contains details of the workshop's committees and sponsors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, P. (Fermilab); Cary, J.
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization.
The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for the design or performance optimization of all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre, left, and Payton Barnwell are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is pursuing a degree in computer science and chemistry at Rocky Mountain College in Billings, Montana. Barnwell is a mechanical engineering and nanotechnology major at Florida Polytechnic University. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
Pacific Northwest National Laboratory Annual Site Environmental Report for Calendar Year 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duncan, Joanne P.; Sackschewsky, Michael R.; Tilden, Harold T.
2014-09-30
Pacific Northwest National Laboratory (PNNL), one of the U.S. Department of Energy (DOE) Office of Science’s 10 national laboratories, provides innovative science and technology development in the areas of energy and the environment, fundamental and computational science, and national security. DOE’s Pacific Northwest Site Office (PNSO) is responsible for oversight of PNNL at its Campus in Richland, Washington, as well as its facilities in Sequim, Seattle, and North Bonneville, Washington, and Corvallis and Portland, Oregon.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
2002-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multi-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm is one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
NASA Astrophysics Data System (ADS)
Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.
2013-12-01
Nowadays Grid computing is a powerful computational tool ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project objective is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers, and also free them from the technical and computational aspects of using these DCIs.
Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis, CMIP5 models, or seasonal forecasts. WRF4G is being used to run WRF simulations contributing to the CORDEX initiative and other projects such as SPECS and EUPORIAS. This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864).
Determining Asset Criticality for Cyber Defense
2011-09-23
sciences area that may be applied to our situation. In particular, the Analytic Hierarchy Process (AHP) [20] and Hierarchical TOPSIS [21][22] are some examples…
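The Analytic Hierarchy Process mentioned above derives decision weights (here, asset criticality) from a pairwise-comparison matrix. As a minimal sketch, not taken from the report itself, the widely used row geometric-mean approximation of AHP priorities can be computed in pure Python; the 3x3 comparison matrix below is hypothetical:

```python
from math import prod

def ahp_priorities(matrix):
    """Approximate AHP priority weights from a pairwise-comparison
    matrix using the row geometric-mean method, then normalize."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparison of three assets' criticality:
# asset A judged 3x as critical as B and 5x as critical as C; B judged 2x C.
M = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
]
weights = ahp_priorities(M)  # weights sum to 1, ordered A > B > C
```

The geometric-mean method is a standard approximation of the principal-eigenvector priorities and avoids any linear-algebra dependency in this sketch.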
Electronic Structure Theory | Materials Science | NREL
design and discover materials for energy applications, including detailed studies of the physical… Key Research Areas: Materials by Design. NREL leads the U.S. Department of Energy's Center for Next Generation of Materials by Design, which incorporates metastability and synthesizability.
An Interactive Learning Environment for Information and Communication Theory
ERIC Educational Resources Information Center
Hamada, Mohamed; Hassan, Mohammed
2017-01-01
Interactive learning tools are emerging as effective educational materials in the area of computer science and engineering. It is a research domain that is rapidly expanding because of its positive impacts on motivating and improving students' performance during the learning process. This paper introduces an interactive learning environment for…
Industry/Postsecondary Education Partnership for Faculty Development.
ERIC Educational Resources Information Center
Zanville, Holly
The project addressed the need for Oregon higher education faculty to receive state-of-the-art information from Oregon businesses and industries in computer science, business, and engineering areas. Planning for a statewide interactive Educational Television Network (ED-NET) has been underway in Oregon for several years. The network will involve…
Usage of Computers and Calculators and Students' Achievement: Results from TIMSS 2003
ERIC Educational Resources Information Center
Antonijevic, Radovan
2007-01-01
The paper deals with the facts obtained from TIMSS 2003 (Trends in International Mathematics and Science Study). This international comparative study, which includes 47 participant countries worldwide, explores dependence between eighth grade students' achievement in the areas of mathematics, physics, chemistry, biology and geography, and basic…
Computer Techniques for Studying Coverage, Overlaps, and Gaps in Collections.
ERIC Educational Resources Information Center
White, Howard D.
1987-01-01
Describes techniques for using the Statistical Package for the Social Sciences (SPSS) to create tables for cooperative collection development across a number of libraries. Specific commands are given to generate holdings profiles focusing on collection coverage, overlaps, gaps, or other areas of interest, from a master bibliographic list. (CLB)
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
NASA Astrophysics Data System (ADS)
Quigley, Mark Declan
The purpose of this research was to examine specific environmental, educational, and demographic factors and their influence on mathematics and science achievement. In particular, the researcher ascertained the interconnections of home computer access and social capital among Asian American students and their effect on mathematics and science achievement. Coleman's theory on social capital and parental influence was used as a basis for the analysis of data. Subjects for this study were the base year students from the National Education Longitudinal Study of 1988 (NELS:88) and the subsequent follow-up survey data in 1990, 1992, and 1994. The approximate sample size for this study was 640 ethnic Asians from the NELS:88 database. The analysis was a longitudinal study based on the Student and Parent Base Year responses and the Second Follow-up survey of 1992, when the subjects were in 12th grade. Achievement test results from the NELS:88 data were used to measure achievement in mathematics and science. The NELS:88 test battery was developed to measure both individual status and a student's growth in a number of achievement areas. The subjects' responses were analyzed by principal components factor analysis, weights, effect sizes, hierarchical regression analysis, and PLS Path Analysis. The results of this study were that prior ability in mathematics and science is a major influence on a student's educational achievement. Findings from the study support the view that home computer access has a negative direct effect on mathematics and science achievement for both Asian American males and females. None of the social capital factors in the study had either a negative or positive direct effect on mathematics and science achievement, although some indirect effects were found. Suggestions were made toward increasing parental involvement in children's academic endeavors.
Computer access in the home should be considered related to television viewing and should be closely monitored by the parents to promote educational uses.
Arctic Boreal Vulnerability Experiment (ABoVE) Science Cloud
NASA Astrophysics Data System (ADS)
Duffy, D.; Schnase, J. L.; McInerney, M.; Webster, W. P.; Sinno, S.; Thompson, J. H.; Griffith, P. C.; Hoy, E.; Carroll, M.
2014-12-01
The effects of climate change are being revealed at alarming rates in the Arctic and Boreal regions of the planet. NASA's Terrestrial Ecology Program has launched a major field campaign to study these effects over the next 5 to 8 years. The Arctic Boreal Vulnerability Experiment (ABoVE) will challenge scientists to take measurements in the field, study remote observations, and even run models to better understand the impacts of a rapidly changing climate for areas of Alaska and western Canada. The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center (GSFC) has partnered with the Terrestrial Ecology Program to create a science cloud designed for this field campaign - the ABoVE Science Cloud. The cloud combines traditional high performance computing with emerging technologies to create an environment specifically designed for large-scale climate analytics. The ABoVE Science Cloud utilizes (1) virtualized high-speed InfiniBand networks, (2) a combination of high-performance file systems and object storage, and (3) virtual system environments tailored for data intensive, science applications. At the center of the architecture is a large object storage environment, much like a traditional high-performance file system, that supports data proximal processing using technologies like MapReduce on a Hadoop Distributed File System (HDFS). Surrounding the storage is a cloud of high performance compute resources with many processing cores and large memory coupled to the storage through an InfiniBand network. Virtual systems can be tailored to a specific scientist and provisioned on the compute resources with extremely high-speed network connectivity to the storage and to other virtual systems. In this talk, we will present the architectural components of the science cloud and examples of how it is being used to meet the needs of the ABoVE campaign. 
In our experience, the science cloud approach significantly lowers the barriers and risks to organizations that require high performance computing solutions and provides the NCCS with the agility required to meet our customers' rapidly increasing and evolving requirements.
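The data-proximal processing pattern described above (MapReduce over data stored in HDFS) can be illustrated with a minimal in-memory sketch. This is an illustration of the paradigm only, not code from the ABoVE Science Cloud, and the sample observation records are invented:

```python
from collections import defaultdict

def mapper(record):
    """Map phase: emit (site, 1) for each observation record.
    In a Hadoop streaming job this would read lines from stdin."""
    site, _value = record.split(",")
    yield site, 1

def reducer(key, values):
    """Reduce phase: sum the counts emitted for each site."""
    yield key, sum(values)

def run_mapreduce(records):
    # Shuffle phase: group mapper output by key, as the framework would
    groups = defaultdict(list)
    for rec in records:
        for k, v in mapper(rec):
            groups[k].append(v)
    # Apply the reducer to each key group and collect the results
    return {k: c for key, vals in groups.items() for k, c in reducer(key, vals)}

# Hypothetical field-campaign records: "site,measurement"
obs = ["alaska,0.4", "yukon,0.7", "alaska,0.9"]
counts = run_mapreduce(obs)
```

In a real HDFS deployment the mapper runs on the nodes holding each data block (the "data proximal" part), so only the small intermediate key/value pairs cross the network.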
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1993-01-01
Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, it is the purpose of the personnel of this grant to provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Python in the NERSC Exascale Science Applications Program for Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronaghi, Zahra; Thomas, Rollin; Deslippe, Jack
We describe a new effort at the National Energy Research Scientific Computing Center (NERSC) in performance analysis and optimization of scientific Python applications targeting the Intel Xeon Phi (Knights Landing, KNL) many-core architecture. The Python-centered work outlined here is part of a larger effort called the NERSC Exascale Science Applications Program (NESAP) for Data. NESAP for Data focuses on applications that process and analyze high-volume, high-velocity data sets from experimental/observational science (EOS) facilities supported by the US Department of Energy Office of Science. We present three case study applications from NESAP for Data that use Python. These codes vary in terms of “Python purity” from applications developed in pure Python to ones that use Python mainly as a convenience layer for scientists without expertise in lower-level programming languages like C, C++ or Fortran. The science case, requirements, constraints, algorithms, and initial performance optimizations for each code are discussed. Our goal with this paper is to contribute to the larger conversation around the role of Python in high-performance computing today and tomorrow, highlighting areas for future work and emerging best practices.
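A typical first step in the kind of performance analysis NESAP for Data describes is profiling a Python application to locate hotspots before optimizing them. The sketch below uses only the standard library's cProfile/pstats; `simulate_workload` is a hypothetical stand-in, not one of the case-study codes:

```python
import cProfile
import io
import pstats

def simulate_workload(n=200_000):
    """Hypothetical stand-in for an analysis kernel: a pure-Python
    loop that a profiler would flag as the dominant hotspot."""
    total = 0.0
    for i in range(1, n):
        total += 1.0 / (i * i)
    return total

# Profile the workload and capture a report of the top entries
profiler = cProfile.Profile()
profiler.enable()
result = simulate_workload()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()  # shows simulate_workload as the hot function
```

Once a hotspot like this is identified, the usual remedies are vectorization, compiled extensions, or pushing the loop into a lower-level language, which is exactly the "convenience layer" trade-off the paper discusses.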
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
Schneider, Maria Victoria; Griffin, Philippa C; Tyagi, Sonika; Flannery, Madison; Dayalan, Saravanan; Gladman, Simon; Watson-Haigh, Nathan; Bayer, Philipp E; Charleston, Michael; Cooke, Ira; Cook, Rob; Edwards, Richard J; Edwards, David; Gorse, Dominique; McConville, Malcolm; Powell, David; Wilkins, Marc R; Lonie, Andrew
2017-06-30
EMBL Australia Bioinformatics Resource (EMBL-ABR) is a developing national research infrastructure, providing bioinformatics resources and support to life science and biomedical researchers in Australia. EMBL-ABR comprises 10 geographically distributed national nodes with one coordinating hub, with current funding provided through Bioplatforms Australia and the University of Melbourne for its initial 2-year development phase. The EMBL-ABR mission is to: (1) increase Australia's capacity in bioinformatics and data sciences; (2) contribute to the development of training in bioinformatics skills; (3) showcase Australian data sets at an international level and (4) enable engagement in international programs. The activities of EMBL-ABR are focussed in six key areas, aligning with comparable international initiatives such as ELIXIR, CyVerse and NIH Commons. These key areas (Tools, Data, Standards, Platforms, Compute and Training) are described in this article. © The Author 2017. Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Krauze, L. D.
1983-01-01
The IDEAS computer system of NASA is a tool for interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.
Computational Fluid Dynamics Program at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1989-01-01
The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked on in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.
Computer Science | Classification | College of Engineering & Applied
Adrian Dumitrescu, Ph.D., Professor, Computer Science, (414) 229-4265, @uwm.edu, Eng & Math Sciences 919
Hossein Hosseini, Ph.D., Professor, Computer Science, (414) 229-5184, hosseini@uwm.edu, Eng & Math Sciences 1091
Amol Mali, Ph.D., Associate Professor, Computer Science
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" as related to the laboratory computer and "extended…
Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.
Cartwright, Hugh M
2008-01-01
Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data or the deconvolution of spectra to highly practical applications, such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.
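A minimal sketch of the kind of feedforward network the chapter introduces, trained by gradient descent. None of this comes from the chapter itself: the toy XOR data, layer sizes, and learning rate are invented for illustration, and NumPy stands in for whatever tooling a practitioner would actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic problem a single linear model cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 4 units; weights drawn at random (illustrative sizes).
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass for the mean-squared-error loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(losses[0], losses[-1])  # the loss should fall as the network learns
```

The same forward/backward structure, scaled up, underlies the spectral-deconvolution and "electronic nose" applications the chapter surveys.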
Innovation Research in E-Learning
NASA Astrophysics Data System (ADS)
Wu, Bing; Xu, WenXia; Ge, Jun
This study is a productivity review of the literature on innovation research in E-Learning gleaned from the SSCI and SCIE databases. The results indicate that the number of publications on innovation research in E-Learning has been growing since 2005, peaking at 25% of the total in 2010, and that England is the leading country of origin. The main source title is the British Journal of Educational Technology, and the subject areas concentrate on Education & Educational Research; Computer Science, Interdisciplinary Applications; and Computer Science, Software Engineering. The studies are mainly conceptual and empirical work exploring E-Learning from the perspective of innovation diffusion theory; their limitations and directions for future research are also discussed.
Optimization of knowledge-based systems and expert system building tools
NASA Technical Reports Server (NTRS)
Yasuda, Phyllis; Mckellar, Donald
1993-01-01
The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
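Of the numerical techniques named in the survey, implicit time integration is the easiest to show in a few lines. A hedged sketch, not from the survey itself: the scalar decay equation, rate, and step size below are chosen purely to illustrate why implicit methods tolerate step sizes that make explicit methods blow up on stiff problems.

```python
lam = -1000.0   # stiff decay rate (illustrative), true solution decays to 0
dt = 0.01       # step far beyond the explicit stability limit 2/|lam| = 0.002
steps = 100
y_exp = y_imp = 1.0

for _ in range(steps):
    # Forward (explicit) Euler: y_{n+1} = y_n + dt*lam*y_n.
    # Amplification factor 1 + dt*lam = -9 here, so the iterate explodes.
    y_exp = y_exp + dt * lam * y_exp
    # Backward (implicit) Euler: y_{n+1} = y_n + dt*lam*y_{n+1},
    # i.e. y_{n+1} = y_n / (1 - dt*lam); stable for any dt when lam < 0.
    y_imp = y_imp / (1.0 - dt * lam)

print(abs(y_exp), abs(y_imp))  # explicit diverges, implicit decays toward 0
```

In a climate dynamical core the scalar division becomes a large sparse solve per step, which is the per-step cost the survey weighs against the stability gain.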
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
Computational biology and bioinformatics in Nigeria.
Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi
2014-04-01
Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technological advancement and development in a higher learning institution give students a chance to learn the information technology areas in depth. Students should take hold of the opportunity to build their skills in these technologies in preparation for graduation, and the curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the technology adopted in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). The study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of computer graphics and image processing technologies. Four (4) of the eight (8) independent factors in UTAUT are studied against the dependent factor.
Investigation and Implementation of a Tree Transformation System for User Friendly Programming.
1984-12-01
Tree transformation systems have become an important area of research because of their direct impact on all areas of computer science, such as software engineering. (Master's thesis, Naval Postgraduate School, Monterey, CA, December 1984.)
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding gender, we show that, in general, the difference in Israel between males and females who take computer science in high school is relatively small, and that a larger, though still not very large, difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on their pursuing higher education in computing.
Studying Students' Attitudes on Using Examples of Game Source Code for Learning Programming
ERIC Educational Resources Information Center
Theodoraki, Aristea; Xinogalos, Stelios
2014-01-01
Games for learning are currently used in several disciplines for motivating students and enhancing their learning experience. This new approach of technology-enhanced learning has attracted researchers' and instructors' attention in the area of programming that is one of the most cognitively demanding fields in Computer Science. Several…
The Digital Agora: Interaction and Learning in Political Science.
ERIC Educational Resources Information Center
Watters, Carolyn; Conley, Marshall; Alexander, Cynthia
Acadia University is the first "laptop" university in Canada. The Acadia Advantage program has each incoming student and each faculty member equipped with a laptop computer. In addition, classrooms, library, residence rooms, and common areas are wired so that the network is accessible both in and out of classrooms. This initiative has…
ERIC Educational Resources Information Center
National Science Foundation, Arlington, VA. Directorate for Computer and Information Science and Engineering.
The purpose of this summary of awards is to provide the scientific and engineering communities with a summary of the grants awarded in 1994 by the National Science Foundation's Division of Microelectronic Information Processing Systems. Similar areas of research are grouped together. Grantee institutions and principal investigators are identified…
Interdisciplinary Project Experiences: Collaboration between Majors and Non-Majors
ERIC Educational Resources Information Center
Smarkusky, Debra L.; Toman, Sharon A.
2014-01-01
Students in computer science and information technology should be engaged in solving real-world problems received from government and industry as well as those that expose them to various areas of application. In this paper, we discuss interdisciplinary project experiences between majors and non-majors that offered a creative and innovative…
COMPUTER TECHNIQUES FOR WEEKLY MULTIPLE-CHOICE TESTING.
ERIC Educational Resources Information Center
BROYLES, DAVID
To encourage political science students to read properly and continuously, the author gives frequent short quizzes based on the assigned readings. For ease in administration and scoring, he uses mark-sense cards, on which the student marks designated areas to indicate his number and his choice of answers. To emphasize the value of continued high…
Characteristics of Open Access Journals in Six Subject Areas
ERIC Educational Resources Information Center
Walters, William H.; Linvill, Anne C.
2011-01-01
We examine the characteristics of 663 Open Access (OA) journals in biology, computer science, economics, history, medicine, and psychology, then compare the OA journals with impact factors to comparable subscription journals. There is great variation in the size of OA journals; the largest publishes more than 2,700 articles per year, but half…
Knowledge Structures of Entering Computer Networking Students and Their Instructors
ERIC Educational Resources Information Center
DiCerbo, Kristen E.
2007-01-01
Students bring prior knowledge to their learning experiences. This prior knowledge is known to affect how students encode and later retrieve new information learned. Teachers and content developers can use information about students' prior knowledge to create more effective lessons and materials. In many content areas, particularly the sciences,…
Biomimetic robots using EAP as artificial muscles - progress and challenges
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2004-01-01
Biology offers a great model for emulation in areas ranging from tools, computational algorithms, materials science, mechanisms and information technology. In recent years, the field of biomimetics, namely mimicking biology, has blossomed with significant advances enabling the reverse engineering of many animals' functions and implementation of some of these capabilities.
A Best Practices Approach to the Use of Information Technology in Education.
ERIC Educational Resources Information Center
Wright, Peter W.
Based on the author's presentation at the International Conference on Computer Based Learning in Science, this paper discusses some high-profile areas of interest and concern in the educational use of information and communication technology (ICT). The paper is influenced partly by a series of nine government-funded "Best Practices"…
A Functional Programming Approach to AI Search Algorithms
ERIC Educational Resources Information Center
Panovics, Janos
2012-01-01
The theory and practice of search algorithms related to state-space represented problems form the major part of the introductory course of Artificial Intelligence at most of the universities and colleges offering a degree in the area of computer science. Students usually meet these algorithms only in some imperative or object-oriented language…
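As a flavor of the functional treatment the paper advocates, here is a sketch of breadth-first state-space search written so that paths and the visited set are immutable values that are rebuilt at each step (only the work queue is a conventional queue). This is not the paper's own code, and the toy graph is invented for illustration.

```python
from collections import deque

def bfs(start, goal, neighbors):
    """Breadth-first search returning the first path found from start to goal.

    Paths are tuples and the visited set is a frozenset, so each expansion
    produces new values instead of mutating shared state.
    """
    frontier = deque([(start,)])      # queue of paths, each a tuple of states
    visited = frozenset([start])
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if state == goal:
            return path
        for nxt in neighbors(state):
            if nxt not in visited:
                visited = visited | {nxt}       # new frozenset, no mutation
                frontier.append(path + (nxt,))  # new path tuple
    return None

# Hypothetical toy state space for illustration.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs("A", "D", lambda s: graph[s]))  # ('A', 'B', 'D')
```

Passing the successor function `neighbors` as a parameter is the functional idiom the abstract alludes to: the same search skeleton serves any state-space problem.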
Toward Psychoinformatics: Computer Science Meets Psychology
Duke, Éilish; Markowetz, Alexander
2016-01-01
The present paper provides insight into an emerging research discipline called Psychoinformatics. In the context of Psychoinformatics, we emphasize the cooperation between the disciplines of psychology and computer science in handling large data sets derived from heavily used devices, such as smartphones or online social network sites, in order to shed light on a large number of psychological traits, including personality and mood. New challenges await psychologists in light of the resulting “Big Data” sets, because classic psychological methods will only in part be able to analyze this data derived from ubiquitous mobile devices, as well as other everyday technologies. As a consequence, psychologists must enrich their scientific methods through the inclusion of methods from informatics. The paper provides a brief review of one area of this research field, dealing mainly with social networks and smartphones. Moreover, we highlight how data derived from Psychoinformatics can be combined in a meaningful way with data from human neuroscience. We close the paper with some observations of areas for future research and problems that require consideration within this new discipline. PMID:27403204
Integration science and distributed networks
NASA Astrophysics Data System (ADS)
Landauer, Christopher; Bellman, Kirstie L.
2002-07-01
Our work on integration of data and knowledge sources is based on a common theoretical treatment of 'Integration Science', which leads to systematic processes for combining formal logical and mathematical systems, computational and physical systems, and human systems and organizations. The theory is based on the processing of explicit meta-knowledge about the roles played by the different knowledge sources and the methods of analysis and semantic implications of the different data values, together with information about the context in which and the purpose for which they are being combined. The research treatment is primarily mathematical, and though this kind of integration mathematics is still under development, some applicable common threads have already emerged. Instead of describing the current state of the mathematical investigations, since they are not yet crystallized enough for formalisms, we describe our applications of the approach in several different areas, including our focus area of 'Constructed Complex Systems', which are complex heterogeneous systems managed or mediated by computing systems. In this context, it is important to remember that all systems are embedded, all systems are autonomous, and all systems are distributed networks.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
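To give one concrete illustration of the taxonomy's "modeling and simulation practices" category (the example is hypothetical, not drawn from the paper's instructional materials), here is a student-sized Monte Carlo model of a one-dimensional random walk, the sort of small computational model the taxonomy envisions in a mathematics or science classroom:

```python
import random

def random_walk(steps, trials, seed=0):
    """Estimate the mean final distance from the origin of a 1-D random walk
    by simulating many independent trials (a Monte Carlo model)."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    total = 0
    for _ in range(trials):
        pos = 0
        for _ in range(steps):
            pos += rng.choice((-1, 1))  # one unbiased step left or right
        total += abs(pos)
    return total / trials

# Theory predicts the mean distance after n steps grows like sqrt(2n/pi),
# about 8 for n = 100; the simulation lets students discover that scaling.
print(random_walk(100, 2000))
```

Running the model for several values of `steps` and plotting the results would also exercise the taxonomy's data practices category.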
Nicholas Brunhart-Lupo, Computational Science, Nicholas.Brunhart-Lupo@nrel.gov. Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines.
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Computational Materials Program for Alloy Design
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo
2005-01-01
The research program sponsored by this grant, "Computational Materials Program for Alloy Design", covers a period of enormous change in the emerging field of computational materials science. The computational materials program started with the development of the BFS method for alloys, a quantum approximate method for atomistic analysis of alloys specifically tailored to deal effectively with the current challenges in the area of atomistic modeling and to support modern experimental programs. During the grant period, the program benefited from steady growth which, as detailed below, far exceeded its original set of goals and objectives. Not surprisingly, by the end of this grant, the methodology and the computational materials program had become an established force in the materials community, with substantial impact in several areas. Major achievements during the duration of the grant include the completion of a Level 1 Milestone for the HITEMP program at NASA Glenn, consisting of the planning, development, and organization of an international conference held at the Ohio Aerospace Institute in August of 2002, finalizing a period of rapid insertion of the methodology into the research community worldwide. The conference, attended by citizens of 17 countries representing various fields of the research community, resulted in a special issue of the leading journal in the area of applied surface science. Another element of the Level 1 Milestone was the presentation of the first version of the Alloy Design Workbench software package, currently known as "adwTools". This software package constitutes the first PC-based piece of software on the market for atomistic simulations of both solid alloys and surfaces. Dissemination of results and insertion in the materials community worldwide was a primary focus during this period. As a result, the P.I. was responsible for presenting 37 contributed talks, 19 invited talks, and publishing 71 articles in peer-reviewed journals, as detailed later in this Report.
NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.
NASA Astrophysics Data System (ADS)
Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.
2010-12-01
The main methodologies of Solar-Terrestrial Physics (STP) so far have been theoretical, experimental and observational, and computer simulation approaches. Recently, "informatics" has been anticipated as a new (fourth) approach to STP studies: a methodology for analyzing large-scale data (observational and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". The OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). The JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (APs) over Japan. The OneSpaceNet also provides rich computer resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (like a tiled display wall: TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. A remarkable aspect of the science cloud is that a user needs only to prepare a terminal (a low-cost PC); once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as a video-conference system, streaming and reflector servers, and media players, users on the OneSpaceNet can carry out research communications as if they belonged to the same (one) laboratory: they are members of a virtual laboratory. The specification of the computer resources on the OneSpaceNet is as follows. The data storage we have developed so far is almost 1 PB in size, and the number of data files managed on the cloud storage is growing and now exceeds 40,000,000. Notably, the disks forming the large-scale storage are distributed across 5 data centers over Japan, but the storage system performs as one disk. Three supercomputers are allocated on the cloud: one in Tokyo, one in Osaka, and the other in Nagoya. Simulation job data from any of the supercomputers are saved on the cloud data storage (in the same directory); it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display; its pixel (resolution) size is as large as 18000x4300. This size is sufficient to preview or analyze large-scale computer simulation data, and it also allows many researchers together to look at multiple images (e.g., 100 pictures) on one screen. In our talk we also present a brief report of initial results using the OneSpaceNet for Global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time resolution visualization of Global MHD simulations using the large-scale storage and parallel processing system on the cloud, (ii) a database of real-time Global MHD simulations and statistical analyses of the data, and (iii) a 3D Web service of Global MHD simulations.
NASA Astrophysics Data System (ADS)
Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri
2007-10-01
Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows the development of a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed around multiple independent computer clusters in six countries, an approach that has potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence is intensified monotonically through increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high intensity fires with a somewhat limited extent, it is possible for the wet season to be significantly enhanced.
The Caltech Concurrent Computation Program - Project description
NASA Technical Reports Server (NTRS)
Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.
1985-01-01
The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program is to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory, and multi-gigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high-energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics, and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.
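The hypercube interconnect used by these machines has a simple combinatorial structure: in a d-dimensional hypercube of 2^d nodes, node i is directly linked to the d nodes whose binary labels differ from i in exactly one bit, and the minimum route length between two nodes is the Hamming distance of their labels. A minimal sketch of the topology (an illustration only, not the Caltech systems software):

```python
def hypercube_neighbors(node: int, dim: int) -> list[int]:
    """Direct neighbors of `node` in a dim-dimensional hypercube (2**dim nodes):
    flip each of the dim label bits in turn."""
    return [node ^ (1 << k) for k in range(dim)]

def hypercube_distance(a: int, b: int) -> int:
    """Minimum hop count between two nodes: the Hamming distance of their labels."""
    return bin(a ^ b).count("1")
```

For the 128-node machine (dim = 7), every node has exactly 7 links and the worst-case route is 7 hops, which is why the topology scales gracefully to the planned 1024-node systems.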
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Computational Science News | Computational Science | NREL
February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system.
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
Conceptual Modeling in the Time of the Revolution: Part II
NASA Astrophysics Data System (ADS)
Mylopoulos, John
Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on ongoing research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.
Final Report. Institute for Ultrascale Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois
The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to develop advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists' ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received opportunities to work on some of the most challenging science applications and gained access to the most powerful high-performance computing facilities in the world. They were trained and prepared for the greater challenges presented by extreme-scale computing. The Institute's outreach efforts, through publications, workshops, and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.
Cantürk, İsmail; Özyılmaz, Lale
2018-07-01
This paper presents an approach to postmortem interval (PMI) estimation, a much-debated and complicated area of forensic science. Most of the PMI determination methods reported in the literature are impractical because they require skilled personnel and significant amounts of time, and they give unsatisfactory results. Additionally, the error margin of PMI estimation increases with elapsed time after death, so it is crucial to develop practical PMI estimation methods for forensic science. In this study, a computational system is developed to determine the PMI of human subjects by investigating postmortem opacity development of the eye. Relevant features were extracted from eye images using image-processing techniques to capture gradual opacity development, and these features were then used to predict the time after death using machine-learning methods. The experimental results show that the development of opacity can be used as a practical computational tool to determine PMI for human subjects.
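The pipeline described (image features that track opacity, fed to a learned regressor) can be illustrated with a minimal sketch on synthetic data. The `opacity_feature` proxy (mean brightness of the central region) and the linear fit below are assumptions for illustration, not the paper's actual features or models:

```python
import numpy as np

def opacity_feature(eye_image: np.ndarray) -> float:
    """Toy opacity proxy: mean brightness of the central (pupil/cornea) region.
    A real system would use richer image-processing features."""
    h, w = eye_image.shape
    center = eye_image[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]
    return float(center.mean())

# Synthetic training data: opacity rises roughly linearly with hours since death.
rng = np.random.default_rng(0)
hours = rng.uniform(0, 48, size=50)
images = [np.full((30, 30), 40 + 3.0 * t) + rng.normal(0, 2, (30, 30)) for t in hours]
features = np.array([opacity_feature(im) for im in images])

# Least-squares fit: hours ~ a * feature + b (a simple stand-in for the
# machine-learning regressors used in the paper).
a, b = np.polyfit(features, hours, deg=1)

def estimate_pmi(eye_image: np.ndarray) -> float:
    """Predict hours since death from the fitted feature-to-time model."""
    return a * opacity_feature(eye_image) + b
```

On a noise-free probe image generated from the same linear model, the estimate recovers the underlying time to within the fit's small residual error.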
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines.
There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. 
We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)
NASA Astrophysics Data System (ADS)
Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.
2005-12-01
Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computation, and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, geoTIFF images, ASCII files, relational databases, etc.), software applications, and ontologies. Portal-based access mechanisms enable developers to build dynamic user interfaces for conducting advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets, which were developed to support better and more comprehensive scientific and educational studies. For example, the SYNSEIS portlet in GEON enables users to access seismic waveforms from the IRIS Data Management Center in near-real time, easily build a 3D geologic model within the area spanning the seismic station(s) and the epicenter, and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment; copy, visualize, and analyze any data sets within the network; and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-Oriented Architecture (SOA), they can also be used in other development platforms. One such platform is the Kepler workflow system, which can access web-service-based resources and provides users with graphical programming interfaces to build models for computation and/or visualization using the networked resources.
Developments in the area of semantic integration of the networked datasets continue to advance, and prototype studies can be accessed via the GEON portal at www.geongrid.org.
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Srivastava, Akanksha
2013-01-01
This paper presents a survey of innovative approaches among the most effective computational techniques for solving singularly perturbed partial differential equations, which are useful because of their numerical and computer realizations. Many applied problems arising in semiconductor theory, biochemistry, kinetics, the theory of electrical chains, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other fields can be modelled as singularly perturbed systems. Here we summarize a wide range of research articles published by numerous researchers during the last ten years to give a better view of the present state of this area of research.
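As a concrete example of the kind of technique surveyed, here is a simple upwind finite-difference scheme for the model singularly perturbed convection-diffusion problem -eps*u'' + u' = 1 on (0,1) with u(0) = u(1) = 0, whose solution has a boundary layer of width O(eps) at x = 1. This is a minimal illustration under stated assumptions, not drawn from any particular surveyed paper:

```python
import numpy as np

def solve_upwind(eps, n):
    """Upwind finite differences for -eps*u'' + u' = 1, u(0)=u(1)=0, on n+1
    uniform mesh points. The backward difference on the convection term keeps
    the scheme stable even when h does not resolve the O(eps) boundary layer."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    A = np.zeros((n - 1, n - 1))
    f = np.ones(n - 1)
    for i in range(n - 1):
        A[i, i] = 2 * eps / h**2 + 1 / h          # coefficient of u_i
        if i > 0:
            A[i, i - 1] = -eps / h**2 - 1 / h     # coefficient of u_{i-1}
        if i < n - 2:
            A[i, i + 1] = -eps / h**2             # coefficient of u_{i+1}
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)               # interior unknowns
    return x, u
```

Away from the layer the computed solution follows the reduced problem u' = 1, u(0) = 0 (i.e., u(x) approximately x), then drops sharply to satisfy u(1) = 0; the upwind scheme pays for its robustness with extra numerical diffusion that smears the layer over a few cells.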
Mathematics and statistics research department. Progress report, period ending June 30, 1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lever, W.E.; Kane, V.E.; Scott, D.S.
1981-09-01
This report is the twenty-fourth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation - Nuclear Division (UCC-ND). Part A records research progress in biometrics research, materials science applications, model evaluation, moving boundary problems, multivariate analysis, numerical linear algebra, risk analysis, and complementary areas. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology and health sciences, chemistry, energy, engineering, environmental sciences, health and safety research, materials sciences, safeguards, surveys, and uranium resource evaluation. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
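The kind of static analysis Hairball performs over Scratch programs can be illustrated with a toy walk over a block tree. The representation (each block as an opcode plus nested child blocks) and the function names below are hypothetical stand-ins for illustration, not Hairball's actual API:

```python
def count_blocks(script, counts=None):
    """Walk a script tree and tally opcodes, descending into nested blocks
    (the bodies of loops and conditionals). A script is a list of
    (opcode, children) pairs, where children is itself such a list."""
    if counts is None:
        counts = {}
    for opcode, children in script:
        counts[opcode] = counts.get(opcode, 0) + 1
        count_blocks(children, counts)
    return counts

def uses_iteration(script):
    """A Hairball-style check: does the script use any looping construct?
    Useful for verifying that students practiced the concept a lesson targets."""
    counts = count_blocks(script)
    return any(op in counts for op in ("forever", "repeat", "repeatUntil"))
```

A curriculum author could run such checks over a whole class's submissions to see which concepts were actually exercised, which is what makes rapid, wide-scale curriculum iteration feasible.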
Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)
1995-01-01
We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.
Metacognition: computation, biology and function
Fleming, Stephen M.; Dolan, Raymond J.; Frith, Christopher D.
2012-01-01
Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
Yearning to Give Back: Searching for Social Purpose in Computer Science and Engineering.
Carrigan, Coleen M
2017-01-01
Computing is highly segregated and stratified by gender. While there is abundant scholarship investigating this problem, emerging evidence suggests that a hierarchy of value exists between the social and technical dimensions of Computer Science and Engineering (CSE) and that this plays a role in the underrepresentation of women in the field. This ethnographic study of women's experiences in computing offers evidence of a systemic preference for the technical dimensions of computing over the social and a correlation between gender and social aspirations. Additionally, it suggests there is a gap between the exaltation of computing's social contributions and the realities of them. My participants expressed a yearning to contribute to the collective well-being of society using their computing skills. I trace moments of rupture in my participants' stories, moments when they felt these aspirations were in conflict with the cultural values in their organizations. I interpret these ruptures within a consideration of yearning, a need my participants had to contribute meaningfully to society that remained unfulfilled. The yearning to align one's altruistic values with one's career aspirations in CSE illuminates an area for greater exploration on the path to realizing gender equity in computing. I argue that before a case can be made that careers in computing do indeed contribute to social and civil engagements, we must first address the meaning of the social within the values, ideologies and practices of CSE institutions and next, develop ways to measure and evaluate the field's contributions to society.
A Parallel Processing Algorithm for Remote Sensing Classification
NASA Technical Reports Server (NTRS)
Gualtieri, J. Anthony
2005-01-01
A current thread in parallel computation is the use of cluster computers created by networking a few to thousands of commodity general-purpose workstation-level computers running the Linux operating system. For example, the Medusa cluster at NASA/GSFC provides supercomputing performance, 130 Gflops (Linpack benchmark), at moderate cost, $370K. However, to be useful for scientific computing in the area of Earth science, issues of ease of programming, access to existing scientific libraries, and portability of existing code need to be considered. In this paper, I address these issues in the context of tools for rendering Earth science remote sensing data into useful products. In particular, I focus on a problem that can be decomposed into a set of independent tasks, which on a serial computer would be performed sequentially, but which with a cluster computer can be performed in parallel, giving an obvious speedup. To make the ideas concrete, I consider the problem of classifying hyperspectral imagery where some ground truth is available to train the classifier, using the Support Vector Machine (SVM) approach. I first introduce notions of parallel computation and then restrict the development to the SVM problem. Pseudocode (an outline of the computation) is described, and then details specific to the implementation are given. Timing results are reported to show what speedups are possible using parallel computation. The paper closes with a discussion of the results.
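The decomposition into independent tasks can be sketched as follows: split the image's pixels into blocks, classify each block independently, and reassemble the label map. This is an illustrative stand-in under stated assumptions (a nearest-centroid labeler in place of the trained SVM, and a thread pool in place of a Linux cluster), not the paper's implementation:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def nearest_centroid(pixels, centroids):
    """Label each pixel spectrum with the index of the closest class centroid.
    pixels: (n, bands); centroids: (k, bands). A stand-in for the SVM:
    the task decomposition, not the classifier, is the point here."""
    d = ((pixels[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def classify_parallel(image, centroids, workers=4):
    """Split an (rows, cols, bands) image into independent pixel blocks and
    classify the blocks concurrently; each block needs no data from the others."""
    rows, cols, bands = image.shape
    chunks = np.array_split(image.reshape(rows * cols, bands), workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(lambda c: nearest_centroid(c, centroids), chunks))
    return np.concatenate(parts).reshape(rows, cols)
```

Because the blocks share nothing, the same structure maps directly onto cluster nodes, where the speedup is bounded only by the per-node work and the cost of scattering the image and gathering the labels.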
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the students' attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data were analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Engineering and physical sciences in oncology: challenges and opportunities.
Mitchell, Michael J; Jain, Rakesh K; Langer, Robert
2017-11-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
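The abstract does not reproduce the programs used at the event. As a purely hypothetical illustration of the kind of exercise it describes, a short script that a supervised child could edit and rerun might look like this (the name, repeat count, and greeting are invented for the example):

```python
# Hypothetical beginner exercise of the kind described in the abstract:
# a visitor edits the two values below and reruns the program to see
# how the output changes.
name = "Dundee"   # <-- try changing this to your own name
times = 3         # <-- try changing how many greetings are printed

greetings = []
for _ in range(times):
    greetings.append("Hello from " + name + "!")

print("\n".join(greetings))
```

Exercises of this shape give an immediate, visible result from a one-line edit, which suits very young first-time programmers.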
Computer Science and Telecommunications Board summary of activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumenthal, M.S.
1992-03-27
The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed to be needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.
Cheyney University Curriculum and Infrastructure Enhancement in STEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eva, Sakkar Ara
Cheyney University is the oldest historically Black educational institution in America. Initially established as a “normal” school emphasizing the matriculation of educators, Cheyney has become a comprehensive university, one of 14 state universities comprising the Pennsylvania State System of Higher Education (PASSHE). Cheyney University graduates still become teachers, but they also enter such fields as journalism, medicine, science, mathematics, law, communication and government. Cheyney University is a small state-owned HBCU with very limited resources. At present the university has about a thousand students, with 15% in STEM. The CUCIES II grant made a significant contribution in saving the computer science program from being discontinued at the university. The grant enabled the university to hire a temporary faculty member to teach in and update the computer science program. The program is enhanced with three tracks: cyber security, human-computer interaction, and general. The updated and enhanced computer science program will prepare professionals in the area of computer science with the knowledge, skills, and professional ethics needed for the current market. The new curriculum was developed for a professional profile that would focus on the technologies and techniques currently used in the industry. With faculty on board, the university worked with the department to bring the computer science program back from moratorium. Once on the path of being discontinued and losing students, the program is now growing. Currently the student number has increased from 12 to 30. The university is currently in the process of hiring a tenure-track faculty member in the computer science program. Another product of the grant is the proposal for an introductory course in nanotechnology. The course is intended to generate interest in the nanotechnology field.
The Natural and Applied Science department, which houses all of the STEM programs at Cheyney University, is currently working to bring the environmental science program back from moratorium. The university has been working to improve minority participation in STEM and has made significant strides in terms of progressing students toward graduate programs and into the professoriate track. This success is due to faculty mentors who work closely with students, guiding them through the application processes for research internships and graduate programs; it is also due to the university forming collaborative agreements with research-intensive institutions, federal and state agencies, and industry. The grant assisted in recruiting and retaining students in STEM by offering tuition scholarships, research scholarships and travel awards. Faculty professional development was supported by the grant through funding travel to conferences, meetings and webinars. Like many HBCUs, Cheyney University is also trying to do more with less. As the STEM programs are inherently expensive, these are the ones that suffer most when resources are scarce. One of the goals of the Cheyney University strategic plan is to strengthen STEM programs that are coherent with the critical skill needs of the Department of Energy. All of the Cheyney University STEM programs are now located in the new science building funded by the state of Pennsylvania.
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
NASA Technical Reports Server (NTRS)
1998-01-01
This report highlights the challenging work accomplished during fiscal year 1997 by Ames research scientists and engineers. The work is divided into accomplishments that support the goals of NASA's four Strategic Enterprises: Aeronautics and Space Transportation Technology, Space Science, Human Exploration and Development of Space (HEDS), and Earth Science. NASA Ames Research Center's research effort in the Space, Earth, and HEDS Enterprises is focused in large part to support Ames' lead role for Astrobiology, which broadly defined is the scientific study of the origin, distribution, and future of life in the universe. This NASA initiative in Astrobiology is a broad science effort embracing basic research, technology development, and flight missions. Ames contributions to the Space Science Enterprise are focused in the areas of exobiology, planetary systems, astrophysics, and space technology. Ames supports the Earth Science Enterprise by conducting research and by developing technology with the objective of expanding our knowledge of the Earth's atmosphere and ecosystems. Finally, Ames supports the HEDS Enterprise by conducting research, managing spaceflight projects, and developing technologies. A key objective is to understand the phenomena surrounding the effects of gravity on living things. Ames has also been designated the Agency's Center of Excellence for Information Technology. The three cornerstones of Information Technology research at Ames are automated reasoning, human-centered computing, and high performance computing and networking.
NASA Technical Reports Server (NTRS)
Koh, Severino L. (Editor); Speziale, Charles G. (Editor)
1989-01-01
Various papers on recent advances in engineering science are presented. Some individual topics addressed include: advances in adaptive methods in computational fluid mechanics, mixtures of two micromorphic materials, computer tests of rubber elasticity, shear bands in isotropic micropolar elastic materials, nonlinear surface wave and resonator effects in magnetostrictive crystals, simulation of electrically enhanced fibrous filtration, plasticity theory of granular materials, dynamics of viscoelastic media with internal oscillators, postcritical behavior of a cantilever bar, boundary value problems in nonlocal elasticity, stability of flexible structures with random parameters, electromagnetic tornadoes in earth's ionosphere and magnetosphere, helicity fluctuations and the energy cascade in turbulence, mechanics of interfacial zones in bonded materials, propagation of a normal shock in a varying area duct, analytical mechanics of fracture and fatigue.
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
NASA Astrophysics Data System (ADS)
Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.
2012-12-01
The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. BAID has been used to: optimize research site choice; reduce duplication of science effort; discover complementary and potentially detrimental research activities in an area of scientific interest; re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; exchange knowledge across disciplines and generations; facilitate communication between western science and traditional ecological knowledge; provide local residents access to science data that facilitates adaptation to arctic change; and educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1.
Collecting data on research activities, generating geospatial data, and providing mapping support. 2. Maintaining, updating and innovating the existing suite of BAID geobrowsers. 3. Maintaining and updating aging server hardware supporting BAID. 4. Adding interoperability with other CI using workflows, controlled vocabularies and web services. 5. Linking BAID to data archives at the National Snow and Ice Data Center (NSIDC). 6. Developing a wireless sensor network that provides web based interaction with near-real time climate and other data. 7. Training next generation of environmental and computer scientists and conducting outreach.
NASA Astrophysics Data System (ADS)
Delgado, Francisco
2017-12-01
Quantum information is an emergent area merging physics, mathematics, computer science and engineering. To reach its technological goals, it requires adequate approaches to understand how to combine physical restrictions, computational approaches and technological requirements to obtain functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, and establishes a dynamics reduction in a natural grammar for bipartite processing based on entangled states.
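The abstract does not write the Hamiltonian out. As an illustrative assumption, a general bipartite (two-qubit) Hamiltonian of the kind analysed in such work can be expanded in the Pauli basis:

```latex
H = \sum_{j,k=0}^{3} h_{jk}\, \sigma_j \otimes \sigma_k
```

where \sigma_0 is the 2x2 identity, \sigma_{1,2,3} are the Pauli matrices, and the real coefficients h_{jk} encode the physical restrictions of the particular system; entangled eigenstates can arise whenever the cross terms (j, k >= 1) do not factorize into a product of single-qubit operators.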
NASA Astrophysics Data System (ADS)
Esparza, Javier
In many areas of computer science entities can “reproduce”, “replicate”, or “create new instances”. Paramount examples are threads in multithreaded programs, processes in operating systems, and computer viruses, but many others exist: procedure calls create new incarnations of the callees, web crawlers discover new pages to be explored (and so “create” new tasks), divide-and-conquer procedures split a problem into subproblems, and leaves of tree-based data structures become internal nodes with children. For lack of a better name, I use the generic term systems with process creation to refer to all these entities.
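None of the systems listed above is given as code in the text. As an illustrative sketch of the "process creation" pattern, a divide-and-conquer procedure makes the idea concrete: each call splits its problem and creates two new instances of itself, exactly the kind of entity-spawning behavior the term covers. The merge sort chosen here is an example of my own, not one from the source.

```python
# Illustrative divide-and-conquer merge sort: each recursive call
# "creates" two child subproblems, mirroring the process-creation
# pattern (threads, procedure calls, crawler tasks) described above.
def merge(left, right):
    # Combine two sorted lists into one sorted list.
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_sort(xs):
    if len(xs) <= 1:      # base case: no new subproblems are created
        return list(xs)
    mid = len(xs) // 2
    # Two new "child" instances of the procedure are created here; in a
    # concurrent setting each could run as its own thread or process.
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))

result = merge_sort([5, 2, 9, 1, 5, 6])
```

Analyses of such systems must account for an unbounded number of simultaneously live instances, which is precisely what makes their verification interesting.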
Philosophical approaches to the nursing informatics data-information-knowledge-wisdom framework.
Matney, Susan; Brewster, Philip J; Sward, Katherine A; Cloyes, Kristin G; Staggers, Nancy
2011-01-01
Although informatics is an important area of nursing inquiry and practice, few scholars have articulated the philosophical foundations of the field or how these translate into practice including the often-cited data, information, knowledge, and wisdom (DIKW) framework. Data, information, and knowledge, often approached through postpositivism, can be exhibited in computer systems. Wisdom aligns with constructivist epistemological perspectives such as Gadamerian hermeneutics. Computer systems can support wisdom development. Wisdom is an important element of the DIKW framework and adds value to the role of nursing informaticists and nursing science.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
One-to-one iPad technology in the middle school mathematics and science classrooms
NASA Astrophysics Data System (ADS)
Bixler, Sharon G.
Science, technology, engineering, and mathematics (STEM) education has become an emphasized component of PreK-12 education in the United States. The US is struggling to produce enough science, mathematics, and technology experts to meet its national and global needs, and the mean scores of science and mathematics students are not meeting the expected levels desired by our leaders (Hossain & Robinson, 2011). In an effort to improve achievement scores in mathematics and science, school districts must consider many components that can contribute to the development of a classroom where students are engaged and growing academically. Computer technology (CT) for student use is a popular avenue for school districts to pursue in their goal to attain higher achievement. The purpose of this study is to examine the use of iPads in a one-to-one setting, where every student has his own device 24/7, to determine the effects, if any, on academic achievement in the areas of mathematics and science. This comparison study used hierarchical linear modeling (HLM) to examine three middle schools in a private school district. Two of the schools have implemented a one-to-one iPad program with their sixth through eighth grades and the third school uses computers on limited occasions in the classroom and in a computer lab setting. The questions addressed were what effect, if any, do the implementation of a one-to-one iPad program and a teacher's perception of his use of constructivist teaching strategies have on student academic achievement in the mathematics and science middle school classrooms. The research showed that although the program helped promote the use of constructivist activities through the use of technology, the one-to-one iPad initiative had no effect on academic achievement in the middle school mathematics and science classrooms.
ERIC Educational Resources Information Center
Zhang, Yulei; Dang, Yan
2015-01-01
Web development is an important component in the curriculum of computer science and information systems areas. However, it is generally considered difficult to learn among students. In this study, we examined factors that could influence students' perceptions of accomplishment and enjoyment and their intention to learn in the web development…
Experiences in Digital Circuit Design Courses: A Self-Study Platform for Learning Support
ERIC Educational Resources Information Center
Bañeres, David; Clarisó, Robert; Jorba, Josep; Serra, Montse
2014-01-01
The synthesis of digital circuits is a basic skill in all the bachelor programmes around the ICT area of knowledge, such as Computer Science, Telecommunication Engineering or Electrical Engineering. An important hindrance in the learning process of this skill is that the existing educational tools for the design of circuits do not allow the…
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1984-01-01
The research efforts of University of Virginia students under a NASA sponsored program are summarized and the status of the program is reported. The research includes: testing method evaluations for N version programming; a representation scheme for modeling three dimensional objects; fault tolerant protocols for real time local area networks; performance investigation of Cyber network; XFEM implementation; and vectorizing incomplete Cholesky conjugate gradients.
Credit by Examination at the University of Texas at Austin, 1985-1986.
ERIC Educational Resources Information Center
Mahoney, Susan S.
The University of Texas (UT) at Austin's credit by examination program is described. In 1985-86, credit by examination was offered in 55 subjects. Details were provided for each of 18 subject areas in which over 20 tests were administered: Biology; Chemistry; Chinese; Computer Science; Economics; Electrical Engineering; English; French; German;…
1988-06-27
Keywords: optical artificial intelligence; optical inference engines; optical logic; optical information processing. … They arise in areas such as expert systems and other artificial intelligence systems. In recent years, the computer science language PROLOG has … optical processors should in principle be well suited for artificial intelligence applications. In recent years, symbolic logic processing …
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duda, R.O.; Shortliffe, E.H.
1983-04-15
Artificial intelligence, long a topic of basic computer science research, is now being applied to problems of scientific, technical, and commercial interest. Some consultation programs, although limited in versatility, have achieved levels of performance rivaling those of human experts. A collateral benefit of this work is the systematization of previously unformalized knowledge in areas such as medical diagnosis and geology. 30 references.
Library Automation Design for Visually Impaired People
ERIC Educational Resources Information Center
Yurtay, Nilufer; Bicil, Yucel; Celebi, Sait; Cit, Guluzar; Dural, Deniz
2011-01-01
Speech synthesis is a technology used in many different areas of computer science. Through its text-to-speech conversion, this technology can offer a solution to the reading activity of visually impaired people. Based on this problem, in this study, a system is designed to enable a visually impaired person to make use of all the library facilities in…
PDAs and Handhelds: ICT at Your Side and Not in Your Face
ERIC Educational Resources Information Center
Wishart, Jocelyn; Ramsden, Andy; McFarlane, Angela
2007-01-01
In order to evaluate the potential of Personal Digital Assistants (PDAs) or handheld computers to support initial teacher training (ITT), 14 science teacher trainees at the Graduate School of Education in the University of Bristol were given PDAs with mobile phone connectivity to use throughout the academic year. The following areas were…
Language Maintenance on the Internet
ERIC Educational Resources Information Center
Ward, Judit Hajnal; Agocs, Laszlo
2004-01-01
Due to the expanding use of computer networks in Hungary, the Hungarian language has become a grown-up member of the World Wide Web and the Internet. In the past few years, the number of web pages written in Hungarian has significantly increased, since all areas of business, science, education, culture, etc., are eager to make use of the evolving…
ERIC Educational Resources Information Center
Dunn, Linda W.; Corn, Anne L.; Morelock, Martha J.
2004-01-01
This investigation compared fantasy-proneness levels and IQ scores in gifted adolescents with primary talent areas in 1 of 4 domains: mathematics, computer science, creative writing, and chemistry. The Inventory of Childhood Memories and Imaginings: Children's Form (ICMIC; Myers, 1983) was used to assess fantasy-proneness. IQ scores were generated…
1985 Annual Technical Report: A Research Program in Computer Technology. July 1984--June 1985.
ERIC Educational Resources Information Center
University of Southern California, Marina del Rey. Information Sciences Inst.
Summaries of research performed by the Information Sciences Institute at the University of Southern California for the U.S. Department of Defense Advanced Research Projects Agency in 17 areas are provided in this report: (1) Common LISP framework, an exportable version of the Formalized Software Development (FSD) testbed; (2) Explainable Expert…
Opportunities for Computational Discovery in Basic Energy Sciences
NASA Astrophysics Data System (ADS)
Pederson, Mark
2011-03-01
An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Science (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.
Research | Computational Science | NREL
Research: NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples
How robotics programs influence young women's career choices : a grounded theory model
NASA Astrophysics Data System (ADS)
Craig, Cecilia Dosh-Bluhm
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.
NASA Astrophysics Data System (ADS)
de Groot, R.
2008-12-01
The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.
The fourth International Conference on Information Science and Cloud Computing
NASA Astrophysics Data System (ADS)
This book comprises the papers accepted by the fourth International Conference on Information Science and Cloud Computing (ISCC), which was held from 18-19 December 2015 in Guangzhou, China. It contains 70 papers divided into four parts. The first part, on Information Theory, has 20 papers; the second, on Machine Learning, has 21 papers; the third has 21 papers in the area of Control Science; and the last part, with 8 papers, is dedicated to Cloud Science. Each part can be used as an excellent reference by engineers, researchers and students who need to build a knowledge base of the most current advances and state-of-practice in the topics covered by the ISCC conference. Special thanks go to Professor Deyu Qi, General Chair of ISCC 2015, for his leadership in supervising the organization of the entire conference; to Professor Tinghuai Ma, Program Chair, and the members of the program committee for evaluating all the submissions and ensuring the selection of only the highest quality papers; and to the authors for sharing their ideas, results and insights. We sincerely hope that you enjoy reading the papers included in this book.
1996 NASA-ASEE-Stanford Summer Faculty Fellowship Program. Part 1
NASA Technical Reports Server (NTRS)
1996-01-01
As is customary, the final technical report for the NASA-ASEE Summer Faculty Fellowship Program at the Ames Research Center, Dryden Flight Research Center and Stanford University essentially consists of a compilation of the summary technical reports of all the fellows. More extended versions done either as NASA publications, archival papers, or other laboratory reports are not included here. The reader will note that the areas receiving emphasis were the life sciences, astronomy, remote sensing, aeronautics, fluid dynamics/aerophysics, and computer science. Of course, the areas of emphasis vary somewhat from year to year depending on the interests of the most qualified applicants. Once again, the work is of especially high quality. The reports of the first and second year fellows are grouped separately and are arranged alphabetically within each group.
PREFACE: International Conference on Applied Sciences (ICAS2014)
NASA Astrophysics Data System (ADS)
Lemle, Ludovic Dan; Jiang, Yiwen
2015-06-01
The International Conference on Applied Sciences (ICAS2014) took place in Hunedoara, Romania from 2-4 October 2014 at the Engineering Faculty of Hunedoara. The conference takes place alternately in Romania and in P.R. China and is organized by "Politehnica" University of Timisoara, Romania, and the Military Economics Academy of Wuhan, P.R. China, with the aim of serving as a platform for the exchange of information between various areas of applied sciences and of promoting communication between scientists of different nations, countries and continents. The topics of the conference covered a comprehensive spectrum of issues: 1. Economical Sciences 2. Engineering Sciences 3. Fundamental Sciences 4. Medical Sciences The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge with the potential for application in economics, defense, medicine, etc. There were nearly 100 registered participants from six countries, and four invited and 56 oral talks were delivered during the two days of the conference. Based on the work presented at the conference, selected papers are included in this volume of IOP Conference Series: Materials Science and Engineering. These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computer Engineering, and Mathematical Engineering. It is our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in their respective fields.
2011 Computation Directorate Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2012-04-11
From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence.
Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile-far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. 
industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global marketplace by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Girls Save the World through Computer Science
ERIC Educational Resources Information Center
Murakami, Christine
2011-01-01
It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…
ERIC Educational Resources Information Center
Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2015-01-01
The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…
Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study
ERIC Educational Resources Information Center
Herling, Lourdes
2011-01-01
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…
ERIC Educational Resources Information Center
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-01-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…
An Investigation of Primary School Science Teachers' Use of Computer Applications
ERIC Educational Resources Information Center
Ocak, Mehmet Akif; Akdemir, Omur
2008-01-01
This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…
Methodical Approaches to Teaching of Computer Modeling in Computer Science Course
ERIC Educational Resources Information Center
Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina
2015-01-01
The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…
Climate Modeling Computing Needs Assessment
NASA Astrophysics Data System (ADS)
Petraska, K. E.; McCabe, J. D.
2011-12-01
This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.
Area collapse algorithm computing new curve of 2D geometric objects
NASA Astrophysics Data System (ADS)
Buczek, Michał Mateusz
2017-06-01
The processing of cartographic data demands human involvement. Up-to-date algorithms try to automate a part of this process. The goal is to obtain a digital model, or additional information about the shape and topology of input geometric objects. A topological skeleton is one of the most important tools in the branch of science called shape analysis. It represents the topological and geometrical characteristics of input data. Its computation relies on algorithms such as medial axis, skeletonization, erosion, thinning, area collapse and many others. Area collapse, also known as dimension change, replaces input data with lower-dimensional geometric objects, for example a polygon with a polygonal chain, or a line segment with a point. The goal of this paper is to introduce a new algorithm for the automatic calculation of polygonal chains representing a 2D polygon. The output is entirely contained within the area of the input polygon, and it is a linear chain without branches. The computational process is automatic and repeatable. The requirements on input data are discussed. The author analyzes results based on the method used to compute the ends of the output polygonal chains. Additional methods to improve results are explored. The algorithm was tested on real-world cartographic data received from BDOT/GESUT databases, and on point clouds from laser scanning. An implementation for computing the hatching of embankments is described.
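The dimension-change step described in the abstract can be illustrated with a toy sketch. The pairing-of-sides approach below is a deliberate simplification for an elongated polygon whose two boundary sides are already matched point-for-point; it is not the paper's algorithm, and all names are illustrative:

```python
# Toy sketch of area collapse (dimension change): replace an elongated
# 2D polygon with a 1D polygonal chain contained inside it. Assumes the
# polygon boundary is supplied as two matched sides; the paper's method
# handles general polygons and the selection of chain end points.

def collapse_to_chain(side_a, side_b):
    """Pair up boundary points from the two sides and return their
    midpoints, giving a branch-free polygonal chain in the interior."""
    assert len(side_a) == len(side_b), "sides must be matched point-for-point"
    return [((ax + bx) / 2.0, (ay + by) / 2.0)
            for (ax, ay), (bx, by) in zip(side_a, side_b)]

# An elongated quadrilateral strip: bottom and top boundary chains.
bottom = [(0.0, 0.0), (2.0, 0.2), (4.0, 0.0)]
top = [(0.0, 1.0), (2.0, 1.2), (4.0, 1.0)]
print(collapse_to_chain(bottom, top))
```

The resulting chain runs along the strip's centerline, which is the behavior the abstract describes for its output: a linear plot with no branches, fully inside the input polygon.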
[Fragment of a staff profile] Research areas include feature extraction, human-computer interaction, and physics-based modeling. Professional Experience: 2009-… Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology.
PACES Participation in Educational Outreach Programs at the University of Texas at El Paso
NASA Technical Reports Server (NTRS)
Dodge, Rebecca L.
1997-01-01
The University of Texas at El Paso (UTEP) is involved in several initiatives to improve science education within the El Paso area public schools. These include outreach efforts into the K- 12 classrooms; training programs for in-service teachers; and the introduction of a strong science core curricula within the College of Education. The Pan American Center for Earth and Environmental Studies (PACES), a NASA-funded University Research Center, will leverage off the goals of these existing initiatives to provide curriculum support materials at all levels. We will use currently available Mission to Planet Earth (MTPE) materials as well as new materials developed specifically for this region, in an effort to introduce the Earth System Science perspective into these programs. In addition, we are developing curriculum support materials and classes within the Geology and Computer Departments, to provide education in the area of remote sensing and GIS applications at the undergraduate and graduate levels.
Conference Grant Proposal for ICOPS 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safronova, Alla
ICOPS (International Conference on Plasma Science) is an internationally renowned, well-attended annual conference that involves topics of direct interest to the Office of Fusion Energy Sciences of the Department of Energy. In particular, ICOPS 2016 emphasized both the traditional areas of plasma science and new areas of growth that include but are not limited to Fusion (Inertial, Magnetic and Alternate Concepts), Particle Acceleration with Laser and Beams, High Energy Density Matter, Laser Produced Plasma, Fast Z-pinches, Computational Plasma Physics, Plasma Diagnostics, and such frontiers as studying Warm Dense Matter using X-ray free electron lasers. Travel support for students at ICOPS usually comes from sponsor organizations. Increasing the participation of outstanding students at ICOPS 2016, first authors selected to receive travel support on the scientific merit of their submitted abstracts, is crucial for fostering the next generation of plasma physicists.
Computer-aided design and computer science technology
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Voigt, S. J.
1976-01-01
A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.
Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.
Yamauchi, Takashi; Xiao, Kunchen
2018-04-01
Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under curve and direction change help infer the emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
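Two of the cursor features named in the abstract, area under the curve and direction change, can be sketched in a few lines. The definitions below are illustrative approximations, not the authors' exact feature set, and assume evenly sampled cursor coordinates:

```python
import math

def auc(points):
    """Approximate area between the cursor path and the straight line from
    the first to the last point: a trapezoidal sum of signed perpendicular
    deviations, assuming unit spacing between consecutive samples."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)  # start and end points must differ
    devs = [((px - x0) * dy - (py - y0) * dx) / length for px, py in points]
    return sum((a + b) / 2.0 for a, b in zip(devs, devs[1:]))

def direction_changes(points):
    """Count sign flips in the horizontal movement direction."""
    steps = [b[0] - a[0] for a, b in zip(points, points[1:])]
    signs = [s for s in steps if s != 0]  # ignore purely vertical steps
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

path = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.4), (1.5, 0.8), (3.0, 1.0)]
print(auc(path), direction_changes(path))
```

A perfectly straight reach gives an AUC near zero and no direction changes; hesitant, wavering trajectories score higher on both, which is the kind of signal the study's models exploit.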
NASA Astrophysics Data System (ADS)
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-07-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.
ICASE Computer Science Program
NASA Technical Reports Server (NTRS)
1985-01-01
The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.
Wang, Zhaocai; Ji, Zuwen; Wang, Xiaoming; Wu, Tunhua; Huang, Wei
2017-12-01
As a promising approach to otherwise computationally intractable problems, DNA computing is an emerging research area spanning mathematics, computer science and molecular biology. The task scheduling problem, a well-known NP-complete problem, assigns n jobs to m individuals and seeks the minimum execution time of the last finished individual. In this paper, we use a biologically inspired computational model and describe a new parallel algorithm to solve the task scheduling problem by basic DNA molecular operations. We design flexible-length DNA strands to represent elements of the allocation matrix, apply appropriate biological experiment operations, and obtain solutions of the task scheduling problem in the proper length range with less than O(n²) time complexity. Copyright © 2017. Published by Elsevier B.V.
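For scale, the problem the abstract targets can be stated in a few lines of conventional code. The brute-force search below defines the makespan objective; it runs in O(m^n) time and is a demonstration only, not the paper's parallel molecular method:

```python
from itertools import product

def min_makespan(jobs, m):
    """Assign n jobs (durations) to m individuals, minimizing the
    finishing time of the last one (the makespan). Exhaustive search
    over all m**n allocations; feasible only for tiny inputs."""
    best = float("inf")
    for assignment in product(range(m), repeat=len(jobs)):
        loads = [0] * m
        for duration, person in zip(jobs, assignment):
            loads[person] += duration
        best = min(best, max(loads))
    return best

print(min_makespan([3, 5, 2, 7], 2))  # optimal makespan is 9 (7+2 vs 3+5)
```

The exponential blow-up of this search is exactly what motivates alternative models of computation such as the DNA-based approach, where candidate allocations are encoded in strands and explored in parallel.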
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablonowski, Christiane
The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have lain on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
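The core AMR idea, refining the mesh only where the solution varies rapidly, can be shown with a one-dimensional toy. Flagging cells by a gradient threshold is a standard textbook criterion, not the project's specific strategy; real libraries such as Chombo manage full block-structured hierarchies in several dimensions:

```python
# Minimal 1D sketch of the refinement criterion behind AMR: flag cells
# whose jump to the neighboring cell exceeds a threshold. In a full AMR
# code, flagged cells would be covered by a finer grid patch.

def flag_for_refinement(values, threshold):
    """Return indices of cells whose difference with the next cell
    exceeds the threshold; these are candidates for subdivision."""
    return [i for i in range(len(values) - 1)
            if abs(values[i + 1] - values[i]) > threshold]

# A coarse 1D field with a sharp front between cells 2 and 3.
field = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
print(flag_for_refinement(field, 1.0))  # -> [2]
```

Only the cell straddling the front is flagged, so computational effort concentrates where the physics demands it, which is the efficiency argument the abstract makes for tropical-cyclone-resolving climate simulation.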
Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
1999-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2000-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
Basic energy sciences: Summary of accomplishments
NASA Astrophysics Data System (ADS)
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy-related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique user facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but they have also significantly aided the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Basic Energy Sciences: Summary of Accomplishments
DOE R&D Accomplishments Database
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy-related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique user facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but they have also significantly aided the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Optical information processing at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Reid, Max B.; Bualat, Maria G.; Cho, Young C.; Downie, John D.; Gary, Charles K.; Ma, Paul W.; Ozcan, Meric; Pryor, Anna H.; Spirkovska, Lilly
1993-01-01
The combination of analog optical processors with digital electronic systems offers the potential of tera-OPS computational performance, while often requiring less power and weight relative to all-digital systems. NASA is working to develop and demonstrate optical processing techniques for on-board, real time science and mission applications. Current research areas and applications under investigation include optical matrix processing for space structure vibration control and the analysis of Space Shuttle Main Engine plume spectra, optical correlation-based autonomous vision for robotic vehicles, analog computation for robotic path planning, free-space optical interconnections for information transfer within digital electronic computers, and multiplexed arrays of fiber optic interferometric sensors for acoustic and vibration measurements.
Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State
ERIC Educational Resources Information Center
Lewis, Colleen Marie
2012-01-01
To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
Infrastructure Systems for Advanced Computing in E-science applications
NASA Astrophysics Data System (ADS)
Terzo, Olivier
2013-04-01
The e-science field has a growing need for computing infrastructure that is more dynamic and customizable, with an "on demand" model of use that follows the exact request in terms of resources and storage capacity. Integrating grid and cloud infrastructure solutions makes it possible to offer services that adapt their availability by scaling resources up and down. The main challenge for e-science domains will be to implement infrastructure solutions for scientific computing that adapt dynamically to the demand for computing resources, with a strong emphasis on optimizing resource use in order to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis all increase the complexity of applications that require high processing power and storage for a limited time, often exceeding the computational resources available to the majority of laboratories and research units in an organization. Very often it is necessary to adapt, rework, or even rethink tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to deploy them on cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics (next-generation sequencing), computational electromagnetics, and radio occultation, the complexity of the analysis raises issues such as processing time, the scheduling of processing tasks, storage of results, and multi-user environments. For these reasons, the writing model of e-science applications must be rethought so that they are already adapted to exploit cloud computing services through the IaaS, PaaS, and SaaS layers.
Another important focus is on creating and using hybrid infrastructure, typically a federation between private and public clouds: when all the resources owned by the organization are in use, a federated cloud infrastructure makes it easy to add resources from the public cloud to meet computational and storage needs, and to release them when processing is finished. Under this hybrid model, the scheduling approach is important for managing both cloud types. Thanks to this infrastructure model, resources are always available for additional IT capacity requests and can be used "on demand" for a limited time, without having to purchase additional servers.
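The burst-and-release allocation rule described in this abstract — fill private capacity first, send only the overflow to the public cloud, and release it when jobs finish — can be sketched as a minimal decision function. The capacity figures and function name are invented for illustration, not part of any real scheduler:

```python
# Minimal sketch of the hybrid private/public allocation rule: satisfy demand
# from private capacity first, then burst only the remainder to the public
# cloud. All numbers are hypothetical.

def allocate(demand, private_capacity):
    private_used = min(demand, private_capacity)
    public_burst = max(0, demand - private_capacity)
    return {"private": private_used, "public": public_burst}

# 80 units of demand against 50 private units: 30 units burst to public cloud.
plan = allocate(80, 50)
```

When demand falls back below private capacity, the public share drops to zero, which models releasing the federated resources once processing finishes.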
A Cognitive Model for Problem Solving in Computer Science
ERIC Educational Resources Information Center
Parham, Jennifer R.
2009-01-01
According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
NASA Center for Computational Sciences: History and Resources
NASA Technical Reports Server (NTRS)
2000-01-01
The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.
Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications, and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry, and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
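The tool chaining this abstract describes can be sketched as a minimal linear workflow in which each step consumes the previous step's output. The step names and data below are invented placeholders, not part of any real workflow system:

```python
# Minimal sketch of a linear scientific workflow: each function stands in for
# a tool or service, and the workflow composes them into one protocol, as a
# workflow engine would. All names and data are illustrative.

def fetch_sequences(source):
    # Stand-in for a database query step (normalizes to upper case).
    return [s.upper() for s in source]

def filter_short(seqs, min_len=4):
    # Stand-in for a quality-filtering tool.
    return [s for s in seqs if len(s) >= min_len]

def summarize(seqs):
    # Stand-in for a final analysis/statistics step.
    return {"count": len(seqs), "total_length": sum(len(s) for s in seqs)}

def run_workflow(source):
    # The workflow itself: a fixed chain of steps over shared data.
    return summarize(filter_short(fetch_sequences(source)))

result = run_workflow(["acgt", "ac", "ggtca"])
```

Real workflow systems add what this sketch omits — distributed execution, provenance tracking, and graphical composition — but the chained-step structure is the same.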
Geological applications and training in remote sensing
NASA Technical Reports Server (NTRS)
Sabins, F. F., Jr.
1981-01-01
Some of the experiences, methods, and opinions developed during 15 years of teaching an introductory course in remote sensing at several universities in the Southern California area are related. Although the course is offered in Geology departments, every class includes significant numbers of students from other disciplines including geography, computer science, biology, and environmental science. The instructor or teaching assistant provides a few hours of tutorial lectures (outside of regular class time) on basic geology for these nongeologists. This approach is successful because the grade distribution for nongeologists is similar to that for geologists. The schedule for a typical one-semester course is given.
Computers in Science: Thinking Outside the Discipline.
ERIC Educational Resources Information Center
Hamilton, Todd M.
2003-01-01
Describes the Computers in Science course, which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. Uses a team-teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and perform chemical and force analyses. (Contains 14…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...
Exemplary Science Teachers' Use of Technology
ERIC Educational Resources Information Center
Hakverdi-Can, Meral; Dana, Thomas M.
2012-01-01
The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
ERIC Educational Resources Information Center
Hung, Yen-Chu
2012-01-01
The instructional value of web-based education systems has been an important area of research in information systems education. This study investigates the effect of various teaching methods on program design learning for students with specific learning styles in web-based education systems. The study takes first-year Computer Science and…
List of Publications of the U.S. Army Engineer Waterways Experiment Station. Volume 2
1993-09-01
List of Publications of the U.S. Army Engineer Waterways Experiment Station, Volume II, compiled by the Research Library, Information Management Division ... manages, conducts, and coordinates research and development in the Information Management (IM) technology areas that include computer science
Accurate Arabic Script Language/Dialect Classification
2014-01-01
Army Research Laboratory. Accurate Arabic Script Language/Dialect Classification, by Stephen C. Tratz, Computational and Information Sciences. ARL-TR-6761, January 2014. Approved for public...
ERIC Educational Resources Information Center
Polo, Blanca J.
2013-01-01
Much research has been done in regards to student programming errors, online education and studio-based learning (SBL) in computer science education. This study furthers this area by bringing together this knowledge and applying it to proactively help students overcome impasses caused by common student programming errors. This project proposes a…
ERIC Educational Resources Information Center
Sayre, Scott Alan
The ultimate goal of the science of artificial intelligence (AI) is to establish programs that will use algorithmic computer techniques to imitate the heuristic thought processes of humans. Most AI programs, especially expert systems, organize their knowledge into three specific areas: data storage, a rule set, and a control structure. Limitations…
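The three-part organization this abstract names — data storage (facts), a rule set, and a control structure — can be illustrated with a toy forward-chaining sketch. All facts and rules here are invented examples, not from the cited work:

```python
# Toy forward-chaining expert system showing the three components the
# abstract describes: facts (data storage), rules (rule set), and an
# inference loop (control structure). All content is illustrative.

facts = {"has_fever", "has_cough"}

# Each rule pairs a set of premises with a conclusion to assert when
# every premise is already known.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

def forward_chain(facts, rules):
    # Control structure: keep firing any rule whose premises are satisfied
    # until no new facts can be derived.
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

result = forward_chain(facts, rules)
```

The chained firing (the second rule depends on the first rule's conclusion) is the heuristic, human-like inference step the abstract alludes to.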
Small business innovation research. Abstracts of completed 1987 phase 1 projects
NASA Technical Reports Server (NTRS)
1989-01-01
Non-proprietary summaries of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA in the 1987 program year are given. Work in the areas of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robotics, computer sciences, information systems, spacecraft systems, spacecraft power supplies, spacecraft propulsion, bioastronautics, satellite communication, and space processing are covered.
An Overview of NASA's Intelligent Systems Program
NASA Technical Reports Server (NTRS)
Cooke, Daniel E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA and the computer science research community are poised to enter a critical era, one in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place computer science in a relatively novel position. These forces impact how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that result in greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel
Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations which are black boxes to students. StarLogo The Next Generation (TNG) was developed to make programming of simulations more accessible for students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy to design and program their own video games. Another change in StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
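The agent-based modeling style that StarLogo supports — many simple agents each following a local rule every tick — can be sketched in a few lines. The one-dimensional random-walk rule below is purely illustrative and is not a StarLogo program:

```python
import random

# Minimal agent-based model: each agent holds a position and applies a local
# rule every tick, as in StarLogo-style turtle models. The rule here (a +/-1
# random walk on a line) is invented for illustration.

class Agent:
    def __init__(self, position=0):
        self.position = position

    def step(self, rng):
        # Local rule: move one unit left or right at random.
        self.position += rng.choice([-1, 1])

def run(num_agents=5, ticks=10, seed=42):
    # A seeded generator makes the simulation reproducible.
    rng = random.Random(seed)
    agents = [Agent() for _ in range(num_agents)]
    for _ in range(ticks):
        for agent in agents:
            agent.step(rng)
    return [a.position for a in agents]

positions = run()
```

Richer models differ only in the local rule and the agents' state; the tick loop over many independent agents is the common skeleton.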
NASA Astrophysics Data System (ADS)
Hudak, Bethany M.
Science, technology, engineering, and mathematics (STEM) education has become an emphasized component of PreK-12 education in the United States. The US is struggling to produce enough science, mathematics, and technology experts to meet its national and global needs, and the mean scores of science and mathematics students are not meeting the expected levels desired by our leaders (Hossain & Robinson, 2011). In an effort to improve achievement scores in mathematics and science, school districts must consider many components that can contribute to the development of a classroom where students are engaged and growing academically. Computer technology (CT) for student use is a popular avenue for school districts to pursue in their goal to attain higher achievement. The purpose of this study is to examine the use of iPads in a one-to-one setting, where every student has his own device 24/7, to determine the effects, if any, on academic achievement in the areas of mathematics and science. This comparison study used hierarchical linear modeling (HLM) to examine three middle schools in a private school district. Two of the schools have implemented a one-to-one iPad program with their sixth through eighth grades and the third school uses computers on limited occasions in the classroom and in a computer lab setting. The questions addressed were what effect, if any, do the implementation of a one-to-one iPad program and a teacher's perception of his use of constructivist teaching strategies have on student academic achievement in the mathematics and science middle school classrooms. The research showed that although the program helped promote the use of constructivist activities through the use of technology, the one-to-one iPad initiative had no effect on academic achievement in the middle school mathematics and science classrooms.
A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.
ERIC Educational Resources Information Center
Deek, Fadi P.; Kimmel, Howard
2002-01-01
Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)
A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…
Making Advanced Computer Science Topics More Accessible through Interactive Technologies
ERIC Educational Resources Information Center
Shao, Kun; Maher, Peter
2012-01-01
Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…
ASCR Workshop on Quantum Computing for Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward
This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.