Sample records for computer science techniques

  1. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  2. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical, and their limitations, are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, applications of the methods to the calculation of adhesion and friction are presented.

  3. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  4. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming students' understanding of modeling methodology in computer science lessons. The necessity of studying computer modeling stems from current trends that strengthen the general educational and worldview functions of computer science, which call for additional research into the…

  5. Computers in Science: Thinking Outside the Discipline.

    ERIC Educational Resources Information Center

    Hamilton, Todd M.

    2003-01-01

    Describes the Computers in Science course, which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. The course uses a team-teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and perform chemical and force analysis. (Contains 14…

  6. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration prompted further reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of the committees and sponsors are available in the PDF.

  7. Integrating Multimedia Techniques into CS Pedagogy.

    ERIC Educational Resources Information Center

    Adams, Sandra Honda; Jou, Richard; Nasri, Ahmad; Radimsky, Anne-Louise; Sy, Bon K.

    Through its grants, the National Science Foundation sponsors workshops that inform faculty of current topics in computer science. Such a workshop, entitled, "Developing Multimedia-based Interactive Laboratory Modules for Computer Science," was given July 27-August 6, 1998, at Illinois State University at Normal. Each participant was…

  8. Group Projects and the Computer Science Curriculum

    ERIC Educational Resources Information Center

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  9. Computer Science Concept Inventories: Past and Future

    ERIC Educational Resources Information Center

    Taylor, C.; Zingaro, D.; Porter, L.; Webb, K. C.; Lee, C. B.; Clancy, M.

    2014-01-01

    Concept Inventories (CIs) are assessments designed to measure student learning of core concepts. CIs have become well known for their major impact on pedagogical techniques in other sciences, especially physics. Presently, there are no widely used, validated CIs for computer science. However, considerable groundwork has been performed in the form…

  10. Can Peer Instruction Be Effective in Upper-Division Computer Science Courses?

    ERIC Educational Resources Information Center

    Bailey Lee, Cynthia; Garcia, Saturnino; Porter, Leo

    2013-01-01

    Peer Instruction (PI) is an active learning pedagogical technique. PI lectures present students with a series of multiple-choice questions, which they respond to both individually and in groups. PI has been widely successful in the physical sciences and, recently, has been successfully adopted by computer science instructors in lower-division,…

  11. Archaeology Through Computational Linguistics: Inscription Statistics Predict Excavation Sites of Indus Valley Artifacts.

    PubMed

    Recchia, Gabriel L; Louwerse, Max M

    2016-11-01

    Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
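
    The co-occurrence approach summarized above can be sketched in a few lines: build a city-by-city co-occurrence matrix from texts, turn counts into dissimilarities, and recover relative positions with multidimensional scaling. The toy corpus, city list, and use of scikit-learn's MDS below are illustrative assumptions, not the authors' actual pipeline.

    ```python
    # Minimal sketch of co-occurrence-based geolocation (illustrative; not the
    # authors' actual pipeline). Assumes a toy corpus, per-document co-occurrence
    # counts, and metric MDS on a precomputed dissimilarity matrix.
    import numpy as np
    from sklearn.manifold import MDS

    documents = [
        "trade between harappa and mohenjo-daro along the river",
        "seals from mohenjo-daro and dholavira share motifs",
        "harappa and rakhigarhi lie in the northern plain",
        "dholavira and lothal are coastal sites",
    ]
    cities = ["harappa", "mohenjo-daro", "dholavira", "rakhigarhi", "lothal"]

    # Count how often each pair of city names appears in the same document.
    n = len(cities)
    cooc = np.zeros((n, n))
    for doc in documents:
        present = [i for i, c in enumerate(cities) if c in doc]
        for i in present:
            for j in present:
                if i != j:
                    cooc[i, j] += 1

    # Turn co-occurrence counts into dissimilarities (more co-mentions -> closer).
    dissim = 1.0 / (1.0 + cooc)
    np.fill_diagonal(dissim, 0.0)

    # Recover relative 2-D positions (up to rotation, reflection, and scale).
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissim)
    for city, (x, y) in zip(cities, coords):
        print(f"{city:>12s}: ({x:+.2f}, {y:+.2f})")
    ```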

  12. Implications of Windowing Techniques for CAI.

    ERIC Educational Resources Information Center

    Heines, Jesse M.; Grinstein, Georges G.

    This paper discusses the use of a technique called windowing in computer assisted instruction to allow independent control of functional areas in complex CAI displays and simultaneous display of output from a running computer program and coordinated instructional material. Two obstacles to widespread use of CAI in computer science courses are…

  13. CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; Albert, Susan L.

    A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…

  14. Effect of Computer Animation Technique on Students' Comprehension of the "Solar System and Beyond" Unit in the Science and Technology Course

    ERIC Educational Resources Information Center

    Aksoy, Gokhan

    2013-01-01

    The purpose of this study is to determine the effect of computer animation technique on academic achievement of students in the "Solar System and Beyond" unit lecture as part of the Science and Technology course of the seventh grade in primary education. The sample of the study consists of 60 students attending to the 7th grade of primary school…

  15. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and Earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for understanding complex scientific data, and a new application of parallel computing via the MPP. A prototype system with these capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS), a data-independent environment for computer graphics data display, to provide easy access for users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  16. Applying service learning to computer science: attracting and engaging under-represented students

    NASA Astrophysics Data System (ADS)

    Dahlberg, Teresa; Barnes, Tiffany; Buch, Kim; Bean, Karen

    2010-09-01

    This article describes a computer science course that uses service learning as a vehicle to accomplish a range of pedagogical and BPC (broadening participation in computing) goals: (1) to attract a diverse group of students and engage them in outreach to younger students to help build a diverse computer science pipeline, (2) to develop leadership and team skills using experiential techniques, and (3) to develop student attitudes associated with success and retention in computer science. First, we describe the course and how it was designed to incorporate good practice in service learning. We then report preliminary results showing a positive impact of the course on all pedagogical goals and discuss the implications of the results for broadening participation in computing.

  17. A Spacelab Expert System for Remote Engineering and Science

    NASA Technical Reports Server (NTRS)

    Groleau, Nick; Colombano, Silvano; Friedland, Peter (Technical Monitor)

    1994-01-01

    NASA's space science program is based on strictly pre-planned activities. This approach does not always result in the best science. We describe an existing computer system that enables space science to be conducted in a more reactive manner through advanced automation techniques recently used on the SLS-2 space shuttle flight of October 1993. Advanced computing techniques, usually developed in the field of artificial intelligence, allow large portions of the scientific investigator's knowledge to be "packaged" in a portable computer to present advice to the astronaut operator. We strongly believe that this technology has wide applicability to other forms of remote science and engineering. In this brief article, we present the technology of remote science/engineering assistance as implemented for the SLS-2 space shuttle flight. We begin with a logical overview of the system (paying particular attention to the implementation details relevant to the use of the embedded knowledge for system reasoning), then describe its use and success in space, and conclude with ideas about possible Earth uses of the technology in the life and medical sciences.

  18. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0099, "Great Computational Intelligence in the Formal Sciences via Analogical Reasoning," Selmer Bringsjord, Rensselaer Polytechnic…. Final performance report, 8 May 2017, covering 15 Oct 2011 to 31 Dec 2016. From the report: computational harnessing of traditional mathematical statistics (as covered, e.g., in Hogg, Craig & McKean 2005) is used to power statistical learning techniques…

  19. What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?

    ERIC Educational Resources Information Center

    Cushion, Steve

    2006-01-01

    We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…

  20. 78 FR 49781 - Notice of Intent To Seek Approval To Establish an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-15

    ... computer and information science and engineering. Awardees will be required to submit annual project... through the use of automated collection techniques or other forms of information technology. DATES...: Title of Collection: Computer and Information Science and Engineering Reporting Requirements. OMB Number...

  1. Computer-Based Imaginary Sciences and Research on Concept Acquisition.

    ERIC Educational Resources Information Center

    Allen, Brockenbrough S.

    To control for interactions in learning research due to subjects' prior knowledge of the instructional material presented, an imaginary curriculum was presented with a computer-assisted technique based on Carl Bereiter's imaginary science of Xenograde systems. The curriculum consisted of a classification system for ten conceptual classes of…

  2. Structural biology computing: Lessons for the biomedical research sciences.

    PubMed

    Morin, Andrew; Sliz, Piotr

    2013-11-01

    The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.

  3. Science Notes.

    ERIC Educational Resources Information Center

    Murray, A. J. S.; And Others

    1988-01-01

    Presents 31 science activities for use with high school or college science classes. Topics included are: chromatography, ecology, invertebrates, enzymes, genetics, botany, creep, crystals, diffusion, computer interfaces, acid rain, teaching techniques, chemical reactions, waves, electric fields, rainbows, electricity, magnetic fields, and a Pitot…

  4. Global Journal of Computer Science and Technology. Volume 1.2

    ERIC Educational Resources Information Center

    Dixit, R. K.

    2009-01-01

    Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…

  5. Global Journal of Computer Science and Technology. Volume 9, Issue 5 (Ver. 2.0)

    ERIC Educational Resources Information Center

    Dixit, R. K.

    2010-01-01

    This is a special issue published in version 1.0 of "Global Journal of Computer Science and Technology." Articles in this issue include: (1) [Theta] Scheme (Orthogonal Milstein Scheme), a Better Numerical Approximation for Multi-dimensional SDEs (Klaus Schmitz Abe); (2) Input Data Processing Techniques in Intrusion Detection…

  6. Feasibility Study of a Vision-Based Landing System for Unmanned Fixed-Wing Aircraft

    DTIC Science & Technology

    2017-06-01

    This thesis examines the feasibility of applying computer vision techniques and visual feedback in the control loop for an autonomous system, and their integration into an autonomous aircraft control system. Subject terms: autonomous systems, auto-land, computer vision, image processing. References include International Journal of Computer Science and Network Security 7, no. 3: 112-117 (accessed April 7, 2017; http://www.sciencedirect.com/science/article/pii…).

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553, a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.

  8. Girls in computer science: A female only introduction class in high school

    NASA Astrophysics Data System (ADS)

    Drobnis, Ann W.

    This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.

  9. Technologies for Army Knowledge Fusion

    DTIC Science & Technology

    2004-09-01

    By Richard Scherl (Department of Computer Science, Monmouth University) and Dana L. Ulery (Computational and Information Sciences…). Knowledge fusion, also called information fusion and multisensor data fusion, names the body of techniques needed to combine data from civilian and military sources, interpret it in context, and understand the implications (Alberts et al., 2002); the knowledge/information fusion issue arises immediately here…

  10. Computer literacy for life sciences: helping the digital-era biology undergraduates face today's research.

    PubMed

    Smolinski, Tomasz G

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.

  11. Biomanufacturing: a US-China National Science Foundation-sponsored workshop.

    PubMed

    Sun, Wei; Yan, Yongnian; Lin, Feng; Spector, Myron

    2006-05-01

    A recent US-China National Science Foundation-sponsored workshop on biomanufacturing reviewed the state-of-the-art of an array of new technologies for producing scaffolds for tissue engineering, providing precision multi-scale control of material, architecture, and cells. One broad category of such techniques has been termed solid freeform fabrication. The techniques in this category include: stereolithography, selected laser sintering, single- and multiple-nozzle deposition and fused deposition modeling, and three-dimensional printing. The precise and repetitive placement of material and cells in a three-dimensional construct at the micrometer length scale demands computer control. These novel computer-controlled scaffold production techniques, when coupled with computer-based imaging and structural modeling methods for the production of the templates for the scaffolds, define an emerging field of computer-aided tissue engineering. In formulating the questions that remain to be answered and discussing the knowledge required to further advance the field, the Workshop provided a basis for recommendations for future work.

  12. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    NASA Technical Reports Server (NTRS)

    Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  13. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  14. Gesture Analysis for Astronomy Presentation Software

    NASA Astrophysics Data System (ADS)

    Robinson, Marc A.

    Astronomy presentation software in a planetarium setting provides a visually stimulating way to introduce varied scientific concepts, including computer science concepts, to a wide audience. However, the underlying computational complexity and opportunities for discussion are often overshadowed by the brilliance of the presentation itself. To bring this discussion back out into the open, a method needs to be developed to make the computer science applications more visible. This thesis introduces the GAAPS system, which endeavors to implement free-hand gesture-based control of astronomy presentation software, with the goal of providing that talking point to begin the discussion of computer science concepts in a planetarium setting. The GAAPS system incorporates gesture capture and analysis in a unique environment presenting unique challenges, and introduces a novel algorithm called a Bounding Box Tree to create and select features for this particular gesture data. This thesis also analyzes several different machine learning techniques to determine a well-suited technique for the classification of this particular data set, with an artificial neural network being chosen as the implemented algorithm. The results of this work will allow for the desired introduction of computer science discussion into the specific setting used, as well as provide for future work pertaining to gesture recognition with astronomy presentation software.

  15. Analysis of Nature of Science Included in Recent Popular Writing Using Text Mining Techniques

    NASA Astrophysics Data System (ADS)

    Jiang, Feng; McComas, William F.

    2014-09-01

    This study examined the inclusion of nature of science (NOS) in popular science writing to determine whether it could serve as a supplementary resource for teaching NOS, and to evaluate the accuracy of text mining and classification as a viable research tool in science education research. Four groups of documents published from 2001 to 2010 were analyzed: Scientific American, Discover magazine, winners of the Royal Society Winton Prize for Science Books, and books from NSTA's list of Outstanding Science Trade Books. Computer analysis categorized passages in the selected documents based on their inclusion of NOS. Human analysis assessed the frequency, context, coverage, and accuracy of the inclusion of NOS within computer-identified NOS passages. NOS was rarely addressed in the selected document sets but somewhat more frequently addressed in the letters sections of the two magazines. This result suggests that readers seem interested in the discussion of NOS-related themes. In the popular science books analyzed, NOS presentations were more likely to be aggregated at the beginning and the end of the book, rather than scattered throughout. The most commonly addressed NOS elements in the analyzed documents are science and society and empiricism in science. Only one inaccurate presentation of NOS was identified in all analyzed documents. The text mining technique demonstrated strong performance, which invites more applications of the technique to analyze other aspects of science textbooks, popular science writing, or other materials involved in science teaching and learning.
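
    The computer-assisted categorization step described above can be illustrated with a small supervised text-classification sketch: passages labeled for whether they address NOS train a bag-of-words classifier that then flags candidate NOS passages in new text. The tiny labeled set and the scikit-learn pipeline are invented for illustration; the study's actual text-mining tool is not specified here.

    ```python
    # Illustrative sketch of classifying passages for nature-of-science (NOS)
    # content with a bag-of-words model. The tiny labeled set is invented; it
    # only demonstrates the general text-mining approach, not the study's tool.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    train_passages = [
        "scientific knowledge is tentative and subject to revision",
        "evidence from repeated observation supports the claim",
        "the comet will pass closest to earth in october",
        "the enzyme breaks down starch into simple sugars",
    ]
    train_labels = [1, 1, 0, 0]  # 1 = addresses NOS, 0 = does not

    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(train_passages, train_labels)

    new_passages = [
        "scientists changed the model when new data contradicted it",
        "the planet orbits the sun once every 88 days",
    ]
    print(model.predict(new_passages))  # e.g. [1 0]
    ```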

  16. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on the process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
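
    A Monte Carlo baseline of the kind mentioned above might look like the following sketch, which propagates assumed uncertainty in a few satisfaction drivers through a simple weighted model to produce a distribution of predicted index scores. The drivers, weights, and distributions are invented for illustration and are not taken from the dissertation or the ACSI model.

    ```python
    # Illustrative Monte Carlo sketch: predict a customer-satisfaction index
    # baseline from uncertain input drivers. Drivers, weights, and distributions
    # are assumed for illustration only (not the ACSI model's actual structure).
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 100_000

    # Assumed driver scores on a 0-100 scale, each with its own uncertainty.
    quality      = rng.normal(loc=82, scale=4, size=n_trials)
    expectations = rng.normal(loc=75, scale=6, size=n_trials)
    value        = rng.normal(loc=78, scale=5, size=n_trials)

    # Assumed linear weighting of drivers into an index score.
    index = 0.5 * quality + 0.2 * expectations + 0.3 * value
    index = np.clip(index, 0, 100)

    print(f"mean predicted index: {index.mean():.1f}")
    print(f"90% interval: [{np.percentile(index, 5):.1f}, "
          f"{np.percentile(index, 95):.1f}]")
    ```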

  17. Nontrivial, Nonintelligent, Computer-Based Learning.

    ERIC Educational Resources Information Center

    Bork, Alfred

    1987-01-01

    This paper describes three interactive computer programs used with personal computers to present science learning modules for all ages. Developed by groups of teachers at the Educational Technology Center at the University of California, Irvine, these instructional materials do not use the techniques of contemporary artificial intelligence. (GDC)

  18. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…

  19. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
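
    The graph-based analysis referred to above is commonly built by thresholding correlations between time series at different locations; the sketch below constructs such a network from synthetic data and reports node degrees. The synthetic series, the correlation threshold, and the use of NetworkX are assumptions for illustration, not the project's code.

    ```python
    # Illustrative climate-network sketch: nodes are grid points, edges connect
    # points whose (synthetic) anomaly time series correlate strongly.
    # Data, threshold, and library choice are assumptions, not the project's code.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    n_points, n_months = 30, 240
    shared_signal = rng.normal(size=n_months)            # a common mode (ENSO-like)
    series = 0.6 * shared_signal + rng.normal(size=(n_points, n_months))

    corr = np.corrcoef(series)                            # pairwise correlations
    threshold = 0.4

    G = nx.Graph()
    G.add_nodes_from(range(n_points))
    for i in range(n_points):
        for j in range(i + 1, n_points):
            if abs(corr[i, j]) >= threshold:
                G.add_edge(i, j, weight=corr[i, j])

    print(f"{G.number_of_edges()} edges above |r| >= {threshold}")
    print("highest-degree nodes:",
          sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5])
    ```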

  20. Data Science and Optimal Learning for Material Discovery and Design

    Science.gov Websites

    Advances in computation and experimental techniques are generating vast arrays of data, without a clear link to designing new materials or impacting computational models. Meeting topics include linking computational and experimental data and the analysis of data from probes such as light sources, as well as other…

  1. Computer Animations a Science Teaching Aid: Contemplating an Effective Methodology

    ERIC Educational Resources Information Center

    Tannu, Kirti

    2008-01-01

    To improve the quality of science education, the author suggests using the entertaining and exciting technique of animation to foster better understanding of scientific principles. The latest technologies are being used with ever more vigour to spread venomous superstitions; a better understanding of science may help students strengthen their scientific temper. Keeping…

  2. Changing the Paradigm: Preparing Students for the Computing Profession in the 21st Century

    NASA Technical Reports Server (NTRS)

    Robbins, Kay A.

    2003-01-01

    The dramatic technological developments of the past decade have led to a tremendous growth in the demand for computer science professionals well-versed in advanced technology and techniques. NASA, traditionally a haven for cutting-edge innovators, is now competing with every industrial and government sector for computer science talent. The computer science program at University of Texas at San Antonio (UTSA) faces challenges beyond those intrinsically presented by rapid technological change, because a significant number of UTSA students come from low-income families with no Internet or computer access at home. An examination of enrollment statistics for the computer science program at UTSA showed that very few students who entered as freshmen successfully graduated. The upper division courses appeared to be populated by graduate students removing deficiencies and by transfer students. The faculty was also concerned that the students who did graduate from the program did not have the strong technical and programming skills that the CS program had been noted for in the community during the 1980's.

  3. Wave refraction diagrams for the Baltimore Canyon region of the mid-Atlantic continental shelf computed by using three bottom topography approximation techniques

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1976-01-01

    The Langley Research Center and Virginia Institute of Marine Science wave refraction computer model was applied to the Baltimore Canyon region of the mid-Atlantic continental shelf. Wave refraction diagrams for a wide range of normally expected wave periods and directions were computed by using three bottom topography approximation techniques: quadratic least squares, cubic least squares, and constrained bicubic interpolation. Mathematical or physical interpretation of certain features appearing in the computed diagrams is discussed.
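
    Of the three bottom-topography approximations mentioned, quadratic least squares is the simplest to sketch: depth samples are fit with a second-degree polynomial surface whose value and slopes can then feed a ray-tracing step. The synthetic sample points and the use of NumPy's least-squares solver are illustrative assumptions, not the original FORTRAN model.

    ```python
    # Illustrative quadratic least-squares fit of bottom depth d(x, y):
    # d ~ c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2.
    # Sample depths are synthetic; the original model's gridding is not reproduced.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(0, 10, 50)
    y = rng.uniform(0, 10, 50)
    true_depth = 30 + 1.5 * x - 0.8 * y + 0.05 * x**2
    depth = true_depth + rng.normal(scale=0.2, size=x.size)

    # Design matrix for the six quadratic terms.
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, depth, rcond=None)

    # Depth and its slopes at a query point, as a refraction model would need.
    xq, yq = 4.0, 7.0
    c0, c1, c2, c3, c4, c5 = coeffs
    d_q   = c0 + c1*xq + c2*yq + c3*xq**2 + c4*xq*yq + c5*yq**2
    dd_dx = c1 + 2*c3*xq + c4*yq
    dd_dy = c2 + c4*xq + 2*c5*yq
    print(f"depth ~ {d_q:.2f} m, slope ~ ({dd_dx:.3f}, {dd_dy:.3f})")
    ```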

  4. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
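
    A minimal illustration of why multigrid outperforms fixed-grid relaxation: the two-grid cycle below damps high-frequency error with weighted Jacobi smoothing on a fine 1-D Poisson grid, corrects the remaining smooth error on a coarser grid, and smooths again. MUDPACK itself solves 2-D and 3-D elliptic problems in Fortran; this Python sketch only conveys the idea.

    ```python
    # Minimal two-grid cycle for -u'' = f on (0, 1) with u(0) = u(1) = 0.
    # Only conveys the multigrid idea; MUDPACK's real solvers are far more general.
    import numpy as np

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (-u[:-2] + 2 * u[1:-1] - u[2:]) / h**2
        return r

    def jacobi(u, f, h, sweeps, omega=2 / 3):
        # Weighted Jacobi relaxation: efficient at damping high-frequency error.
        for _ in range(sweeps):
            u[1:-1] += omega * 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1] - 2 * u[1:-1])
        return u

    def coarse_solve(r_coarse, h_coarse):
        # Solve the coarse-grid error equation -e'' = r exactly (small dense system).
        m = r_coarse.size
        A = (np.diag(np.full(m - 2, 2.0)) +
             np.diag(np.full(m - 3, -1.0), 1) +
             np.diag(np.full(m - 3, -1.0), -1)) / h_coarse**2
        e = np.zeros(m)
        e[1:-1] = np.linalg.solve(A, r_coarse[1:-1])
        return e

    def two_grid(u, f, h):
        u = jacobi(u, f, h, sweeps=3)                        # pre-smoothing
        r = residual(u, f, h)
        r_coarse = np.zeros((u.size + 1) // 2)               # full-weighting restriction
        r_coarse[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
        e_coarse = coarse_solve(r_coarse, 2 * h)
        e = np.zeros_like(u)
        e[::2] = e_coarse                                    # prolongation: copy...
        e[1::2] = 0.5 * (e_coarse[:-1] + e_coarse[1:])       # ...and interpolate
        u = u + e                                            # coarse-grid correction
        return jacobi(u, f, h, sweeps=3)                     # post-smoothing

    n = 129
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]
    f = np.pi**2 * np.sin(np.pi * x)                         # exact solution sin(pi x)
    u = np.zeros(n)
    for cycle in range(10):
        u = two_grid(u, f, h)
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())
    ```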

  5. Matthew Reynolds | NREL

    Science.gov Websites

    Matthew's research at NREL is focused on applying uncertainty quantification techniques. Research interests: uncertainty quantification, computational multilinear algebra, approximation theory. Publications include "… and the Canonical Tensor Decomposition," Journal of Computational Physics (2017), and "Randomized Alternating…"

  6. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

    Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, we face the growing problem of providing the life sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface for constructing complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.

  7. Identification and Addressing Reduction-Related Misconceptions

    ERIC Educational Resources Information Center

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-01-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract…

  8. Symposium Connects Government Problems with State of the Art Network Science Research

    DTIC Science & Technology

    2015-10-16

    By Rajmonda S. Caceres and Benjamin A. Miller. The symposium matches network-related problems faced by the US Government with the state-of-the-art models and techniques developed in the network science research community. Network science has grown significantly in the last several years as a field at the intersection of mathematics, computer science, social science, and engineering.

  9. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  10. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  11. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
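
    One of the sampling and data-reduction methods alluded to above can be illustrated with reservoir sampling, which keeps a fixed-size uniform random sample from a stream too large to hold in memory. The stream and reservoir size below are placeholders, not drawn from the article.

    ```python
    # Illustrative reservoir sampling: maintain a uniform random sample of k items
    # from a data stream of unknown or huge length using O(k) memory.
    # The stream here is a stand-in for large-scale simulation or sensor output.
    import random

    def reservoir_sample(stream, k, seed=0):
        rng = random.Random(seed)
        reservoir = []
        for i, item in enumerate(stream):
            if i < k:
                reservoir.append(item)
            else:
                j = rng.randint(0, i)      # inclusive; item i kept with prob. k/(i+1)
                if j < k:
                    reservoir[j] = item
        return reservoir

    print(reservoir_sample(range(10_000_000), k=5))
    ```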

  12. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, and symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round-table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.

  13. Sixth New Zealand Computer Conference (Auckland 78). Volume I, Papers.

    ERIC Educational Resources Information Center

    New Zealand Computer Society, Auckland.

    This collection of conference presentations includes 23 papers on a variety of topics pertaining to the use of computers in New Zealand. Among the topics discussed are computer science techniques in a commercial data processing situation, data processing personnel and their careers, the communication aspects of an airline system, implementation of…

  14. First principles calculations of thermal conductivity with out of equilibrium molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Puligheddu, Marcello; Gygi, Francois; Galli, Giulia

    The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics under nonequilibrium conditions. Our formulation is based on a generalization of the approach-to-equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first-principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
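
    The sinusoidal approach-to-equilibrium idea can be summarized schematically: an imposed temperature modulation of wavelength L decays exponentially, and the fitted decay time yields the thermal diffusivity and conductivity (with c the specific heat). This is an illustrative summary of the general method, not the paper's exact working equations.

    ```latex
    % Schematic relations for the sinusoidal approach-to-equilibrium method
    % (illustrative summary, not the paper's exact working equations).
    \[
      T(x,t) \;=\; T_0 + \Delta T \cos\!\left(\frac{2\pi x}{L}\right) e^{-t/\tau},
      \qquad
      \alpha \;=\; \frac{L^2}{4\pi^2\,\tau},
      \qquad
      \kappa \;=\; \rho\, c\, \alpha .
    \]
    ```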

  15. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    During the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach has kept growing, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology has kept growing as well, owing to the tremendous development of supercomputers and their programming techniques. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to make experiments and observations or perform numerical simulations, but also in what information (new findings) to extract from the data. However, data do not usually tell us anything about the science directly; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find it. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations and numerical simulations get larger, new techniques and facilities are required to extract information from large numbers of data files. This technique is called informatics, a fourth methodology for new sciences. Any methodology must work on its own facility: for example, in space science the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, named OneSpaceNet (OSN), is the first open cloud system for scientists carrying out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage systems, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue and preservation, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. The figure (not shown here) is a schematic picture of the NICT science cloud. Both types of data, from observation and simulation, are stored in the storage system of the science cloud. Note that observational data are of two kinds: data downloaded through the Internet to the cloud from archive sites outside the cloud, and data from equipment directly connected to the science cloud (often called sensor clouds). In the present talk, we first introduce the NICT science cloud and then demonstrate its efficiency by showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential of science clouds for many research fields will be revealed.

  16. Advances in Machine Learning and Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.

    2012-03-01

    Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.

  17. Contributions of Cognitive Science and Related Research on Learning to the Design of Computer Literacy Curricula. Report No. 81-1. Series in Learning and Cognition.

    ERIC Educational Resources Information Center

    Mayer, Richard E.

    A review of the research on techniques for increasing the novice's understanding of computers and computer programming, this paper considers the potential usefulness of five tentative recommendations pertinent to the design of computer literacy curricula: (1) provide the learner with a concrete model of the computer; (2) encourage the learner to…

  18. Proceedings of the Annual National Conference on Ada (Trademark) Technology (3rd) Held at Prairie View, Texas on 20-21 March 1985.

    DTIC Science & Technology

    1985-01-01

    CECOM, Ft. Monmouth, N.J. Kurth Krause, Intermetrics, Inc., Huntington Beach, CA. Benjamin Martin, Atlanta University, Atlanta, GA. Isabel...Teledyne Brown, Tinton Falls, N.J. Paul Wolfgang, Computer Science Corp., Moorestown, N.J. - TECHNICAL SESSIONS - Wednesday, March 20, 1985, 9:00 am...STRATEGIES AND TECHNIQUES, J. McGlynn, CENTACS, CECOM, Ft. Monmouth, NJ (p. 178). Chairperson: Paul Wolfgang, Computer Science Corp.

  19. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    ERIC Educational Resources Information Center

    Nagasinghe, Iranga

    2010-01-01

    This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute the essential technologies that accounted for the immense growth and…
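
    To make the eigenvector computation concrete, the following sketch shows plain, unaccelerated power iteration for PageRank on a tiny adjacency matrix; it illustrates only the baseline method, not the thesis' acceleration techniques, and the damping factor, tolerance, and example graph are assumptions chosen for the demonstration.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=1000):
    """Unaccelerated power iteration for PageRank.

    adj[i][j] = 1 if page i links to page j. Dangling pages
    (no outlinks) are treated as linking to every page.
    """
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out_degree = A.sum(axis=1)
    # Column-stochastic transition matrix; dangling rows spread uniformly.
    P = np.where(out_degree[:, None] > 0,
                 A / np.maximum(out_degree[:, None], 1), 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_next = damping * P @ r + (1 - damping) / n
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next
    return r

if __name__ == "__main__":
    # Tiny 4-page web graph (an assumption for illustration).
    links = [[0, 1, 1, 0],
             [0, 0, 1, 0],
             [1, 0, 0, 1],
             [0, 0, 1, 0]]
    print(pagerank(links))
```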

  20. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    PubMed

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability, but each is restricted to particular methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and careful attention to parameter selection is needed when estimating reliability, since the estimated reliability of a system may increase or decrease depending on the parameters chosen. There is therefore a need to identify the factors that most heavily affect system reliability. Reusability is now widely used in many areas of research and is the basis of Component-Based Systems (CBS); cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are most suitable for estimating system reliability. Soft computing is used for small- and large-scale problems where it is difficult to find accurate results owing to uncertainty or randomness, and there are many possibilities for applying soft computing techniques to problems in medicine: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently uses neural networks and genetic algorithms, and medical scientists have shown sustained interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse existing software when building new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It describes how these soft computing techniques work and assesses their use for predicting reliability, and it also discusses the parameters considered when estimating and predicting reliability. This study can be applied to the estimation and prediction of the reliability of instruments used in medical systems, software engineering, computer engineering, and mechanical engineering; the concepts can be applied to both software and hardware to predict reliability using CBSE.
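
    As a concrete flavor of how one of these soft computing techniques might be applied to reliability prediction, the hedged sketch below fits a small feed-forward neural network to synthetic component metrics; the metric names, the data, and the use of scikit-learn's MLPRegressor are illustrative assumptions, not the models assessed in the paper.

```python
# Illustrative sketch (assumed data and features): a neural-network
# regressor predicting component reliability from CBSE-style metrics.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic dataset: columns = [reuse ratio, coupling, complexity, test coverage]
X = rng.uniform(0.0, 1.0, size=(500, 4))
# Assumed ground-truth relationship plus noise, purely for demonstration.
y = np.clip(0.6 + 0.3 * X[:, 0] - 0.2 * X[:, 1] - 0.1 * X[:, 2] + 0.2 * X[:, 3]
            + rng.normal(0, 0.02, 500), 0.0, 1.0)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out components:", model.score(X_test, y_test))
```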

  1. Automated Explanation for Educational Applications.

    ERIC Educational Resources Information Center

    Suthers, Daniel D.

    1991-01-01

    Artificial intelligence techniques available for generating explanations for teaching purposes are surveyed, and the way in which they are combined in a computer program that provides explanations is described. The program responds to questions in the physical sciences. Potential contributions of this technology to computer-based education are…

  2. Abstracts of Research, July 1973 through June 1974.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Computer and Information Science Research Center.

    Abstracts of research papers in the fields of computer and information science are given; 72 papers are abstracted in the areas of information storage and retrieval, information processing, linguistic analysis, artificial intelligence, mathematical techniques, systems programming, and computer networks. In addition, the Ohio State University…

  3. Terahertz Radiation: A Non-contact Tool for the Selective Stimulation of Biological Responses in Human Cells

    DTIC Science & Technology

    2014-01-01

    computational and empirical dosimetric tools [31]. For the computational dosimetry, we employed finite-difference time-domain (FDTD) modeling techniques to...temperature-time data collected for a well exposed to THz radiation using finite-difference time-domain (FDTD) modeling techniques and thermocouples...Alteration in the expression of such genes underscores the signif-... (IEEE Transactions on Terahertz Science and Technology, Vol. 6, No. 1)

  4. Use of Digital Game Based Learning and Gamification in Secondary School Science: The Effect on Student Engagement, Learning and Gender Difference

    ERIC Educational Resources Information Center

    Khan, Amna; Ahmad, Farzana Hayat; Malik, Muhammad Muddassir

    2017-01-01

    This study aimed to identify the impact of a game based learning (GBL) application using computer technologies on student engagement in secondary school science classrooms. The literature reveals that conventional Science teaching techniques (teacher-centered lecture and teaching), which foster rote learning among students, are one of the major…

  5. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  6. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
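
    The generic loop that a ROM builder of this kind automates, namely sample the expensive simulator, fit a cheap regression surrogate, and quantify its accuracy, can be sketched as follows; the toy "simulator", the quadratic surrogate, and the random sampling scheme are assumptions for illustration and do not reflect REVEAL's actual interfaces.

```python
# Hedged sketch of the generic ROM-building loop: sample an expensive model,
# fit a regression surrogate, and quantify its accuracy on held-out samples.
import numpy as np

def expensive_simulation(x):
    # Stand-in for an HPC simulation (assumed for illustration).
    return np.sin(3 * x[0]) * np.exp(-x[1]) + 0.5 * x[1] ** 2

rng = np.random.default_rng(1)
samples = rng.uniform(0, 1, size=(200, 2))            # sampling step
outputs = np.array([expensive_simulation(x) for x in samples])

# Regression step: quadratic polynomial surrogate via least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

train, test = samples[:150], samples[150:]
coeffs, *_ = np.linalg.lstsq(features(train), outputs[:150], rcond=None)

# Accuracy-quantification step: root-mean-square error on held-out points.
pred = features(test) @ coeffs
rmse = np.sqrt(np.mean((pred - outputs[150:]) ** 2))
print(f"Surrogate RMSE on held-out samples: {rmse:.4f}")
```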

  7. Single-shot ultrafast tomographic imaging by spectral multiplexing

    NASA Astrophysics Data System (ADS)

    Matlis, N. H.; Axley, A.; Leemans, W. P.

    2012-10-01

    Computed tomography has profoundly impacted science, medicine and technology by using projection measurements scanned over multiple angles to permit cross-sectional imaging of an object. The application of computed tomography to moving or dynamically varying objects, however, has been limited by the temporal resolution of the technique, which is set by the time required to complete the scan. For objects that vary on ultrafast timescales, traditional scanning methods are not an option. Here we present a non-scanning method capable of resolving structure on femtosecond timescales by using spectral multiplexing of a single laser beam to perform tomographic imaging over a continuous range of angles simultaneously. We use this technique to demonstrate the first single-shot ultrafast computed tomography reconstructions and obtain previously inaccessible structure and position information for laser-induced plasma filaments. This development enables real-time tomographic imaging for ultrafast science, and offers a potential solution to the challenging problem of imaging through scattering surfaces.
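
    For readers unfamiliar with the reconstruction step referred to above, the sketch below performs conventional parallel-beam projection and simple (unfiltered) back-projection on a small synthetic phantom; the phantom, angle set, and use of scipy.ndimage.rotate are assumptions for illustration, not the single-shot spectral-multiplexing method of the paper.

```python
# Minimal illustration of conventional CT: forward-project a phantom over
# many angles, then reconstruct by simple (unfiltered) back-projection.
import numpy as np
from scipy.ndimage import rotate

n = 64
yy, xx = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
phantom = ((xx ** 2 + yy ** 2) < 0.5 ** 2).astype(float)   # assumed test object

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
# Sinogram: one line-integral profile (column sums) per rotation angle.
sinogram = np.array([rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])

# Unfiltered back-projection: smear each projection back across the image.
recon = np.zeros_like(phantom)
for a, proj in zip(angles, sinogram):
    smear = np.tile(proj, (n, 1))
    recon += rotate(smear, -a, reshape=False, order=1)
recon /= len(angles)
print("Index of the reconstruction peak (should lie inside the disc):",
      np.unravel_index(recon.argmax(), recon.shape))
```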

  8. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  9. Current Developments in Machine Learning Techniques in Biological Data Mining.

    PubMed

    Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail

    2017-01-01

    This supplement is intended to focus on the use of machine learning techniques to generate meaningful information from biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts in the field. Advances in the field of biology have generated massive opportunities for the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. Thus, they are broadly used to investigate the underlying mechanisms leading to specific diseases, as well as in the biomarker discovery process. With growth in this specific area of science comes the need to access up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers in the various applications of machine learning techniques in mining biological data.

  10. Image/Time Series Mining Algorithms: Applications to Developmental Biology, Document Processing and Data Streams

    ERIC Educational Resources Information Center

    Tataw, Oben Moses

    2013-01-01

    Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…

  11. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  12. Using Ontologies for Knowledge Management: An Information Systems Perspective.

    ERIC Educational Resources Information Center

    Jurisica, Igor; Mylopoulos, John; Yu, Eric

    1999-01-01

    Surveys some of the basic concepts that have been used in computer science for the representation of knowledge and summarizes some of their advantages and drawbacks. Relates these techniques to information sciences theory and practice. Concepts are classified in four broad ontological categories: static ontology, dynamic ontology, intentional…

  13. Scientific Visualization and Computational Science: Natural Partners

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third alternative to theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: Given an image, extract information about the scene. Visualization has developed from computer graphics, and the inverse task: Given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.

  14. Computer Security Models

    DTIC Science & Technology

    1984-09-01

    Verification Technique for a Class of Security Kernels," International Symposium on Programming , Lecture Notes in Computer Science 137, Springer-Verlag, New York...September 1984 MTR9S31 " J. K. Millen Computer Security C. M. Cerniglia Models * 0 Ne c - ¢- C. S• ~CONTRACT SPONSOR OUSDRE/C31 & ESO/ALEE...ABSTRACT The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development

  15. Realistic Covariance Prediction for the Earth Science Constellation

    NASA Technical Reports Server (NTRS)

    Duncan, Matthew; Long, Anne

    2006-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
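
    A stripped-down version of the Monte Carlo component of such a computation is sketched below: relative position errors are sampled from the combined covariance, and the fraction of samples falling within a combined hard-body radius is counted. The miss vector, covariance values, hard-body radius, and two-dimensional encounter-plane simplification are assumptions for illustration, not the operational ESC procedure.

```python
# Hedged sketch: Monte Carlo collision probability in a 2-D encounter plane.
# Inputs (all assumed for illustration): relative mean miss vector, combined
# positional covariance of the two objects, and a combined hard-body radius.
import numpy as np

def mc_collision_probability(miss_vector, combined_cov, hard_body_radius,
                             n_samples=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    # Sample relative-position errors from the combined covariance.
    samples = rng.multivariate_normal(miss_vector, combined_cov, size=n_samples)
    # A "collision" sample lies within the combined hard-body radius of the origin.
    hits = np.linalg.norm(samples, axis=1) < hard_body_radius
    return hits.mean()

if __name__ == "__main__":
    miss = np.array([30.0, 10.0])               # meters, assumed miss distance
    cov = np.array([[2500.0, 300.0],
                    [300.0, 900.0]])            # m^2, assumed combined covariance
    print("Pc ~", mc_collision_probability(miss, cov, hard_body_radius=20.0))
```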

  16. Linking Computer Algebra Systems and Paper-and-Pencil Techniques To Support the Teaching of Mathematics.

    ERIC Educational Resources Information Center

    van Herwaarden, Onno A.; Gielen, Joseph L. W.

    2002-01-01

    Focuses on students showing a lack of conceptual insight while using computer algebra systems (CAS) in the setting of an elementary calculus and linear algebra course for first year university students in social sciences. The use of a computer algebra environment has been incorporated into a more traditional course but with special attention on…

  17. Report on Information Retrieval and Library Automation Studies.

    ERIC Educational Resources Information Center

    Alberta Univ., Edmonton. Dept. of Computing Science.

    Short abstracts of works in progress or completed in the Department of Computing Science at the University of Alberta are presented under five major headings. The five categories are: Storage and search techniques for document data bases, Automatic classification, Study of indexing and classification languages through computer manipulation of data…

  18. Preface to MOST-ONISW 2009

    NASA Astrophysics Data System (ADS)

    Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil

    Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.

  19. Equation-free and variable free modeling for complex/multiscale systems. Coarse-grained computation in science and engineering using fine-grained models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevrekidis, Ioannis G.

    The work explored the linking of modern developing machine learning techniques (manifold learning and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments - in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.

  20. Computer Techniques for Studying Coverage, Overlaps, and Gaps in Collections.

    ERIC Educational Resources Information Center

    White, Howard D.

    1987-01-01

    Describes techniques for using the Statistical Package for the Social Sciences (SPSS) to create tables for cooperative collection development across a number of libraries. Specific commands are given to generate holdings profiles focusing on collection coverage, overlaps, gaps, or other areas of interest, from a master bibliographic list. (CLB)
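
    The same kind of holdings profiling can be reproduced with any modern statistics tool; the sketch below uses pandas (an assumption for illustration, not the article's SPSS commands) to cross-tabulate a toy master list and report coverage and overlap across three hypothetical libraries.

```python
# Hedged sketch (assumed toy data, pandas instead of SPSS): profile collection
# coverage and overlap from a master holdings list.
import pandas as pd

# Master list: one row per (title, library) holding; libraries are hypothetical.
holdings = pd.DataFrame({
    "title":   ["T1", "T1", "T2", "T3", "T3", "T3", "T4"],
    "library": ["A",  "B",  "A",  "A",  "B",  "C",  "C"],
})

# Title-by-library incidence matrix (1 = held).
matrix = pd.crosstab(holdings["title"], holdings["library"])

coverage = matrix.sum()                          # titles held per library
overlap = (matrix.sum(axis=1) > 1).sum()         # titles held by 2+ libraries
unique = (matrix.sum(axis=1) == 1).sum()         # titles held by exactly one
# Gaps would be titles on a fuller desiderata list with no holdings at all.

print("Coverage per library:\n", coverage)
print("Titles held by more than one library:", overlap)
print("Titles unique to a single library:", unique)
```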

  1. Integrating Intelligent Systems Domain Knowledge Into the Earth Science Curricula

    NASA Astrophysics Data System (ADS)

    Güereque, M.; Pennington, D. D.; Pierce, S. A.

    2017-12-01

    High-volume heterogeneous datasets have become ubiquitous, moving to center stage over the last ten years and transcending the boundaries of computationally intensive disciplines to become a fundamental part of every science discipline. Despite the fact that large datasets are now pervasive across industries and academic disciplines, this array of skills is generally absent from earth science programs. This has left the bulk of the student population without access to curricula that systematically teach appropriate intelligent-systems skills, creating a void for skill sets that should be universal given their need and marketability. While some guidance regarding appropriate computational thinking and pedagogy is appearing, there are few examples where these approaches have been specifically designed and tested within the earth science domain. Furthermore, best practices from learning science have not yet been widely tested for developing intelligent systems-thinking skills. This research developed and tested evidence-based computational skill modules that target this deficit, with the intention of informing the earth science community as it continues to incorporate intelligent systems techniques and reasoning into its research and classrooms.

  2. Introduction to Computational Physics for Undergraduates

    NASA Astrophysics Data System (ADS)

    Zubairi, Omair; Weber, Fridolin

    2018-03-01

    This is an introductory textbook on computational methods and techniques intended for undergraduates at the sophomore or junior level in the fields of science, mathematics, and engineering. It provides an introduction to programming languages such as FORTRAN 90/95/2000 and covers numerical techniques such as differentiation, integration, root finding, and data fitting. The textbook also covers the use of the Linux/Unix operating system and other relevant software such as plotting programs, text editors, and markup languages such as LaTeX. It includes multiple homework assignments.
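
    The text works in Fortran, but the numerical techniques it lists translate directly to other languages; the sketch below (in Python, purely for illustration, with an assumed test function) shows minimal versions of central-difference differentiation, trapezoidal integration, and bisection root finding.

```python
# Illustrative versions of three techniques covered in the text:
# numerical differentiation, integration, and root finding.
import math

def central_difference(f, x, h=1e-5):
    """Approximate f'(x) with a central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f on [a, b] with the trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * total

def bisection(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

if __name__ == "__main__":
    f = math.sin                                   # assumed test function
    print(central_difference(f, 1.0), math.cos(1.0))   # ~cos(1)
    print(trapezoid(f, 0.0, math.pi), 2.0)              # ~2
    print(bisection(f, 2.0, 4.0), math.pi)               # ~pi
```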

  3. Health Science Education

    ERIC Educational Resources Information Center

    Hartsell, Horace C.

    1970-01-01

    Briefly describes several instructional techniques including computer aid simulation of the medical encounter, media-biased approaches for teaching doctor-patient relationships, and programed media for teaching decision-making to nursing students." (Author/AA)

  4. Using Robotics to Improve Retention and Increase Comprehension in Introductory Programming Courses

    ERIC Educational Resources Information Center

    Pullan, Marie

    2013-01-01

    Several college majors, outside of computer science, require students to learn computer programming. Many students have difficulty getting through the programming sequence and ultimately change majors or drop out of college. To deal with this problem, active learning techniques were developed and implemented in a freshman programming logic and…

  5. A Diagnostic Study of Computer Application of Structural Communication Grid

    ERIC Educational Resources Information Center

    Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol

    2009-01-01

    In this article, the Structural Communication Grid (SCG), an alternative measurement and evaluation technique, is first summarised, and the design, development and implementation of a computer-based SCG system are introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…

  6. Progress in fast, accurate multi-scale climate simulations

    DOE PAGES

    Collins, W. D.; Johansen, H.; Evans, K. J.; ...

    2015-06-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
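
    One of the numerical techniques named above, implicit time integration, can be illustrated with a very small sketch: on an assumed stiff test equation, backward Euler remains stable at a time step for which forward Euler diverges.

```python
# Hedged illustration of why implicit time integration helps with fast scales:
# backward Euler stays stable on the stiff test ODE dy/dt = -1000*y at a time
# step where forward Euler blows up. (Test problem and step size are assumed.)
lam, dt, steps = -1000.0, 0.01, 10
y_explicit = y_implicit = 1.0
for _ in range(steps):
    y_explicit = y_explicit + dt * lam * y_explicit          # forward Euler
    y_implicit = y_implicit / (1.0 - dt * lam)               # backward Euler
print("forward Euler :", y_explicit)    # diverges (amplification factor -9)
print("backward Euler:", y_implicit)    # decays toward 0, like the true solution
```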

  7. 1976 annual summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-01

    Abstracts of papers published during the previous calendar year, arranged in accordance with the project titles used in the USDOE Schedule 189 Budget Proposals, are presented. The collection of abstracts supplements the listing of papers published in the Schedule 189. The following subject areas are represented: high-energy physics; nuclear physics; basic energy sciences (nuclear science, materials sciences, solid state physics, materials chemistry); molecular, mathematical, and earth sciences (fundamental interactions, processes and techniques, mathematical and computer sciences); environmental research and development; physical and technological studies (characterization, measurement and monitoring); and nuclear research and applications.

  8. Enabling campus grids with open science grid technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  9. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  10. Caleb Phillips | NREL

    Science.gov Websites

    Dr. Caleb Phillips is a data scientist with the Computational Science Center at NREL. His publications include a 2017 article in Statistical Analysis and Data Mining: The ASA Data Science Journal using GIS-based methods and lidar techniques for the problem of large-area coverage mapping for wireless networks. He has also done work in...

  11. Calibration Experiments for a Computer Vision Oyster Volume Estimation System

    ERIC Educational Resources Information Center

    Chang, G. Andy; Kerns, G. Jay; Lee, D. J.; Stanek, Gary L.

    2009-01-01

    Calibration is a technique that is commonly used in science and engineering research that requires calibrating measurement tools for obtaining more accurate measurements. It is an important technique in various industries. In many situations, calibration is an application of linear regression, and is a good topic to be included when explaining and…
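
    Since calibration is framed here as an application of linear regression, a minimal worked instance is sketched below: a least-squares line is fitted to paired readings from an instrument and a reference standard, then used to correct new readings. The numbers are invented for illustration.

```python
# Minimal calibration-as-regression sketch (assumed example data):
# fit reference = slope * reading + intercept, then correct new readings.
import numpy as np

instrument_readings = np.array([10.2, 20.1, 29.8, 40.5, 50.3])   # raw device output
reference_values    = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # known standards

# Ordinary least squares for the calibration line.
slope, intercept = np.polyfit(instrument_readings, reference_values, deg=1)

def calibrate(reading):
    """Map a raw instrument reading onto the reference scale."""
    return slope * reading + intercept

print(f"calibration line: ref = {slope:.4f} * reading + {intercept:.4f}")
print("corrected value for a raw reading of 35.0:", calibrate(35.0))
```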

  12. Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course

    ERIC Educational Resources Information Center

    Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil

    2016-01-01

    A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…

  13. CPE--A New Perspective: The Impact of the Technology Revolution. Proceedings of the Computer Performance Evaluation Users Group Meeting (19th, San Francisco, California, October 25-28, 1983). Final Report. Reports on Computer Science and Technology.

    ERIC Educational Resources Information Center

    Mobray, Deborah, Ed.

    Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…

  14. Management Sciences Division Annual Report (10th)

    DTIC Science & Technology

    1993-01-01

    of the Weapon System Management Information System (WSMIS). The Aircraft Sustainability Model (ASM) is the computational technique employed by...provisioning. We enhanced the capabilities of RBIRD by using the Aircraft Sustainability Model (ASM) for the spares calculation. ASM offers many...ASM for several years to compute spares for war. It is also fully compatible with the Air Force's peacetime spares computation system (D041). This

  15. Remarks on neurocybernetics and its links to computing science. To the memory of Prof. Luigi M. Ricciardi.

    PubMed

    Moreno-Díaz, Roberto; Moreno-Díaz, Arminda

    2013-06-01

    This paper explores the origins and content of neurocybernetics and its links to artificial intelligence, computer science and knowledge engineering. Starting with three remarkable pieces of work, we center attention on a number of events that initiated and developed basic topics that are still today matters of research and inquiry, from goal-directed activity theories to circular causality and to reverberations and learning. Within this context, we pay tribute to the memory of Prof. Ricciardi, documenting the importance of his contributions to the mathematics of the brain, neural nets and neurophysiological models, and computational simulations and techniques. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia, September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle tracking codes. Among all papers, thirty are abstracted for the Energy Science and Technology database. (AIP)

  17. Approximate Computing Techniques for Iterative Graph Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also due to the availability of large-scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale the performance with minimal loss of accuracy. We present several heuristics, including loop perforation, data caching, incomplete graph coloring and synchronization, and evaluate their efficiency. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques will enable scalable graph analytics on data of importance to several applications in science, and their subsequent adoption to scale similar graph algorithms.
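
    Loop perforation, one of the heuristics named above, simply skips a fraction of loop iterations to trade accuracy for speed; the sketch below applies it to a toy PageRank-style edge sweep and is an illustrative assumption, not the authors' implementation.

```python
# Hedged illustration of loop perforation: skip a fraction of the edge updates
# in each PageRank sweep, trading a small accuracy loss for less work.
import random

def pagerank_perforated(edges, n, damping=0.85, iters=50, skip_fraction=0.0, seed=0):
    rng = random.Random(seed)
    rank = [1.0 / n] * n
    out_deg = [0] * n
    for src, _ in edges:
        out_deg[src] += 1
    for _ in range(iters):
        contrib = [0.0] * n
        for src, dst in edges:
            if rng.random() < skip_fraction:      # perforation: drop this update
                continue
            contrib[dst] += rank[src] / out_deg[src]
        rank = [(1 - damping) / n + damping * c for c in contrib]
    return rank

if __name__ == "__main__":
    edges = [(0, 1), (0, 2), (1, 2), (2, 0), (2, 3), (3, 2)]   # toy graph (assumed)
    exact = pagerank_perforated(edges, 4, skip_fraction=0.0)
    approx = pagerank_perforated(edges, 4, skip_fraction=0.2)
    print("max |exact - approx| =", max(abs(a - b) for a, b in zip(exact, approx)))
```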

  18. High performance computing and communications: Advancing the frontiers of information technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  19. Density functional theory in materials science.

    PubMed

    Neugebauer, Jörg; Hickel, Tilmann

    2013-09-01

    Materials science is a highly interdisciplinary field. It is devoted to the understanding of the relationship between (a) fundamental physical and chemical properties governing processes at the atomistic scale with (b) typically macroscopic properties required of materials in engineering applications. For many materials, this relationship is not only determined by chemical composition, but strongly governed by microstructure. The latter is a consequence of carefully selected process conditions (e.g., mechanical forming and annealing in metallurgy or epitaxial growth in semiconductor technology). A key task of computational materials science is to unravel the often hidden composition-structure-property relationships using computational techniques. The present paper does not aim to give a complete review of all aspects of materials science. Rather, we will present the key concepts underlying the computation of selected material properties and discuss the major classes of materials to which they are applied. Specifically, our focus will be on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form.

  20. In Praise of Numerical Computation

    NASA Astrophysics Data System (ADS)

    Yap, Chee K.

    Theoretical Computer Science has developed an almost exclusively discrete/algebraic persona. We have effectively shut ourselves off from half of the world of computing: a host of problems in Computational Science & Engineering (CS&E) are defined on the continuum, and, for them, the discrete viewpoint is inadequate. The computational techniques in such problems are well-known to numerical analysis and applied mathematics, but are rarely discussed in theoretical algorithms: iteration, subdivision and approximation. By various case studies, I will indicate how our discrete/algebraic view of computing has many shortcomings in CS&E. We want to embrace the continuous/analytic view, but in a new synthesis with the discrete/algebraic view. I will suggest a pathway, by way of an exact numerical model of computation, that allows us to incorporate iteration and approximation into our algorithms' design. Some recent results give a peek into what this view of algorithmic development might look like, and its distinctive form suggests the name “numerical computational geometry” for such activities.
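
    The three techniques highlighted here, iteration, subdivision, and approximation, can be seen together in a tiny example that isolates the real roots of a function by subdividing an interval and refining each sign-change cell by bisection; the test function, cell count, and tolerance are assumptions for illustration.

```python
# Small sketch of the subdivision/iteration/approximation paradigm:
# isolate real roots on [a, b] by subdividing into cells and refining
# each cell that brackets a sign change.
def isolate_roots(f, a, b, n_cells=64, tol=1e-9):
    """Assumes simple roots and a fine enough initial subdivision."""
    roots = []
    width = (b - a) / n_cells
    for i in range(n_cells):
        lo, hi = a + i * width, a + (i + 1) * width
        if f(lo) * f(hi) > 0:
            continue          # exclusion: no sign change detected in this cell
        while hi - lo > tol:  # iteration: bisection refines the bracket
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        roots.append(0.5 * (lo + hi))   # approximation to the requested accuracy
    return roots

if __name__ == "__main__":
    f = lambda x: x ** 3 - 2 * x     # assumed example; roots at -sqrt(2), 0, sqrt(2)
    print(isolate_roots(f, -2.0, 2.2))
```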

  1. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  2. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  3. Using the technique of computed tomography for nondestructive analysis of pharmaceutical dosage forms

    NASA Astrophysics Data System (ADS)

    de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Vinícius Chaud, Marco

    2013-05-01

    This work presents an alternative, non-conventional technique for evaluating physico-chemical properties of pharmaceutical dosage forms: we used computed tomography (CT) as a nondestructive technique to visualize the internal structures of pharmaceutical dosage forms and to conduct static and dynamic studies. The studies were conducted in both static and dynamic situations through the use of tomographic images generated by the scanner at the University of Sorocaba - Uniso. We have shown that through the use of tomographic images it is possible to conduct studies of porosity and density, analyses of morphological parameters, and dissolution studies. Our results are in agreement with the literature, showing that CT is a powerful tool for use in the pharmaceutical sciences.
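
    One of the measurements mentioned, porosity from tomographic images, reduces in its simplest form to thresholding the reconstructed volume and counting void voxels inside the tablet region; the synthetic volume and threshold below are assumptions for illustration, not the authors' protocol.

```python
# Hedged sketch: estimate porosity of a tablet from a (synthetic) CT volume by
# thresholding grey values and counting void voxels within the tablet region.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64^3 "reconstruction": a solid cylinder with random internal pores.
n = 64
z, y, x = np.mgrid[0:n, 0:n, 0:n]
tablet_mask = ((x - n / 2) ** 2 + (y - n / 2) ** 2 < (n / 3) ** 2) & (z > 8) & (z < 56)
volume = np.where(tablet_mask, 1.0, 0.0) + rng.normal(0, 0.05, (n, n, n))
pores = tablet_mask & (rng.random((n, n, n)) < 0.10)   # assumed 10% pore seeding
volume[pores] = 0.0

# Porosity estimate: fraction of voxels inside the tablet below a grey threshold.
threshold = 0.5
void_voxels = np.sum((volume < threshold) & tablet_mask)
porosity = void_voxels / tablet_mask.sum()
print(f"Estimated porosity: {porosity:.3f}")
```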

  4. Building a Data Science capability for USGS water research and communication

    NASA Astrophysics Data System (ADS)

    Appling, A.; Read, E. K.

    2015-12-01

    Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.

  5. Learning Analytics and Computational Techniques for Detecting and Evaluating Patterns in Learning: An Introduction to the Special Issue

    ERIC Educational Resources Information Center

    Martin, Taylor; Sherin, Bruce

    2013-01-01

    The learning sciences community's interest in learning analytics (LA) has been growing steadily over the past several years. Three recent symposia on the theme (at the American Educational Research Association 2011 and 2012 annual conferences, and the International Conference of the Learning Sciences 2012), organized by Paulo Blikstein, led…

  6. Topics in computational physics

    NASA Astrophysics Data System (ADS)

    Monville, Maura Edelweiss

    Computational Physics spans a broad range of applied fields extending beyond the borders of traditional physics tracks. Demonstrated flexibility, and the capability to switch to a new project and pick up the basics of the new field quickly, are among the essential requirements for a computational physicist. In line with these prerequisites, my thesis describes the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement, roughly equal to 18%, with respect to the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with the simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed at Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real-world scenarios, for the purpose of determining the system's reliability, material discrimination power, and limitations.

  7. Artificial intelligence in medicine.

    PubMed Central

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcomes in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper along with a review of important clinical applications. RESULTS: The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. DISCUSSION: Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for further, appropriately designed clinical trials before these emergent techniques find application in real clinical settings. PMID:15333167

  8. Artificial intelligence in medicine.

    PubMed

    Ramesh, A N; Kambhampati, C; Monson, J R T; Drew, P J

    2004-09-01

    Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcomes in many clinical scenarios. Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper along with a review of important clinical applications. The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for further, appropriately designed clinical trials before these emergent techniques find application in real clinical settings.
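
    As a concrete instance of the neural-network tool identified above as the most commonly used, the sketch below fits a small classifier to scikit-learn's built-in breast-cancer dataset to predict a diagnostic outcome; the dataset, architecture, and hyperparameters are illustrative assumptions, not a model from any of the reviewed studies.

```python
# Hedged illustration: an artificial neural network classifying clinical data
# (scikit-learn's built-in breast-cancer dataset, used only for demonstration).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize features, then train a small feed-forward network.
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16),
                                  max_iter=1000, random_state=0))
clf.fit(X_train, y_train)
print("Held-out diagnostic accuracy:", clf.score(X_test, y_test))
```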

  9. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

    The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  10. Preparing Students for Careers in Science and Industry with Computational Physics

    NASA Astrophysics Data System (ADS)

    Florinski, V. A.

    2011-12-01

    Funded by an NSF CAREER grant, the University of Alabama in Huntsville (UAH) has launched a new graduate program in Computational Physics. It is universally accepted that today's physics is done on a computer. The program bridges physics and computer science by teaching students modern, practical techniques for solving difficult physics problems on diverse computational platforms. Currently consisting of two courses first offered in the Fall of 2011, the program will eventually include five courses covering methods for fluid dynamics, particle transport via stochastic methods, and hybrid and PIC plasma simulations. UAH's unique location allows courses to be shaped through discussions with faculty, NASA/MSFC researchers, and local R&D business representatives, i.e., potential employers of the program's graduates. The students currently participating in the program have all begun their research careers in space and plasma physics; many are presenting their research at this meeting.

  11. Management and Analysis of Biological and Clinical Data: How Computer Science May Support Biomedical and Clinical Research

    NASA Astrophysics Data System (ADS)

    Veltri, Pierangelo

    The use of computer-based solutions for data management in biology and clinical science has contributed to improving quality of life and to obtaining research results in a shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from the patient). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications also draw on experience with real-world applications at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.

  12. Apollo experience report: Apollo lunar surface experiments package data processing system

    NASA Technical Reports Server (NTRS)

    Eason, R. L.

    1974-01-01

    Apollo Program experience in the processing of scientific data from the Apollo lunar surface experiments package, in which computers and associated hardware and software were used, is summarized. The facility developed for the preprocessing of the lunar science data is described, as are several computer facilities and programs used by the Principal Investigators. The handling, processing, and analyzing of lunar science data and the interface with the Principal Investigators are discussed. Pertinent problems that arose in the development of the data processing schemes are discussed so that future programs may benefit from the solutions to those problems. The evolution of the data processing techniques for lunar science data is related to recommendations for future programs of this type.

  13. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  14. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
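
    As an illustration of the kind of spatial data structure the paper has in mind for efficient storage and retrieval of spatially-indexed data, the following is a minimal point-quadtree sketch in Python; the class, capacity and coordinates are invented for illustration and are not part of the original work.

      # Minimal point quadtree: recursive subdivision of a bounding box,
      # supporting insertion and rectangular range queries.
      class QuadTree:
          def __init__(self, x0, y0, x1, y1, capacity=4):
              self.bounds = (x0, y0, x1, y1)
              self.capacity = capacity
              self.points = []
              self.children = None

          def insert(self, x, y):
              x0, y0, x1, y1 = self.bounds
              if not (x0 <= x < x1 and y0 <= y < y1):
                  return False
              if self.children is None and len(self.points) < self.capacity:
                  self.points.append((x, y))
                  return True
              if self.children is None:
                  self._split()
              return any(c.insert(x, y) for c in self.children)

          def _split(self):
              x0, y0, x1, y1 = self.bounds
              mx, my = (x0 + x1) / 2, (y0 + y1) / 2
              self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                               QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
              for p in self.points:                       # push existing points down
                  any(c.insert(*p) for c in self.children)
              self.points = []

          def query(self, qx0, qy0, qx1, qy1):
              """Return all stored points inside the query rectangle."""
              x0, y0, x1, y1 = self.bounds
              if qx1 < x0 or qx0 > x1 or qy1 < y0 or qy0 > y1:
                  return []                               # no overlap: prune this subtree
              hits = [p for p in self.points if qx0 <= p[0] <= qx1 and qy0 <= p[1] <= qy1]
              if self.children:
                  for c in self.children:
                      hits.extend(c.query(qx0, qy0, qx1, qy1))
              return hits

      tree = QuadTree(0, 0, 100, 100)
      for x, y in [(10, 10), (20, 80), (55, 55), (90, 15), (60, 60)]:
          tree.insert(x, y)
      print(tree.query(40, 40, 70, 70))                  # -> [(55, 55), (60, 60)]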

  15. Computed tomography: Will the slices reveal the truth

    PubMed Central

    Haridas, Harish; Mohan, Abarajithan; Papisetti, Sravanthi; Ealla, Kranti K. R.

    2016-01-01

    With the advances in the field of imaging sciences, new methods have been developed in dental radiology. These include digital radiography, density analyzing methods, cone beam computed tomography (CBCT), magnetic resonance imaging, ultrasound, and nuclear imaging techniques, which provide high-resolution detailed images of oral structures. The current review aims to critically elaborate the use of CBCT in endodontics. PMID:27652253

  16. Effect of Jigsaw II, Reading-Writing-Presentation, and Computer Animations on the Teaching of "Light" Unit

    ERIC Educational Resources Information Center

    Koç, Yasemin; Yildiz, Emre; Çaliklar, Seyma; Simsek, Ümit

    2016-01-01

    The aim of this study is to determine the effect of Jigsaw II technique, reading-writing-presentation method, and computer animation on students' academic achievements, epistemological beliefs, attitudes towards science lesson, and the retention of knowledge in the "Light" unit covered in the 7th grade. The sample of the study consists…

  17. A 21st Century Science, Technology, and Innovation Strategy for Americas National Security

    DTIC Science & Technology

    2016-05-01

    areas. Advanced Computing and Communications The exponential growth of the digital economy, driven by ubiquitous computing and communication...weapons-focused R&D, many of the capabilities being developed have significant dual-use potential. Digital connectivity, for instance, brings...scale than traditional recombinant DNA techniques, and to share these designs digitally. Nanotechnology promises the ability to engineer entirely

  18. Filling the gap between biology and computer science

    PubMed Central

    Aguilar-Ruiz, Jesús S; Moore, Jason H; Ritchie, Marylyn D

    2008-01-01

    This editorial introduces BioData Mining, a new journal which publishes research articles related to advances in computational methods and techniques for the extraction of useful knowledge from heterogeneous biological data. We outline the aims and scope of the journal, introduce the publishing model and describe the open peer review policy, which fosters interaction within the research community. PMID:18822148

  19. International Instrumentation Symposium, 34th, Albuquerque, NM, May 2-6, 1988, Proceedings

    NASA Astrophysics Data System (ADS)

    Various papers on aerospace instrumentation are presented. The general topics addressed include: blast and shock, wind tunnel instrumentations and controls, digital/optical sensors, software design/development, special test facilities, fiber optic techniques, electro/fiber optical measurement systems, measurement uncertainty, real time systems, pressure. Also discussed are: flight test and avionics instrumentation, data acquisition techniques, computer applications, thermal force and displacement, science and government, modeling techniques, reentry vehicle testing, strain and pressure.

  20. Restricted access processor - An application of computer security technology

    NASA Technical Reports Server (NTRS)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  1. An introduction to metabolomics and its potential application in veterinary science.

    PubMed

    Jones, Oliver A H; Cheung, Victoria L

    2007-10-01

    Metabolomics has been found to be applicable to a wide range of fields, including the study of gene function, toxicology, plant sciences, environmental analysis, clinical diagnostics, nutrition, and the discrimination of organism genotypes. This approach combines high-throughput sample analysis with computer-assisted multivariate pattern-recognition techniques. It is increasingly being deployed in toxico- and pharmacokinetic studies in the pharmaceutical industry, especially during the safety assessment of candidate drugs in human medicine. However, despite the potential of this technique to reduce both costs and the numbers of animals used for research, examples of the application of metabolomics in veterinary research are, thus far, rare. Here we give an introduction to metabolomics and discuss its potential in the field of veterinary science.
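
    As an illustration of the computer-assisted multivariate pattern-recognition step mentioned above, the following is a minimal sketch of principal component analysis applied to simulated spectra; the data, group sizes and elevated-metabolite region are invented, and the example assumes Python with NumPy.

      import numpy as np

      rng = np.random.default_rng(1)
      n_bins = 200                                        # spectral bins per sample
      controls = rng.normal(0.0, 1.0, size=(20, n_bins))
      treated = rng.normal(0.0, 1.0, size=(20, n_bins))
      treated[:, 50:60] += 3.0                            # a metabolite elevated by treatment

      X = np.vstack([controls, treated])
      Xc = X - X.mean(axis=0)                             # mean-centre each spectral bin
      U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
      scores = Xc @ Vt[:2].T                              # project onto the first two PCs
      explained = S[:2] ** 2 / (S ** 2).sum()

      print("variance explained by PC1, PC2:", np.round(explained, 2))
      print("mean PC1 score, controls vs treated:",
            scores[:20, 0].mean().round(2), scores[20:, 0].mean().round(2))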

  2. University of Washington's eScience Institute Promotes New Training and Career Pathways in Data Science

    NASA Astrophysics Data System (ADS)

    Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.

    2015-12-01

    Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William D; Johansen, Hans; Evans, Katherine J

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
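
    As an illustration of why the implicit time integration mentioned above matters, the following minimal sketch (not the authors' code) contrasts explicit and implicit Euler steps on a stiff test equation dy/dt = -k*y with a time step much larger than 1/k; all values are assumptions chosen for illustration.

      import numpy as np

      k, dt, nsteps = 100.0, 0.05, 40            # dt is far larger than the 1/k time scale
      y_exp, y_imp = 1.0, 1.0
      for _ in range(nsteps):
          y_exp = y_exp + dt * (-k * y_exp)      # forward (explicit) Euler: unstable for k*dt > 2
          y_imp = y_imp / (1.0 + k * dt)         # backward (implicit) Euler: unconditionally stable

      print(f"forward Euler:  {y_exp:.3e}")      # magnitude grows without bound
      print(f"backward Euler: {y_imp:.3e}")      # decays toward 0, like the true solution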

  4. An Overview of High Performance Computing and Challenges for the Future

    ScienceCinema

    Google Tech Talks

    2017-12-09

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.
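
    As an illustration of the memory-hierarchy management the talk refers to, the following is a minimal cache-blocking (tiling) sketch for a matrix product; the block size and matrix shapes are arbitrary, and the example is a conceptual Python/NumPy sketch rather than production HPC code.

      import numpy as np

      def blocked_matmul(A, B, block=64):
          # Compute C = A @ B one small tile at a time so that the working
          # set of each update fits in cache (the idea, not the performance,
          # carries over to tuned C/Fortran kernels).
          n, m = A.shape
          m2, p = B.shape
          assert m == m2
          C = np.zeros((n, p))
          for i in range(0, n, block):
              for k in range(0, m, block):
                  for j in range(0, p, block):
                      C[i:i+block, j:j+block] += A[i:i+block, k:k+block] @ B[k:k+block, j:j+block]
          return C

      A = np.random.rand(200, 300)
      B = np.random.rand(300, 150)
      assert np.allclose(blocked_matmul(A, B), A @ B)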

  5. An Overview of High Performance Computing and Challenges for the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Google Tech Talks

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  6. Large Scale Many-Body Perturbation Theory calculations: methodological developments, data collections, validation

    NASA Astrophysics Data System (ADS)

    Govoni, Marco; Galli, Giulia

    Green's function based many-body perturbation theory (MBPT) methods are well established approaches to compute quasiparticle energies and electronic lifetimes. However, their application to large systems - for instance to heterogeneous systems, nanostructured, disordered, and defective materials - has been hindered by high computational costs. We will discuss recent MBPT methodological developments leading to an efficient formulation of electron-electron and electron-phonon interactions, and that can be applied to systems with thousands of electrons. Results using a formulation that does not require the explicit calculation of virtual states, nor the storage and inversion of large dielectric matrices will be presented. We will discuss data collections obtained using the WEST code, the advantages of the algorithms used in WEST over standard techniques, and the parallel performance. Work done in collaboration with I. Hamada, R. McAvoy, P. Scherpelz, and H. Zheng. This work was supported by MICCoM, as part of the Computational Materials Sciences Program funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division and by ANL.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dang, Liem X.; Vo, Quynh N.; Nilsson, Mikael

    We report one of the first simulations using a classical rate theory approach to predict the mechanism of the exchange process between water and aqueous uranyl ions. Using our water and ion-water polarizable force fields and molecular dynamics techniques, we computed the potentials of mean force for the uranyl ion-water pair as a function of pressure at ambient temperature. Subsequently, these simulated potentials of mean force were used to calculate rate constants using transition rate theory; the time-dependent transmission coefficients were also examined using the reactive flux method and Grote-Hynes treatments of the dynamic response of the solvent. The computed activation volumes using transition rate theory and the corrected rate constants are positive; thus the mechanism of this particular water exchange is a dissociative process. We discuss our rate theory results and compare them with previous studies in which non-polarizable force fields were used. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.

  8. Using Technology to Enhance the Effectiveness of General Chemistry Laboratory Courses

    ERIC Educational Resources Information Center

    Carvalho-Knighton, Kathleen M.; Keen-Rocha, Linda

    2007-01-01

    The effectiveness of two different laboratory techniques for teaching science majors in a general chemistry laboratory is compared. The results demonstrated that student laboratory activities with computer-interface systems could improve student understanding.

  9. Quantum rendering

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.

    2003-08-01

    In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.
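
    As an illustration of the amplitude amplification underlying the Grover-based rendering ideas above, the following is a minimal classical state-vector simulation of Grover's search; the register size and marked index are arbitrary, and the sketch is not the paper's implementation.

      import numpy as np

      n_qubits = 8
      N = 2 ** n_qubits
      marked = 173                                   # index of the "hidden" item (e.g. nearest fragment)

      state = np.full(N, 1.0 / np.sqrt(N))           # uniform superposition over all N items
      iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
      for _ in range(iterations):
          state[marked] *= -1.0                      # oracle: flip the phase of the marked item
          state = 2 * state.mean() - state           # diffusion: inversion about the mean

      probability = state[marked] ** 2
      print(f"{iterations} Grover iterations; P(marked) = {probability:.4f}")   # close to 1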

  10. Correlative visualization techniques for multidimensional data

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Goettsche, Craig

    1989-01-01

    Critical to the understanding of data is the ability to provide pictorial or visual representations of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, computer science domains outside of computer graphics, such as data management, are required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc. to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, which can range from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in a data visualization system.

  11. High resolution spectroscopic mapping imaging applied in situ to multilayer structures for stratigraphic identification of painted art objects

    NASA Astrophysics Data System (ADS)

    Karagiannis, Georgios Th.

    2016-04-01

    The development of non-destructive techniques is a reality in the field of conservation science. These techniques are usually not as accurate as analytical micro-sampling techniques; however, the proper development of soft-computing methods can improve their accuracy. In this work, we propose a real-time, fast-acquisition spectroscopic mapping imaging system that operates from the ultraviolet to the mid-infrared (UV/Vis/nIR/mIR) region of the electromagnetic spectrum and is supported by a set of soft-computing methods to identify the materials that exist in a stratigraphic structure of paint layers. In particular, the system acquires spectra in diffuse-reflectance mode, scanning a Region-Of-Interest (ROI), over a wavelength range from 200 up to 5000 nm. A fuzzy c-means clustering algorithm, the particular soft-computing method employed, produces the mapping images. The method was evaluated on a Byzantine painted icon.
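
    The following is a minimal sketch of the fuzzy c-means clustering step described above, applied to toy two-band "spectra"; the number of clusters, the fuzzifier m and the data are assumptions for illustration, not the authors' settings.

      import numpy as np

      def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
          # Each sample gets a soft membership in every cluster rather than a hard label.
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)                # memberships sum to 1 per sample
          for _ in range(n_iter):
              Um = U ** m
              centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
              # squared distance from every sample to every cluster centre
              d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
              inv = d2 ** (-1.0 / (m - 1.0))
              U = inv / inv.sum(axis=1, keepdims=True)     # standard FCM membership update
          return centers, U

      # toy "spectra": 2-band reflectance values for pixels from three materials
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(mu, 0.05, size=(50, 2))
                     for mu in ([0.2, 0.8], [0.5, 0.5], [0.9, 0.1])])
      centers, U = fuzzy_c_means(X)
      print(np.round(centers, 2))                          # approximately the three material signatures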

  12. Acoustic Source Bearing Estimation (ASBE) computer program development

    NASA Technical Reports Server (NTRS)

    Wiese, Michael R.

    1987-01-01

    A new bearing estimation algorithm (Acoustic Source Analysis Technique - ASAT) and an acoustic analysis computer program (Acoustic Source Bearing Estimation - ASBE), developed by Computer Sciences Corporation for NASA Langley Research Center, are described. The ASBE program is used by the Acoustics Division/Applied Acoustics Branch and the Instrument Research Division/Electro-Mechanical Instrumentation Branch to analyze acoustic data and estimate the azimuths from which the source signals radiated. Included are the input and output from a benchmark test case.
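
    The ASAT algorithm itself is not described in the abstract; as a generic illustration of bearing estimation from acoustic data, the following sketch estimates the time delay between two simulated microphones by cross-correlation and converts it to an azimuth. All parameters and signals are invented for illustration.

      import numpy as np

      fs = 10_000.0                                  # sample rate (Hz)
      c = 343.0                                      # speed of sound (m/s)
      d = 1.0                                        # sensor separation (m)
      true_azimuth = np.deg2rad(35.0)

      rng = np.random.default_rng(0)
      t = np.arange(0, 0.2, 1 / fs)
      signal = np.sin(2 * np.pi * 180 * t) * np.exp(-10 * t)      # synthetic source waveform
      delay = d * np.sin(true_azimuth) / c                        # inter-sensor delay (s)
      shift = int(round(delay * fs))
      mic1 = signal + 0.02 * rng.normal(size=t.size)
      mic2 = np.roll(signal, shift) + 0.02 * rng.normal(size=t.size)

      corr = np.correlate(mic2, mic1, mode="full")
      lag = np.argmax(corr) - (t.size - 1)                        # lag in samples
      est_delay = lag / fs
      est_azimuth = np.arcsin(np.clip(est_delay * c / d, -1, 1))
      print(f"estimated azimuth: {np.rad2deg(est_azimuth):.1f} deg (true: 35.0 deg)")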

  13. Applications of hybrid and digital computation methods in aerospace-related sciences and engineering. [problem solving methods at the University of Houston

    NASA Technical Reports Server (NTRS)

    Huang, C. J.; Motard, R. L.

    1978-01-01

    The computing equipment in the engineering systems simulation laboratory of the Houston University Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. Research supports programs in acoustics, energy technology, systems engineering, and environment management as well as aerospace engineering.

  14. The GI Project: a prototype electronic textbook for high school biology.

    PubMed

    Calhoun, P S; Fishman, E K

    1997-01-01

    A prototype electronic science textbook for secondary education was developed to help bridge the gap between state-of-the-art medical technology and the basic science classroom. The prototype combines the latest radiologic imaging techniques with a user-friendly multimedia computer program to teach the anatomy, physiology, and diseases of the gastrointestinal (GI) tract. The program includes original text, illustrations, photographs, animations, images from upper GI studies, plain radiographs, computed tomographic images, and three-dimensional reconstructions. These features are intended to create a stimulus-rich environment in which the high school science student can enjoy a variety of interactive experiences that will facilitate the learning process. The computer-based book is a new educational tool that promises to play a prominent role in the coming years. Although it is not yet clear what form textbooks will take in the future, computer-based books are already proving valuable as an alternative educational medium. For beginning students, they reinforce the material found in traditional textbooks and class presentations; for advanced students, they provide motivation to learn outside the traditional classroom.

  15. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  16. Designing a hands-on brain computer interface laboratory course.

    PubMed

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2016-08-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI.
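
    As an illustration of the kind of data-analysis exercise such a course covers, the following is a minimal sketch of alpha-band (8-12 Hz) power estimation from a simulated EEG epoch; it is not taken from the course materials, and the sample rate and amplitudes are assumptions.

      import numpy as np

      fs = 250.0                                     # EEG sample rate (Hz)
      rng = np.random.default_rng(0)
      t = np.arange(0, 4, 1 / fs)                    # one 4-second epoch
      eeg = (20e-6 * np.sin(2 * np.pi * 10 * t)      # simulated 10 Hz alpha rhythm
             + 5e-6 * rng.normal(size=t.size))       # background noise

      spectrum = np.abs(np.fft.rfft(eeg)) ** 2 / t.size
      freqs = np.fft.rfftfreq(t.size, d=1 / fs)
      alpha_power = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
      total_power = spectrum[(freqs >= 1) & (freqs <= 40)].sum()
      print(f"relative alpha power: {alpha_power / total_power:.2f}")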

  17. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO{sub 2}; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO{sub 2} storage; and investigation of innovative, bio-based mitigation strategies.

  18. Effect of Web Assisted Education Supported by Six Thinking Hats on Students' Academic Achievement in Science and Technology Classes

    ERIC Educational Resources Information Center

    Ercan, Orhan; Bilen, Kadir

    2014-01-01

    Advances in computer technologies and adoption of related methods and techniques in education have developed parallel to each other. This study focuses on the need to utilize more than one teaching method and technique in education rather than focusing on a single teaching method. By using the pre-test post-test and control group semi-experimental…

  19. Proceedings ICASS 2017

    NASA Astrophysics Data System (ADS)

    Fu, Qiang; Schaaf, Peter

    2018-07-01

    This special issue of the high impact international peer reviewed journal Applied Surface Science represents the proceedings of the 2nd International Conference on Applied Surface Science ICASS held 12-16 June 2017 in Dalian China. The conference provided a forum for researchers in all areas of applied surface science to present their work. The main topics of the conference are in line with the most popular areas of research reported in Applied Surface Science. Thus, this issue includes current research on the role and use of surfaces in chemical and physical processes, related to catalysis, electrochemistry, surface engineering and functionalization, biointerfaces, semiconductors, 2D-layered materials, surface nanotechnology, energy, new/functional materials and nanotechnology. Also the various techniques and characterization methods will be discussed. Hence, scientific research on the atomic and molecular level of material properties investigated with specific surface analytical techniques and/or computational methods is essential for any further progress in these fields.

  20. Modeling the Milky Way: Spreadsheet Science.

    ERIC Educational Resources Information Center

    Whitmer, John C.

    1990-01-01

    Described is the generation of a scale model of the solar system and the Milky Way galaxy using a computer spreadsheet program. A sample spreadsheet, including cell formulas, is provided. Suggestions for using this activity as a teaching technique are included. (CW)
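
    The same scaling computation the spreadsheet performs can be sketched in a few lines of Python; the chosen model size and the rounded astronomical distances below are illustrative assumptions, not values from the article.

      sun_diameter_km = 1.39e6
      model_sun_m = 0.10                               # model Sun: a 10 cm ball
      scale = model_sun_m / sun_diameter_km            # metres of model per real kilometre

      distances_km = {
          "Earth's orbit": 1.496e8,
          "Neptune's orbit": 4.495e9,
          "Nearest star (Proxima Cen)": 4.0e13,
          "Galactic centre": 2.5e17,
      }
      for name, d_km in distances_km.items():
          print(f"{name:28s} -> {d_km * scale:.3g} m in the model")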

  1. Prognostic Modeling and Experimental Techniques for Electrolytic Capacitor Health Monitoring

    DTIC Science & Technology

    2011-09-01

    result in errors in the Inertial Navigation (INAV) computations of position and heading, causing the aircraft to fly off course [2, 3]. Chetan... Science Instruments, 2010. 21) IEC 60384-4-1 Fixed capacitors for use in electronic equipment

  2. Abstracts of Research. July 1974-June 1975.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Computer and Information Science Research Center.

    Abstracts of research papers in computer and information science are given for 68 papers in the areas of information storage and retrieval; human information processing; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems…

  3. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication.

    PubMed

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. Using this variety of keywords, 78 publications were initially identified. The abstracts of these 78 articles were scanned, and 52 publications were selected for detailed reading. The full text of these articles was obtained and examined in detail. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review; 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow the digital planning and manufacturing of removable dentures from start to finish. According to the literature reviewed, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become largely a matter of medical informatics rather than of manual technical procedures. However, the methods still have several limitations for now.

  4. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication

    PubMed Central

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. Using this variety of keywords, 78 publications were initially identified. The abstracts of these 78 articles were scanned, and 52 publications were selected for detailed reading. The full text of these articles was obtained and examined in detail. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review; 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow the digital planning and manufacturing of removable dentures from start to finish. According to the literature reviewed, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become largely a matter of medical informatics rather than of manual technical procedures. However, the methods still have several limitations for now. PMID:27095912

  5. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.

  6. The Future of Pharmaceutical Manufacturing Sciences

    PubMed Central

    2015-01-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. With that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association. J Pharm Sci 104:3612-3638, 2015 PMID:26280993

  7. The Future of Pharmaceutical Manufacturing Sciences.

    PubMed

    Rantanen, Jukka; Khinast, Johannes

    2015-11-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. With that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
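
    As an illustration of the design-of-experiments (DoE) techniques the review opens with, the following is a minimal two-level full factorial sketch with main effects estimated by least squares; the factor names and response values are hypothetical and chosen only for illustration.

      import itertools
      import numpy as np

      factors = ["temperature", "binder_amount", "mixing_time"]
      design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)   # 2^3 runs

      # hypothetical measured response (e.g. granule yield, %) for each run
      yield_pct = np.array([72, 75, 80, 84, 70, 74, 79, 85], dtype=float)

      X = np.column_stack([np.ones(len(design)), design])        # intercept + main effects
      coef, *_ = np.linalg.lstsq(X, yield_pct, rcond=None)
      for name, effect in zip(["intercept"] + factors, coef):
          print(f"{name:14s} {effect:+.2f}")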

  8. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    NASA Astrophysics Data System (ADS)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

    By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons-learned for continuing development. Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and for storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file-systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel distributed across cyber infrastructure environments having different architectures. We have used the Pegasus Work Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing), involves establishing a distributed environment, where issues of, e.g, remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services. 
In most of our work, we provisioned compute resources using a custom application, called Wrangler. Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
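
    As an illustration of the CPU-bound periodogram computation described above (not the NASA Star and Exoplanet Database code itself), the following is a minimal Lomb-Scargle sketch for an unevenly sampled synthetic light curve; the per-frequency loop is what makes the workload embarrassingly parallel. All data and grid choices are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 30, size=400))          # irregular observation times (days)
      period = 2.7
      y = 1.0 + 0.05 * np.sin(2 * np.pi * t / period) + 0.02 * rng.normal(size=t.size)

      def lomb_scargle(t, y, freqs):
          # Classical Lomb-Scargle power at each trial frequency.
          y = y - y.mean()
          power = np.empty(freqs.size)
          for i, f in enumerate(freqs):
              w = 2 * np.pi * f
              tau = np.arctan2(np.sum(np.sin(2 * w * t)), np.sum(np.cos(2 * w * t))) / (2 * w)
              c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
              power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
          return power

      freqs = np.linspace(0.05, 2.0, 2000)               # trial frequencies (cycles/day)
      power = lomb_scargle(t, y, freqs)
      best = 1.0 / freqs[np.argmax(power)]
      print(f"strongest period: {best:.2f} days")        # close to the injected 2.7 days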

  9. ONRASIA Scientific Information Bulletin. Volume 8, Number 3, July- September 1993

    DTIC Science & Technology

    1993-09-01

    Dr. Steven F. Ashby, Computing Sciences Department ... the Ninth Symposium on Preconditioned Conjugate Gradient Methods, which he organized ... Preconditioned Conjugate Gradient Methods, held at Keio University (Yokohama). During this meeting, I discussed iterative methods for linear systems with ... and is currently a topic of considerable interest in the United States. In Japan, on the other hand, this technique does not appear to be too well

  10. Standards guide for space and earth sciences computer software

    NASA Technical Reports Server (NTRS)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  11. Electronic Circuit Analysis Language (ECAL)

    NASA Astrophysics Data System (ADS)

    Chenghang, C.

    1983-03-01

    The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. The Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language which uses the FORTRAN language to carry out its interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
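
    As an illustration of the dc analysis such a language performs internally (ECAL itself is FORTRAN-based and not shown here), the following minimal sketch builds the nodal conductance matrix of a three-resistor circuit and solves G v = i for the node voltages; component values are arbitrary.

      import numpy as np

      # circuit: 10 V source drives node 1 through R1; R2 from node 1 to node 2;
      # R3 from node 2 to ground.  The source + R1 is treated as a Norton
      # equivalent current injection of V/R1 at node 1.
      R1, R2, R3 = 1e3, 2e3, 3e3
      V_source = 10.0

      G = np.array([[1/R1 + 1/R2, -1/R2],
                    [-1/R2,        1/R2 + 1/R3]])
      i = np.array([V_source / R1, 0.0])

      v = np.linalg.solve(G, i)
      print(f"node voltages: v1 = {v[0]:.3f} V, v2 = {v[1]:.3f} V")   # 8.333 V and 5.000 V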

  12. Cumulative Reports and Publications through December 31, 1989 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-05-01

    Research is conducted primarily by visiting scientists from universities and industry who have resident appointments for limited periods of time, and...Elsevier Science Publishers B. V. (North-Holland), IFIP, 1989. Crowley, Kay, Joel Saltz, Ravi Mirchandaney, and Harry Berryman: Run-time scheduling...Inverse problem techniques for beams with tip body and time hysteresis damping. ICASE Report No. 89-22, April 18, 1989. 24 pages. To appear in

  13. Advanced Training Techniques Using Computer Generated Imagery.

    DTIC Science & Technology

    1981-09-15

    Annual Technical Report for Period 16 May 1980 - 15 July 1981. Prepared for AIR FORCE OFFICE OF SCIENTIFIC RESEARCH, Director of Life Sciences, Building...Simulation Management Branch, ATC, Randolph AFB, TX 78148, November 1977. Allbee, K. F., Semple, C. A.; Aircrew Training Devices Life Cycle Cost and Worth...in Simulator Design and Application, Life Sciences, Inc., 227 Loop 820 NE, Hurst, Texas 76053, AFOSR-TR-77-0965, 30 September 1976. McDonnell Aircraft

  14. Cyber Science, Biometrics and Digital Forensics: Workshop on Emerging Cyber Techniques and Technologies

    DTIC Science & Technology

    2016-09-07

    and the University of Southern California have been collaborating on a proposal led by Florida International University's School of Computing...security. We will develop an action plan to identify needs, assess vulnerabilities and address disruptive technologies that could clearly provide a ...Institute of Technology and his Bachelor of Science degree in Aerospace Engineering, Polytechnic University of New York. Mr. Hurtado is a member of the

  15. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.

  16. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation image. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, the recent advances in CT imaging technique and 3D visualization of the hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction technique, contrast-enhanced techniques, new application of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  17. Designing a Hands-On Brain Computer Interface Laboratory Course

    PubMed Central

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2017-01-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI. PMID:28268946

  18. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  19. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  20. Viewpoints: A New Computer Program for Interactive Exploration of Large Multivariate Space Science and Astrophysics Data.

    NASA Astrophysics Data System (ADS)

    Levit, Creon; Gazis, P.

    2006-06-01

    The graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.

  1. Visualization Techniques in Space and Atmospheric Sciences

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P. (Editor); Bredekamp, Joseph H. (Editor)

    1995-01-01

    Unprecedented volumes of data will be generated by research programs that investigate the Earth as a system and the origin of the universe, which will in turn require analysis and interpretation that will lead to meaningful scientific insight. Providing a widely distributed research community with the ability to access, manipulate, analyze, and visualize these complex, multidimensional data sets depends on a wide range of computer science and technology topics. Data storage and compression, data base management, computational methods and algorithms, artificial intelligence, telecommunications, and high-resolution display are just a few of the topics addressed. A unifying theme throughout the papers with regards to advanced data handling and visualization is the need for interactivity, speed, user-friendliness, and extensibility.

  2. Applications of penetrating radiation for small animal imaging

    NASA Astrophysics Data System (ADS)

    Hasegawa, Bruce H.; Wu, Max C.; Iwata, Koji; Hwang, Andrew B.; Wong, Kenneth H.; Barber, William C.; Dae, Michael W.; Sakdinawat, Anne E.

    2002-11-01

    Researchers long have relied on research involving small animals to unravel scientific mysteries in the biological sciences, and to develop new diagnostic and therapeutic techniques in the medical and health sciences. Within the past 2 decades, new techniques have been developed to manipulate the genome of the mouse, allowing the development of transgenic and knockout models of mammalian and human disease, development, and physiology. Traditionally, much biological research involving small animals has relied on the use of invasive methods such as organ harvesting, tissue sampling, and autoradiography during which the animal was sacrificed to perform a single measurement. More recently, imaging techniques have been developed that assess anatomy and physiology in the intact animal, in a way that allows the investigator to follow the progression of disease, or to monitor the response to therapeutic interventions. Imaging techniques that use penetrating radiation at millimeter or submillimeter levels to image small animals include x-ray computed tomography (microCT), single-photon emission computed tomography (microSPECT), and positron emission tomography (microPET). MicroCT generates cross-sectional slices which reveal the structure of the object with spatial resolution in the range of 50 to 100 microns. MicroSPECT and microPET are radionuclide imaging techniques in which a radiopharmaceutical is injected into the animal and accumulates according to metabolism, blood flow, bone remodeling, tumor growth, or other biological processes. Both microSPECT and microPET offer spatial resolutions in the range of 1-2 millimeters. However, microPET records annihilation photons produced by a positron-emitting radiopharmaceutical using electronic coincidence, and has a sensitivity approximately two orders of magnitude better than microSPECT, while microSPECT is compatible with gamma-ray emitting radiopharmaceuticals that are less expensive and more readily available than those used with microPET. High-resolution dual-modality imaging systems now are being developed that combine microPET or microSPECT with microCT in a way that facilitates more direct correlation of anatomy and physiology in the same animal. Small animal imaging allows researchers to perform experiments that are not possible with conventional invasive techniques, and these methods thereby are becoming increasingly important tools for discovery of fundamental biological information, and development of new diagnostic and therapeutic techniques in the biomedical sciences.

  3. Continual Response Measurement: Design and Validation.

    ERIC Educational Resources Information Center

    Baggaley, Jon

    1987-01-01

    Discusses reliability and validity of continual response measurement (CRM), a computer-based measurement technique, and its use in social science research. Highlights include the importance of criterion-referencing the data, guidelines for designing studies using CRM, examples typifying their deductive and inductive functions, and a discussion of…

  4. Grasping Reality Through Illusion: Interactive Graphics Serving Science

    DTIC Science & Technology

    1988-03-01

    Brings SIGGRAPH-style interactive graphics techniques to the enhancement of scientific computing; StarTours at Disneyland shows how stunningly far we have come. Universal, application-independent supercomputer matching and steering tools are needed. Reference cited: Bergman, L., Fuchs, H., Grant, E., Spach, S. [1986].

  5. Numerical computation of linear instability of detonations

    NASA Astrophysics Data System (ADS)

    Kabanov, Dmitry; Kasimov, Aslan

    2017-11-01

    We propose a method to study linear stability of detonations by direct numerical computation. The linearized governing equations together with the shock-evolution equation are solved in the shock-attached frame using a high-resolution numerical algorithm. The computed results are processed by the Dynamic Mode Decomposition technique to generate dispersion relations. The method is applied to the reactive Euler equations with simple-depletion chemistry as well as more complex multistep chemistry. The results are compared with those known from normal-mode analysis. We acknowledge financial support from King Abdullah University of Science and Technology.
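
    As a hedged illustration of the Dynamic Mode Decomposition post-processing step mentioned above, the sketch below recovers complex growth rates and frequencies from a sequence of solution snapshots; the synthetic single-mode field and the rank truncation are assumptions for illustration and do not reproduce the authors' solver or data.

        # Minimal DMD sketch (illustrative, not the authors' implementation):
        # extract complex growth rates and frequencies from solution snapshots.
        import numpy as np

        def dmd_eigs(snapshots, dt, rank=None):
            """snapshots: (n, m) array, column k is the state at time k*dt.
            Returns continuous-time DMD eigenvalues lambda = log(mu)/dt."""
            X, Y = snapshots[:, :-1], snapshots[:, 1:]
            U, s, Vh = np.linalg.svd(X, full_matrices=False)
            if rank is not None:
                U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
            # Reduced operator approximating the one-step propagator.
            A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
            mu = np.linalg.eigvals(A_tilde)
            return np.log(mu) / dt   # real part = growth rate, imag part = frequency

        # Synthetic test: one growing oscillatory mode, growth rate 0.1, frequency 2.0.
        dt = 0.01
        t = np.arange(0.0, 10.0, dt)
        x = np.linspace(0.0, 1.0, 64)[:, None]
        mode = np.sin(np.pi * x) + 1j * np.sin(2 * np.pi * x)      # complex spatial structure
        field = np.real(mode * np.exp((0.1 + 2.0j) * t)[None, :])  # growing oscillation
        print(dmd_eigs(field, dt, rank=2))   # expect eigenvalues near 0.1 +/- 2.0i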

  6. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    NASA Astrophysics Data System (ADS)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.

  7. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    Software used includes SPF/PC, MS Word, Symphony, Mathematica, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of the Monte Carlo and other numerical techniques in computer simulation. FORTRAN remains the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400

  8. Computational Difficulties in the Identification and Optimization of Control Systems.

    DTIC Science & Technology

    1980-01-01

    As more realistic models for resource management are developed, the need for efficient computational techniques for parameter identification and optimization (optimal control) in "state" models grows. This research was supported in part by the National Science Foundation under grant NSF-MCS 79-05774.

  9. Phylo: A Citizen Science Approach for Improving Multiple Sequence Alignment

    PubMed Central

    Kam, Alfred; Kwak, Daniel; Leung, Clarence; Wu, Chu; Zarour, Eleyine; Sarmenta, Luis; Blanchette, Mathieu; Waldispühl, Jérôme

    2012-01-01

    Background Comparative genomics, or the study of the relationships of genome structure and function across different species, offers a powerful tool for studying evolution, annotating genomes, and understanding the causes of various genetic disorders. However, aligning multiple sequences of DNA, an essential intermediate step for most types of analyses, is a difficult computational task. In parallel, citizen science, an approach that takes advantage of the fact that the human brain is exquisitely tuned to solving specific types of problems, is becoming increasingly popular. There, instances of hard computational problems are dispatched to a crowd of non-expert human game players and solutions are sent back to a central server. Methodology/Principal Findings We introduce Phylo, a human-based computing framework applying “crowd sourcing” techniques to solve the Multiple Sequence Alignment (MSA) problem. The key idea of Phylo is to convert the MSA problem into a casual game that can be played by ordinary web users with minimal prior knowledge of the biological context. We applied this strategy to improve the alignment of the promoters of disease-related genes from up to 44 vertebrate species. Since the launch in November 2010, we have received more than 350,000 solutions submitted by more than 12,000 registered users. Our results show that solutions submitted contributed to improving the accuracy of up to 70% of the alignment blocks considered. Conclusions/Significance We demonstrate that, combined with classical algorithms, crowd computing techniques can be successfully used to help improve the accuracy of MSA. More importantly, we show that an NP-hard computational problem can be embedded in a casual game that can be easily played by people without significant scientific training. This suggests that citizen science approaches can be used to exploit the billions of “human-brain peta-flops” of computation that are spent every day playing games. Phylo is available at: http://phylo.cs.mcgill.ca. PMID:22412834
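
    A minimal, hedged sketch of the kind of alignment objective such a game can expose to players is a column-wise sum-of-pairs score; the match, mismatch, and gap values below are arbitrary assumptions and not Phylo's actual scoring scheme.

        # Illustrative sketch (not Phylo's scoring code): sum-of-pairs score for an
        # alignment block, the kind of objective a casual game can surface as points.
        MATCH, MISMATCH, GAP = 1, -1, -2

        def sum_of_pairs(alignment):
            """alignment: list of equal-length gapped sequences, e.g. ['AC-GT', 'ACAGT']."""
            score = 0
            for col in zip(*alignment):
                for i in range(len(col)):
                    for j in range(i + 1, len(col)):
                        a, b = col[i], col[j]
                        if a == '-' and b == '-':
                            continue                      # gap-gap pairs ignored
                        elif a == '-' or b == '-':
                            score += GAP
                        else:
                            score += MATCH if a == b else MISMATCH
            return score

        print(sum_of_pairs(["AC-GT", "ACAGT", "AC-GA"]))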

  10. Riding the Hype Wave: Evaluating new AI Techniques for their Applicability in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Zhang, J.; Maskey, M.; Lee, T. J.

    2016-12-01

    Every few years a new technology rides the hype wave generated by the computer science community. Converts to this new technology who surface from both the science community and the informatics community promulgate that it can radically improve or even change the existing scientific process. Recent examples of new technology following in the footsteps of "big data" now include deep learning algorithms and knowledge graphs. Deep learning algorithms mimic the human brain and process information through multiple stages of transformation and representation. These algorithms are able to learn complex functions that map pixels directly to outputs without relying on human-crafted features and solve some of the complex classification problems that exist in science. Similarly, knowledge graphs aggregate information around defined topics that enable users to resolve their query without having to navigate and assemble information manually. Knowledge graphs could potentially be used in scientific research to assist in hypothesis formulation, testing, and review. The challenge for the Earth science research community is to evaluate these new technologies by asking the right questions and considering what-if scenarios. What is this new technology enabling/providing that is innovative and different? Can one justify the adoption costs with respect to the research returns? Since nothing comes for free, utilizing a new technology entails adoption costs that may outweigh the benefits. Furthermore, these technologies may require significant computing infrastructure in order to be utilized effectively. Results from two different projects will be presented along with lessons learned from testing these technologies. The first project primarily evaluates deep learning techniques for different applications of image retrieval within Earth science while the second project builds a prototype knowledge graph constructed for Hurricane science.

  11. Computational Unification: a Vision for Connecting Researchers

    NASA Astrophysics Data System (ADS)

    Troy, R. M.; Kingrey, O. J.

    2002-12-01

    Computational Unification of science, once only a vision, is becoming a reality. This technology is based upon a scientifically defensible, general solution for Earth Science data management and processing. The computational unification of science offers a real opportunity to foster inter- and intra-discipline cooperation, and the end of 're-inventing the wheel'. As we move forward using computers as tools, it is past time to move from computationally isolating, "one-off" or discipline-specific solutions into a unified framework where research can be more easily shared, especially with researchers in other disciplines. The author will discuss how distributed meta-data, distributed processing and distributed data objects are structured to constitute a working interdisciplinary system, including how these resources lead to scientific defensibility through known lineage of all data products. Illustration of how scientific processes are encapsulated and executed illuminates how previously written processes and functions are integrated into the system efficiently and with minimal effort. Meta-data basics will illustrate how intricate relationships may easily be represented and used to good advantage. Retrieval techniques will be discussed including trade-offs of using meta-data versus embedded data, how the two may be integrated, and how simplifying assumptions may or may not help. This system is based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, whose goal was to find an alternative to the Hughes EOS-DIS system; it is presently offered by Science Tools corporation, of which the author is a principal.

  12. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE PAGES

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...

    2017-10-25

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  13. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  14. Petascale supercomputing to accelerate the design of high-temperature alloys

    NASA Astrophysics Data System (ADS)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen

    2017-12-01

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ‧-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
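
    The machine-learning step described in these three records can be illustrated with a small sketch: a regression model mapping elemental descriptors to interface segregation energies, so that new solutes could be screened without running additional DFT supercell calculations. The descriptors, the random-forest model, and the synthetic data below are placeholder assumptions, not the published workflow.

        # Illustrative sketch only: predict interface segregation energies from
        # simple elemental descriptors instead of running new DFT supercell runs.
        # Descriptor set, model choice, and data are placeholders, not the paper's.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_solutes = 34
        # Hypothetical descriptors per solute: atomic radius, electronegativity, valence.
        X = np.column_stack([
            rng.uniform(1.2, 2.0, n_solutes),    # radius (angstrom)
            rng.uniform(0.9, 2.5, n_solutes),    # Pauling electronegativity
            rng.integers(1, 7, n_solutes),       # nominal valence
        ])
        # Stand-in "DFT" segregation energies (eV); real values come from supercells.
        y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.05, n_solutes)

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        scores = cross_val_score(model, X, y, cv=5, scoring="r2")
        print("cross-validated R^2:", scores.mean())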

  15. Expertise transfer for expert system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boose, J.H.

    This book is about the Expertise Transfer System-a computer program which interviews experts and helps them build expert systems, i.e. computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.

  16. 2016 Energetic Materials Gordon Research Conference and Gordon Research Seminar Research Area 7: Chemical Sciences 7.0 Chemical Sciences (Dr. James K. Parker)

    DTIC Science & Technology

    2016-08-10

    Topics included thermal decomposition and mechanical damage of energetics. The program for the meeting included nine oral presentation sessions; speakers included Vincent Baijot (Laboratory for Analysis and Architecture of Systems, CNRS). Session themes were synthesis of new materials, performance, advanced diagnostics, experimental techniques, theoretical approaches, and computational models.

  17. Conceptual Knowledge Acquisition in Biomedicine: A Methodological Review

    PubMed Central

    Payne, Philip R.O.; Mendonça, Eneida A.; Johnson, Stephen B.; Starren, Justin B.

    2007-01-01

    The use of conceptual knowledge collections or structures within the biomedical domain is pervasive, spanning a variety of applications including controlled terminologies, semantic networks, ontologies, and database schemas. A number of theoretical constructs and practical methods or techniques support the development and evaluation of conceptual knowledge collections. This review will provide an overview of the current state of knowledge concerning conceptual knowledge acquisition, drawing from multiple contributing academic disciplines such as biomedicine, computer science, cognitive science, education, linguistics, semiotics, and psychology. In addition, multiple taxonomic approaches to the description and selection of conceptual knowledge acquisition and evaluation techniques will be proposed in order to partially address the apparent fragmentation of the current literature concerning this domain. PMID:17482521

  18. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  19. Industrial benefits and future expectations in materials and processes resulting from space technology

    NASA Technical Reports Server (NTRS)

    Meyer, J. D.

    1977-01-01

    Space technology transfer is discussed as applied to the field of materials science. Advances made in processing include improved computer techniques and structural analysis. Technology transfer is shown to have an important impact potential on the overall productivity of the United States.

  20. A Tutorial on Techniques and Applications for Natural Language Processing

    DTIC Science & Technology

    1983-10-17

    Problems mentioned above as specific to context-free grammars were tackled by linguists, in particular Chomsky [21, 22], through Transformational Grammar. (A Tutorial on Techniques and Applications for Natural Language Processing, Philip J. Hayes and Jaime G. Carbonell, Department of Computer Science, Carnegie-Mellon University, 17 October 1983.)

  1. Modular Approaches to Earth Science Scientific Computing: 3D Electromagnetic Induction Modeling as an Example

    NASA Astrophysics Data System (ADS)

    Tandon, K.; Egbert, G.; Siripunvaraporn, W.

    2003-12-01

    We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme, and also to reuse the components for a variety of problems in earth science computing, however diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data, one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding a feature such as this is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
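
    A minimal sketch of the modular idea described above, written in Python rather than the authors' code: the misfit routine sees only abstract forward-model and prior interfaces, so the EM solver, the regularization, or the boundary-condition handling can be swapped independently. All class and function names here are hypothetical.

        # Hedged sketch of a modular inversion layout (not the authors' implementation):
        # the inversion driver depends only on abstract interfaces.
        from abc import ABC, abstractmethod
        import numpy as np

        class ForwardModel(ABC):
            @abstractmethod
            def predict(self, model_params: np.ndarray) -> np.ndarray: ...

        class Prior(ABC):
            @abstractmethod
            def penalty(self, model_params: np.ndarray) -> float: ...

        class EMForward(ForwardModel):
            """Placeholder for a staggered-grid EM induction solver."""
            def predict(self, m):
                return np.tanh(m)          # stand-in for the real physics

        class SmoothnessPrior(Prior):
            def penalty(self, m):
                return float(np.sum(np.diff(m) ** 2))

        def misfit(m, data, fwd: ForwardModel, prior: Prior, reg=0.1):
            r = fwd.predict(m) - data
            return float(r @ r) + reg * prior.penalty(m)

        # Any ForwardModel/Prior pair plugs into the same inversion driver.
        data = np.tanh(np.linspace(-1.0, 1.0, 50)) + 0.01
        print(misfit(np.zeros(50), data, EMForward(), SmoothnessPrior()))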

  2. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  3. Protein Modelling: What Happened to the “Protein Structure Gap”?

    PubMed Central

    Schwede, Torsten

    2013-01-01

    Computational modeling and prediction of three-dimensional macromolecular structures and complexes from their sequence has been a long standing vision in structural biology as it holds the promise to bypass part of the laborious process of experimental structure solution. Over the last two decades, a paradigm shift has occurred: starting from a situation where the “structure knowledge gap” between the huge number of protein sequences and small number of known structures has hampered the widespread use of structure-based approaches in life science research, today some form of structural information – either experimental or computational – is available for the majority of amino acids encoded by common model organism genomes. Template based homology modeling techniques have matured to a point where they are now routinely used to complement experimental techniques. With the scientific focus of interest moving towards larger macromolecular complexes and dynamic networks of interactions, the integration of computational modeling methods with low-resolution experimental techniques allows studying large and complex molecular machines. Computational modeling and prediction techniques are still facing a number of challenges which hamper the more widespread use by the non-expert scientist. For example, it is often difficult to convey the underlying assumptions of a computational technique, as well as the expected accuracy and structural variability of a specific model. However, these aspects are crucial to understand the limitations of a model, and to decide which interpretations and conclusions can be supported. PMID:24010712

  4. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  5. Recent achievements in real-time computational seismology in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable, but it requires tight integration between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and a moment tensor inversion (CMT) technique. A real-time online earthquake simulation service (ROS) is also open to researchers and to the public for earthquake science education. Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes of an earthquake occurring (RMT obtains point-source information in < 120 sec; ROS completes a 3D simulation in < 3 minutes). All of these computational results are now posted on the internet in real time. For more information, please visit the real-time computational seismology earthquake report webpage (RCS).

  6. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    NASA Astrophysics Data System (ADS)

    Christiansen, Henning

    2004-09-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural science or humanities. It has been developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog used as meta-language for describing language issues is the central instrument in the approach: Formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach having the students to develop such descriptions themselves from an informal introduction.

  7. The Teaching and Learning Environment SAIDA: Some Features and Lessons.

    ERIC Educational Resources Information Center

    Grandbastien, Monique; Morinet-Lambert, Josette

    Written in ADA language, SAIDA, a Help System for Data Implementation, is an experimental teaching and learning environment which uses artificial intelligence techniques to teach a computer science course on abstract data representations. The application domain is teaching advanced programming concepts which have not received much attention from…

  8. Dynamic Learning Style Prediction Method Based on a Pattern Recognition Technique

    ERIC Educational Resources Information Center

    Yang, Juan; Huang, Zhi Xing; Gao, Yue Xiang; Liu, Hong Tao

    2014-01-01

    During the past decade, personalized e-learning systems and adaptive educational hypermedia systems have attracted much attention from researchers in the fields of computer science and education. The integration of learning styles into an intelligent system is a possible solution to the problems of "learning deviation" and…

  9. The Application of Peer Teaching in Digital Forensics Education

    ERIC Educational Resources Information Center

    Govan, Michelle

    2016-01-01

    The field of digital forensics requires a multidisciplinary understanding of a range of diverse subjects, but is interdisciplinary (in using principles, techniques and theories from other disciplines) encompassing both computer and forensic science. This requires that practitioners have a deep technical knowledge and understanding, but that they…

  10. Researchers' Expectations Regarding the Online Presence of Academic Libraries

    ERIC Educational Resources Information Center

    Mierzecka, Anna; Kisilowska, Malgorzata; Suminas, Andrius

    2017-01-01

    The article reports the results of a survey conducted among the Polish and Lithuanian academics concerning their information needs and expectations regarding academic library websites. The survey was realized using the technique of Computer-Assisted Web Interviewing (CAWI) on a group of scholars representing sciences and humanities or social…

  11. Teaching Human-Centered Security Using Nontraditional Techniques

    ERIC Educational Resources Information Center

    Renaud, Karen; Cutts, Quintin

    2013-01-01

    Computing science students amass years of programming experience and a wealth of factual knowledge in their undergraduate courses. Based on our combined years of experience, however, one of our students' abiding shortcomings is that they think there is only "one correct answer" to issues in most courses: an "idealistic"…

  12. A Low-Tech, Hands-On Approach To Teaching Sorting Algorithms to Working Students.

    ERIC Educational Resources Information Center

    Dios, R.; Geller, J.

    1998-01-01

    Focuses on identifying the educational effects of "activity oriented" instructional techniques. Examines which instructional methods produce enhanced learning and comprehension. Discusses the problem of learning "sorting algorithms," a major topic in every Computer Science curriculum. Presents a low-tech, hands-on teaching method for sorting…

  13. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  14. Teaching Public Policy: Theory, Research, and Practice. Contributions in Political Science, Number 268.

    ERIC Educational Resources Information Center

    Bergerson, Peter J., Ed.

    The 16 chapters of this book offer innovative instructional techniques used to train public managers. It presents public management concepts along with such subtopics as organizational theory and ethics, research skills, program evaluation, financial management, computers and communication skills in public administration, comparative public…

  15. High speed civil transport: Sonic boom softening and aerodynamic optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Samson

    1994-01-01

    An improvement in sonic boom extrapolation techniques has been the desire of aerospace designers for years. This is because the linear acoustic theory developed in the 60's is incapable of predicting the nonlinear phenomenon of shock wave propagation. On the other hand, CFD techniques are too computationally expensive to employ on sonic boom problems. Therefore, this research focused on the development of a fast and accurate sonic boom extrapolation method that solves the Euler equations for axisymmetric flow. This new technique has brought the sonic boom extrapolation techniques up to the standards of the 90's. Parallel computing is a fast growing subject in the field of computer science because of its promising speed. A new optimizer (IIOWA) for the parallel computing environment has been developed and tested for aerodynamic drag minimization. This is a promising method for CFD optimization making use of the computational resources of workstations, which unlike supercomputers can spend most of their time idle. Finally, the OAW concept is attractive because of its overall theoretical performance. In order to fully understand the concept, a wind-tunnel model was built and is currently being tested at NASA Ames Research Center. The CFD calculations performed under this cooperative agreement helped to identify the problem of the flow separation, and also aided the design by optimizing the wing deflection for roll trim.

  16. Finite Dimensional Approximations for Continuum Multiscale Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlyand, Leonid

    2017-01-24

    The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI’s research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques which are able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatments of such complex materials by the development of a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.

  17. Federal Technology Catalog 1982: Summaries of practical technology

    NASA Astrophysics Data System (ADS)

    The catalog presents summaries of practical technology selected for commercial potential and/or promising applications to the fields of computer technology, electrotechnology, energy, engineering, life sciences, machinery and tools, manufacturing, materials, physical sciences, and testing and instrumentation. Each summary not only describes a technology, but gives a source for further information. This publication describes some 1,100 new processes, inventions, equipment, software, and techniques developed by and for dozens of Federal agencies during 1982. Included is coverage of NASA Tech Briefs, DOE Energygrams, and Army Manufacturing Notes.

  18. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  19. Research Reports: 1988 NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Freeman, L. Michael (Editor); Chappell, Charles R. (Editor); Cothran, Ernestine K. (Editor); Karr, Gerald R. (Editor)

    1988-01-01

    The basic objectives are to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of the participants' institutions; and to contribute to the research objectives of the NASA centers. Topics addressed include: cryogenics; thunderstorm simulation; computer techniques; computer assisted instruction; system analysis; weather forecasting; rocket engine design; crystal growth; control systems design; turbine pumps for the Space Shuttle Main Engine; electron mobility; heat transfer predictions; rotor dynamics; mathematical models; computational fluid dynamics; and structural analysis.

  20. New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo

    2012-11-01

    Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology to extract the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has produced huge volumes of data, and the associated warehouse management, together with the need to optimize analysis and mining procedures, has led to a change in how modern science is conceived. Classical data exploration, based on users' local data storage and limited computing infrastructures, is no longer efficient in the case of MDS, which are spread worldwide over inhomogeneous data centres and require teraflop processing power. In this context modern experimental and observational science requires a good understanding of computer science, network infrastructures, Data Mining, etc., i.e. of all those techniques which fall into the domain of the so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object oriented programming, distributed computing, and parallel programming need to become an essential part of scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering to a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.

  1. Cloud computing approaches to accelerate drug discovery value chain.

    PubMed

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good-to-have' tool for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  2. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    PubMed

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell biology has advanced rapidly with increasing availability of new data and powerful simulation techniques, a quantitative orientation is still lacking in life sciences education to make efficient use of these new tools to implement the new toxicity testing paradigm. A revamped undergraduate curriculum in the biological sciences including compulsory courses in mathematics and analysis of dynamical systems is required to address this gap. In parallel, dissemination of computational systems biology techniques and other analytical tools among practicing toxicologists and risk assessment professionals will help accelerate implementation of the new toxicity testing vision.
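
    As a hedged illustration of how network-motif behavior can be explored with simulation, the sketch below integrates a toy negative-feedback loop and prints its steady-state response across a range of stressor doses; the equations and parameters are invented for illustration and are not a model from the cited work.

        # Illustrative sketch: dose-response of a negative-feedback motif
        # (stressor S induces X; X induces Y; Y promotes removal of X).
        # Equations and parameters are toy assumptions, not a published pathway model.
        import numpy as np
        from scipy.integrate import odeint

        def motif(state, t, S, k1=1.0, k2=0.5, k3=0.8, k4=0.3):
            X, Y = state
            dX = k1 * S - k2 * X - k3 * X * Y   # production minus basal and feedback removal
            dY = k4 * X - 0.2 * Y               # Y induced by X, degraded linearly
            return [dX, dY]

        t = np.linspace(0.0, 100.0, 2000)
        for dose in [0.1, 0.5, 1.0, 2.0, 5.0]:
            X_ss, Y_ss = odeint(motif, [0.0, 0.0], t, args=(dose,))[-1]
            print(f"dose={dose:4.1f}  steady-state X={X_ss:6.3f}  Y={Y_ss:6.3f}")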

  3. Pacific Research Platform - Creation of a West Coast Big Data Freeway System Applied to the CONNected objECT (CONNECT) Data Mining Framework for Earth Science Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Sellars, S. L.; Nguyen, P.; Tatar, J.; Graham, J.; Kawsenuk, B.; DeFanti, T.; Smarr, L.; Sorooshian, S.; Ralph, M.

    2017-12-01

    A new era in computational earth sciences is within our grasp with the availability of ever-increasing earth observational data, enhanced computational capabilities, and innovative computational approaches that allow for the assimilation, analysis, and modeling of complex earth science phenomena. The Pacific Research Platform (PRP), CENIC and associated technologies such as the Flash I/O Network Appliance (FIONA) provide scientists with a unique capability for advancing towards this new era. This presentation reports on the development of multi-institutional rapid data access capabilities and a data pipeline for applying a novel image characterization and segmentation approach, the CONNected objECT (CONNECT) algorithm, to study Atmospheric River (AR) events impacting the Western United States. ARs are often associated with torrential rains, swollen rivers, flash flooding, and mudslides. CONNECT is computationally intensive, reliant on very large data transfers, storage and data mining techniques. The ability to apply the method to multiple variables and datasets located at different University of California campuses has previously been challenged by inadequate network bandwidth and computational constraints. The presentation will highlight how the inter-campus CONNECT data mining framework improved from our prior download speeds of 10 MB/s to 500 MB/s using the PRP and the FIONAs. We present a worked example using the NASA MERRA data to describe how the PRP and FIONA have provided researchers with the capability for advancing knowledge about ARs. Finally, we will discuss future efforts to expand the scope to additional variables in earth sciences.
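
    A minimal, hedged sketch of the object-identification idea behind a CONNECT-style analysis: threshold a gridded moisture field and label connected regions, keeping only large objects. The synthetic field, the percentile threshold, and the size cutoff are illustrative assumptions and not the published algorithm's configuration.

        # Minimal sketch (not the published CONNECT code): identify candidate
        # atmospheric-river-like objects by thresholding a gridded integrated
        # water vapor field and labeling connected regions.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        ivt = ndimage.gaussian_filter(rng.random((180, 360)), sigma=8)   # stand-in field

        mask = ivt > np.percentile(ivt, 95)          # illustrative intensity threshold
        labels, n = ndimage.label(mask)              # connected-component labeling
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))

        # Keep only large objects (the size filter is a placeholder criterion).
        keep = np.where(sizes > 200)[0] + 1
        print(f"{n} raw objects, {len(keep)} retained after size filtering")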

  4. Examination of the Effects of Dimensionality on Cognitive Processing in Science: A Computational Modeling Experiment Comparing Online Laboratory Simulations and Serious Educational Games

    NASA Astrophysics Data System (ADS)

    Lamb, Richard L.

    2016-02-01

    Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the form of three-dimensional serious educational games, two-dimensional online laboratories, and traditional lecture-based instruction in the context of student content learning in science. In particular, this study examines the impact of dimensionality, or the ability to move along the X-, Y-, and Z-axis in the games. Study subjects ( N = 551) were randomly selected using a stratified sampling technique. Independent strata subsamples were developed based upon the conditions of serious educational games, online laboratories, and lecture. The study also computationally models a potential mechanism of action and compares two- and three-dimensional learning environments. F test results suggest a significant difference for the main effect of condition across the factor of content gain score with large effect. Overall, comparisons using computational models suggest that three-dimensional serious educational games increase the level of success in learning as measured with content examinations through greater recruitment and attributional retraining of cognitive systems. The study supports assertions in the literature that the use of games in higher dimensions (i.e., three-dimensional versus two-dimensional) helps to increase student understanding of science concepts.
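
    The analysis named in this record (an F test for the main effect of condition on content gain scores) can be illustrated with a short sketch using a one-way ANOVA on synthetic data; the group means, spreads, and sample sizes are made up and do not reproduce the study's results.

        # Illustrative sketch: one-way ANOVA on content gain scores across three
        # instructional conditions (synthetic data; not the study's dataset).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        gain_3d_game = rng.normal(12.0, 4.0, 180)     # serious educational game (3D)
        gain_online_lab = rng.normal(10.0, 4.0, 180)  # online laboratory (2D)
        gain_lecture = rng.normal(8.5, 4.0, 180)      # traditional lecture

        F, p = stats.f_oneway(gain_3d_game, gain_online_lab, gain_lecture)
        print(f"F = {F:.2f}, p = {p:.4g}")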

  5. The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.

    PubMed

    Mather, L E; Austin, K L

    1983-01-01

    Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
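

    The article itself concerns SPSS; as a language-neutral illustration of the kind of user-defined pharmacokinetic task that can be driven from a data-management environment, a minimal sketch in Python (hypothetical concentration data, a one-compartment oral-absorption model fitted with SciPy) could be:

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical plasma concentration-time data for one subject.
      t = np.array([0.5, 1, 2, 4, 6, 8, 12])              # hours
      c = np.array([1.8, 2.9, 3.4, 2.7, 2.0, 1.4, 0.7])   # mg/L

      def one_compartment_oral(t, ka, ke, a):
          # C(t) = A * (exp(-ke*t) - exp(-ka*t)): one-compartment model, first-order absorption.
          return a * (np.exp(-ke * t) - np.exp(-ka * t))

      params, _ = curve_fit(one_compartment_oral, t, c, p0=[1.5, 0.2, 5.0])
      ka, ke, a = params
      print(f"ka = {ka:.2f} 1/h, ke = {ke:.2f} 1/h, half-life = {np.log(2) / ke:.1f} h")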

  6. Emerging Nanophotonic Applications Explored with Advanced Scientific Parallel Computing

    NASA Astrophysics Data System (ADS)

    Meng, Xiang

    The domain of nanoscale optical science and technology is a combination of the classical world of electromagnetics and the quantum mechanical regime of atoms and molecules. Recent advancements in fabrication technology allow optical structures to be scaled down to nanoscale size or even to the atomic level, far smaller than the wavelength they are designed for. These nanostructures can have unique, controllable, and tunable optical properties, and their interactions with quantum materials can produce important near-field and far-field optical responses. These optical properties have many important applications, ranging from efficient and tunable light sources, detectors, filters, modulators, and high-speed all-optical switches to next-generation classical and quantum computation and biophotonic medical sensors. This emerging area of nanoscience, known as nanophotonics, is a highly interdisciplinary field requiring expertise in materials science, physics, electrical engineering, and scientific computing, modeling, and simulation. It has also become an important research field for investigating the science and engineering of light-matter interactions that take place on wavelength and subwavelength scales, where the nature of the nanostructured matter controls the interactions. In addition, fast advancements in computing capabilities, such as parallel computing, have become a critical element for investigating advanced nanophotonic devices. This role has taken on even greater urgency with the scale-down of device dimensions, because the design of these devices requires extensive memory and extremely long core hours. Thus, distributed computing platforms associated with parallel computing are required for faster design processes. Scientific parallel computing constructs mathematical models and quantitative analysis techniques and uses computing machines to analyze and solve otherwise intractable scientific challenges. In particular, parallel computing is a form of computation based on the principle that large problems can often be divided into smaller ones, which are then solved concurrently. In this dissertation, we report a series of new nanophotonic developments using advanced parallel computing techniques. The applications include structure optimizations at the nanoscale to control the electromagnetic response of materials and to manipulate nanoscale structures for enhanced field concentration, which enable breakthroughs in imaging and sensing systems (chapters 3 and 4) and improve the spatial-temporal resolution of spectroscopies (chapter 5). We also report investigations of the confinement of optical-matter interactions in the quantum mechanical regime, where size-dependent novel properties enhance a wide range of technologies, from tunable and efficient light sources and detectors to other nanophotonic elements with enhanced functionality (chapters 6 and 7).
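
    The divide-and-solve-concurrently principle described above can be illustrated with a minimal domain-decomposition sketch using mpi4py (a generic example, not code from the dissertation): each rank evaluates its slice of a global grid and the partial results are combined with a reduction.

      # Run with: mpiexec -n 4 python energy.py
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_total = 1_000_000                      # global number of grid cells
      n_local = n_total // size                # each rank owns a contiguous slice
      x = np.arange(rank * n_local, (rank + 1) * n_local, dtype=float)

      # Each rank evaluates its local contribution (here a toy field-energy sum).
      local_energy = np.sum(np.sin(x * 1e-3) ** 2)

      # Combine the partial results from all ranks.
      total_energy = comm.allreduce(local_energy, op=MPI.SUM)
      if rank == 0:
          print("total energy:", total_energy)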

  7. Factors that Influence the Success of Male and Female Computer Programming Students in College

    NASA Astrophysics Data System (ADS)

    Clinkenbeard, Drew A.

    As the demand for a technologically skilled workforce grows, experience and skill in computer science have become increasingly valuable for college students. However, the number of students graduating with computer science degrees is not growing in proportion to this need. Traditionally, several groups are underrepresented in this field, notably women and students of color. This study investigated elements of computer science education that influence academic achievement in beginning computer programming courses. The goal of the study was to identify elements that increase success in computer programming courses. A 38-item questionnaire was developed and administered during the Spring 2016 semester at California State University Fullerton (CSUF). CSUF is an urban public university with about 40,000 students. Data were collected from three beginning programming classes offered at CSUF. In total, 411 questionnaires were collected, resulting in a response rate of 58.63%. Data for the study were grouped into three broad categories of variables. These included academic and background variables; affective variables; and peer, mentor, and role-model variables. A conceptual model was developed to investigate how these variables might predict final course grade. Data were analyzed using statistical techniques such as linear regression, factor analysis, and path analysis. Ultimately, this study found that peer interactions, comfort with computers, computer self-efficacy, self-concept, and perception of achievement were the best predictors of final course grade. In addition, the analyses showed that male students exhibited higher levels of computer self-efficacy and self-concept compared to female students, even when they achieved comparable course grades. Implications and explanations of these findings are explored, and potential policy changes are offered.

  8. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  9. Information sciences experiment system

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Murray, Nicholas D.; Benz, Harry F.; Bowker, David E.; Hendricks, Herbert D.

    1990-01-01

    The rapid expansion of remote sensing capability over the last two decades will take another major leap forward with the advent of the Earth Observing System (Eos). An approach is presented that will permit experiments and demonstrations in onboard information extraction. The approach is a non-intrusive, eavesdropping mode in which a small amount of spacecraft real estate is allocated to an onboard computation resource. How such an approach allows the evaluation of advanced technology in the space environment, advanced techniques in information extraction for both Earth science and information science studies, direct to user data products, and real-time response to events, all without affecting other on-board instrumentation is discussed.

  10. A framework for propagation of uncertainty contributed by parameterization, input data, model structure, and calibration/validation data in watershed modeling

    USDA-ARS?s Scientific Manuscript database

    The progressive improvement of computer science and the development of auto-calibration techniques mean that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...
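
    As a generic illustration of the auto-calibration idea (not the specific framework of this manuscript), a single model parameter can be calibrated against observed streamflow by maximizing the Nash-Sutcliffe efficiency with a standard optimizer; the toy model and all data below are hypothetical.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Hypothetical rainfall forcing and observed streamflow.
      rain  = np.array([0, 5, 12, 3, 0, 8, 20, 6, 1, 0], dtype=float)
      q_obs = np.array([1, 2, 6, 4, 2, 4, 11, 7, 3, 2], dtype=float)

      def simulate(k):
          # Toy linear-reservoir model: outflow is a fraction k of current storage.
          s, q = 0.0, []
          for r in rain:
              s += r
              out = k * s
              s -= out
              q.append(out)
          return np.array(q)

      def neg_nse(k):
          q_sim = simulate(k)
          nse = 1.0 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
          return -nse

      result = minimize_scalar(neg_nse, bounds=(0.01, 0.99), method="bounded")
      print(f"calibrated k = {result.x:.3f}, NSE = {-result.fun:.3f}")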

  11. COMPUTER TECHNIQUES FOR WEEKLY MULTIPLE-CHOICE TESTING.

    ERIC Educational Resources Information Center

    BROYLES, DAVID

    To encourage political science students to read properly and continuously, the author gives frequent short quizzes based on the assigned readings. For ease in administration and scoring, he uses mark-sense cards, on which the student marks designated areas to indicate his number and his choice of answers. To emphasize the value of continued high…

  12. The Effectiveness of Screencasts and Cognitive Tools as Scaffolding for Novice Object-Oriented Programmers

    ERIC Educational Resources Information Center

    Lee, Mark J. W.; Pradhan, Sunam; Dalgarno, Barney

    2008-01-01

    Modern information technology and computer science curricula employ a variety of graphical tools and development environments to facilitate student learning of introductory programming concepts and techniques. While the provision of interactive features and the use of visualization can enhance students' understanding and assist them in grasping…

  13. Measuring the Impact of High Quality Instant Feedback on Learning

    ERIC Educational Resources Information Center

    Nutbrown, Stephen; Higgins, Colin; Beesley, Su

    2016-01-01

    This paper examines the impact of a novel assessment technique that has been used to improve the feedback given to second year Computer Science students at the University of Nottingham. Criteria for effective, high quality feedback are discussed. An automated marking system (The Marker's Apprentice--TMA) produces instant feedback in synergy with…

  14. Bayesian Asymmetric Regression as a Means to Estimate and Evaluate Oral Reading Fluency Slopes

    ERIC Educational Resources Information Center

    Solomon, Benjamin G.; Forsberg, Ole J.

    2017-01-01

    Bayesian techniques have become increasingly present in the social sciences, fueled by advances in computer speed and the development of user-friendly software. In this paper, we forward the use of Bayesian Asymmetric Regression (BAR) to monitor intervention responsiveness when using Curriculum-Based Measurement (CBM) to assess oral reading…

  15. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

    With the wealth of data provided by a wide range of high-throughout measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  16. Techniques: Coach, Consultant, Critic, Counselor: The Multiple Roles of the Responsive Facilitator.

    ERIC Educational Resources Information Center

    Keenan, Thomas P.; Braxton-Brown, Greg

    1991-01-01

    Responsive facilitation is an interactive orientation to formal learning that requires an individual to assume a variety of roles and to be comfortable with diverse methodologies. The major roles assumed are coach, consultant, critic, and counselor. As illustrated by the redesign of an introductory computer science course, these practices can be…

  17. A Hands-On Approach for Teaching Denial of Service Attacks: A Case Study

    ERIC Educational Resources Information Center

    Trabelsi, Zouheir; Ibrahim, Walid

    2013-01-01

    Nowadays, many academic institutions are including ethical hacking in their information security and Computer Science programs. Information security students need to experiment with common ethical hacking techniques in order to be able to implement the appropriate security solutions. This will allow them to more efficiently protect the confidentiality,…

  18. Games and Web 2.0: A Winning Combination for Millennials

    ERIC Educational Resources Information Center

    Spiegelman, Marsha; Glass, Richard

    2009-01-01

    Gaming and social networking define the millennial student. This research focuses on an evolving collaboration between 2 faculty members of different disciplines who merged Web 2.0 and game scenarios to infuse research techniques as integral components of math/computer science courses. Blogs and wikis facilitated student-faculty interaction beyond…

  19. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results

    NASA Astrophysics Data System (ADS)

    Karimabadi, Homa

    2012-03-01

    Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) an overview of a new multi-scale simulation technique in which each computational grid is updated based on its own unique timestep; (b) presentation of a new approach to data analysis that we refer to as Physics Mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets; and (c) presentation of several recent discoveries in studies of space plasmas, including the role of vortex formation and the resulting turbulence in magnetized plasmas.

  20. Deterministic alternatives to the full configuration interaction quantum Monte Carlo method for strongly correlated systems

    NASA Astrophysics Data System (ADS)

    Tubman, Norm; Whaley, Birgitta

    The development of exponentially scaling methods has seen great progress in tackling larger systems than previously thought possible. One such technique, full configuration interaction quantum Monte Carlo, allows exact diagonalization through stochastic sampling of determinants. The method derives its utility from the information in the matrix elements of the Hamiltonian, together with a stochastically projected wave function, which are used to explore the important parts of Hilbert space. However, a stochastic representation of the wave function is not required to search Hilbert space efficiently, and new deterministic approaches have recently been shown to find the important parts of determinant space efficiently. We shall discuss the technique of Adaptive Sampling Configuration Interaction (ASCI) and the related heat-bath Configuration Interaction approach for ground-state and excited-state simulations. We will present several applications for strongly correlated Hamiltonians. This work was supported through the Scientific Discovery through Advanced Computing (SciDAC) program funded by the U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences.
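
    A minimal sketch of the selection idea behind such deterministic selected-CI approaches, on a toy Hamiltonian matrix already built in a determinant basis (the importance criterion below is a simplified first-order estimate for illustration, not the exact published ASCI/heat-bath algorithm):

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200                                   # size of the toy determinant space
      H = rng.normal(scale=0.05, size=(n, n))
      H = (H + H.T) / 2
      H[np.diag_indices(n)] = np.sort(rng.uniform(0, 5, size=n))   # diagonally dominant toy Hamiltonian

      selected = [0]                            # start from the reference determinant
      for _ in range(5):                        # a few selection / rediagonalization iterations
          sub = np.ix_(selected, selected)
          evals, evecs = np.linalg.eigh(H[sub])
          e0, c = evals[0], evecs[:, 0]
          # Rank outside determinants by a first-order importance estimate |sum_i H_ai c_i| / (H_aa - E0).
          outside = [a for a in range(n) if a not in selected]
          scores = [abs(H[a, selected] @ c) / max(H[a, a] - e0, 1e-8) for a in outside]
          best = [outside[i] for i in np.argsort(scores)[-20:]]     # add the 20 most important
          selected = sorted(set(selected) | set(best))

      # Final diagonalization in the selected subspace.
      e0 = np.linalg.eigh(H[np.ix_(selected, selected)])[0][0]
      print(f"selected {len(selected)} determinants, variational energy {e0:.4f}")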

  1. Conceptualization and application of an approach for designing healthcare software interfaces.

    PubMed

    Kumar, Ajit; Maskara, Reena; Maskara, Sanjeev; Chiang, I-Jen

    2014-06-01

    The aim of this study is to conceptualize a novel approach that facilitates the design of prototype interfaces for healthcare software. Concepts and techniques from various disciplines were used to conceptualize an interface design approach named MORTARS (Map Original Rhetorical To Adapted Rhetorical Situation). The concepts and techniques included in this approach are (1) rhetorical situation, a concept from philosophy provided by Bitzer (1968); (2) move analysis, an applied linguistics technique provided by Swales (1990) and Bhatia (1993); (3) interface design guidelines, a cognitive and computer science concept provided by Johnson (2010); (4) a usability evaluation instrument, an interface evaluation questionnaire provided by Lund (2001); and (5) user modeling via stereotyping, a cognitive and computer science concept provided by Rich (1979). A prototype interface for outpatient clinic software was designed to introduce the underlying concepts of MORTARS. The prototype interface was evaluated by thirty-two medical informaticians, who found it to be useful (73.3%), easy to use (71.9%), easy to learn (93.1%), and satisfactory (53.2%). The MORTARS approach was found to be effective in designing the prototype user interface for the outpatient clinic software. This approach might be further used to design interfaces for various software pertaining to healthcare and other domains. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Identification and addressing reduction-related misconceptions

    NASA Astrophysics Data System (ADS)

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-07-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is detection of such misconceptions, analysis of their roots, and proposing a way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger is a set/problem, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.
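
    As a concrete counterexample to the "bigger means harder" intuition, the standard polynomial-time mapping reduction between Independent Set and Vertex Cover (a textbook example, not taken from the paper) maps an instance to one of exactly the same size:

      % Mapping reduction INDEPENDENT-SET <=_p VERTEX-COVER
      f(\langle G, k \rangle) = \langle G,\ |V(G)| - k \rangle,
      \qquad
      G \text{ has an independent set of size } k
      \iff
      G \text{ has a vertex cover of size } |V(G)| - k,
      % since a set S is independent in G exactly when V(G) \setminus S covers every edge.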

  3. Biological X-ray absorption spectroscopy (BioXAS): a valuable tool for the study of trace elements in the life sciences.

    PubMed

    Strange, Richard W; Feiters, Martin C

    2008-10-01

    Using X-ray absorption spectroscopy (XAS) the binding modes (type and number of ligands, distances and geometry) and oxidation states of metals and other trace elements in crystalline as well as non-crystalline samples can be revealed. The method may be applied to biological systems as a 'stand-alone' technique, but it is particularly powerful when used alongside other X-ray and spectroscopic techniques and computational approaches. In this review, we highlight how biological XAS is being used in concert with crystallography, spectroscopy and computational chemistry to study metalloproteins in crystals, and report recent applications on relatively rare trace elements utilised by living organisms and metals involved in neurodegenerative diseases.

  4. A decision-theoretic approach to the display of information for time-critical decisions: The Vista project

    NASA Technical Reports Server (NTRS)

    Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew

    1993-01-01

    We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.

  5. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  6. Neural network based visualization of collaborations in a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Santos, Rafael D. C.; Raddick, M. Jordan

    2014-05-01

    Citizen science projects are those in which volunteers are asked to collaborate in scientific projects, usually by volunteering idle computer time for distributed data processing efforts or by actively labeling or classifying information; shapes of galaxies, whale sounds, and historical records are all examples of citizen science projects in which users access a data collection system to label or classify images and sounds. In order to be successful, a citizen science project must captivate users and keep them interested in the project and in the science behind it, thereby increasing the time users spend collaborating with the project. Understanding the behavior of citizen scientists and their interaction with the data collection systems may help increase the involvement of the users, categorize them according to different parameters, facilitate their collaboration with the systems, inform the design of better user interfaces, and allow better planning and deployment of similar projects and systems. User behavior can be actively monitored or derived from interaction with the data collection systems, and records of the interactions can be analyzed using visualization techniques to identify patterns and outliers. In this paper we present some results on the visualization of more than 80 million interactions of almost 150 thousand users with the Galaxy Zoo I citizen science project. Visualization of the attributes extracted from their behaviors was done with a clustering neural network (the Self-Organizing Map) and a selection of icon- and pixel-based techniques. These techniques allow the visual identification of groups of similar behavior in several different ways.
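
    A minimal Self-Organizing Map sketch, trained on synthetic user-behavior feature vectors (a generic NumPy illustration, not the authors' implementation):

      import numpy as np

      rng = np.random.default_rng(0)
      features = rng.normal(size=(1000, 4))        # synthetic per-user behavior attributes
      grid_h, grid_w, dim = 10, 10, features.shape[1]
      weights = rng.normal(size=(grid_h, grid_w, dim))

      # Grid coordinates, used to compute neighborhood distances on the map.
      gy, gx = np.mgrid[0:grid_h, 0:grid_w]

      n_steps = 5000
      for step in range(n_steps):
          x = features[rng.integers(len(features))]
          # Best-matching unit (BMU): the node whose weight vector is closest to the sample.
          dist = np.linalg.norm(weights - x, axis=2)
          by, bx = np.unravel_index(np.argmin(dist), dist.shape)
          # Learning rate and neighborhood radius decay over time.
          lr = 0.5 * (1 - step / n_steps)
          sigma = max(1.0, 5.0 * (1 - step / n_steps))
          # Gaussian neighborhood around the BMU on the map grid.
          g = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
          weights += lr * g[:, :, None] * (x - weights)

      # Each user can now be assigned to its BMU to visualize clusters of similar behavior.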

  7. Acoustic impulse response method as a source of undergraduate research projects and advanced laboratory experiments.

    PubMed

    Robertson, W M; Parker, J M

    2012-03-01

    A straightforward and inexpensive implementation of acoustic impulse response measurement is described, utilizing the signal processing technique of coherent averaging. The technique is capable of high signal-to-noise measurements with personal computer data acquisition equipment, an amplifier/speaker, and a high quality microphone. When coupled with simple waveguide test systems fabricated from commercial PVC plumbing pipe, impulse response measurement has proven to be ideal for undergraduate research projects (often of publishable quality) or for advanced laboratory experiments. The technique provides important learning objectives for science or engineering students in areas such as interfacing and computer control of experiments; analog-to-digital conversion and sampling; time and frequency analysis using Fourier transforms; signal processing; and insight into a variety of current research areas such as acoustic bandgap materials, acoustic metamaterials, and fast and slow wave manipulation. © 2012 Acoustical Society of America
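
    The signal-to-noise gain of coherent averaging (growing roughly with the square root of the number of repeated, trigger-aligned records) can be illustrated with a short synthetic example; the waveform and noise level below are arbitrary:

      import numpy as np

      rng = np.random.default_rng(0)
      fs = 48_000                                  # sample rate, Hz
      t = np.arange(2048) / fs
      impulse_response = np.exp(-t * 400) * np.sin(2 * np.pi * 1200 * t)   # synthetic pipe response

      def one_record(noise_rms=0.5):
          # One trigger-aligned measurement: the true response buried in uncorrelated noise.
          return impulse_response + rng.normal(scale=noise_rms, size=t.size)

      for n_avg in (1, 16, 256):
          averaged = np.mean([one_record() for _ in range(n_avg)], axis=0)
          noise = averaged - impulse_response
          snr_db = 10 * np.log10(np.sum(impulse_response ** 2) / np.sum(noise ** 2))
          print(f"{n_avg:4d} averages -> SNR ~ {snr_db:5.1f} dB")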

  8. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  9. What is bioinformatics? A proposed definition and overview of the field.

    PubMed

    Luscombe, N M; Greenbaum, D; Gerstein, M

    2001-01-01

    The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Here we propose a definition for this new field and review some of the research that is being pursued, particularly in relation to transcriptional regulatory systems. Our definition is as follows: Bioinformatics is conceptualizing biology in terms of macromolecules (in the sense of physical chemistry) and then applying "informatics" techniques (derived from disciplines such as applied maths, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Analyses in bioinformatics predominantly focus on three types of large datasets available in molecular biology: macromolecular structures, genome sequences, and the results of functional genomics experiments (e.g. expression data). Additional information includes the text of scientific papers and "relationship data" from metabolic pathways, taxonomy trees, and protein-protein interaction networks. Bioinformatics employs a wide range of computational techniques including sequence and structural alignment, database design and data mining, macromolecular geometry, phylogenetic tree construction, prediction of protein structure and function, gene finding, and expression data clustering. The emphasis is on approaches integrating a variety of computational methods and heterogeneous data sources. Finally, bioinformatics is a practical discipline. We survey some representative applications, such as finding homologues, designing drugs, and performing large-scale censuses. Additional information pertinent to the review is available over the web at http://bioinfo.mbb.yale.edu/what-is-it.

  10. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, for a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability to use computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  11. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.
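

    As a minimal illustration of the computationally efficient assimilation tools mentioned above, a scalar Kalman filter forecast/update cycle (generic textbook form, with made-up numbers rather than radiation-belt quantities) looks like:

      import numpy as np

      rng = np.random.default_rng(0)
      # Scalar linear system: x_k = a*x_{k-1} + w,  y_k = x_k + v  (toy stand-in for a physical state).
      a, q, r = 0.95, 0.05, 0.4          # dynamics, process noise variance, observation noise variance
      x_true, x_est, p_est = 1.0, 0.0, 1.0

      for k in range(50):
          # Truth and noisy observation.
          x_true = a * x_true + rng.normal(scale=np.sqrt(q))
          y = x_true + rng.normal(scale=np.sqrt(r))

          # Forecast step.
          x_pred = a * x_est
          p_pred = a * p_est * a + q

          # Analysis (update) step: blend forecast and observation via the Kalman gain.
          gain = p_pred / (p_pred + r)
          x_est = x_pred + gain * (y - x_pred)
          p_est = (1 - gain) * p_pred

      print(f"final estimate {x_est:.3f} vs truth {x_true:.3f}")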

  12. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    NASA Astrophysics Data System (ADS)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
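
    The notion of non-domination underlying the proposed 'speciated non-dominated sorting genetic algorithm' can be sketched as follows (a generic Pareto-front extraction over two fitness objectives, e.g. fit to two datasets, with synthetic values; this is not the authors' algorithm itself):

      import numpy as np

      rng = np.random.default_rng(0)
      # Each row: error of one candidate parameter set on two experiments (lower is better).
      errors = rng.uniform(0, 1, size=(60, 2))

      def dominates(a, b):
          # a dominates b if it is no worse on every objective and strictly better on at least one.
          return np.all(a <= b) and np.any(a < b)

      front = [i for i in range(len(errors))
               if not any(dominates(errors[j], errors[i]) for j in range(len(errors)) if j != i)]

      print("non-dominated parameter sets:", front)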

  13. Computational biology and bioinformatics in Nigeria.

    PubMed

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  14. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  15. Computing the Ediz eccentric connectivity index of discrete dynamic structures

    NASA Astrophysics Data System (ADS)

    Wu, Hualong; Kamran Siddiqui, Muhammad; Zhao, Bo; Gan, Jianhou; Gao, Wei

    2017-06-01

    Earlier studies in the physical and chemical sciences have found that the physico-chemical characteristics of chemical compounds are intrinsically connected with their molecular structures. As a theoretical basis, this provides a new way of thinking: analyzing the molecular structure of compounds in order to understand their physical and chemical properties. In this article, we study the physico-chemical properties of certain molecular structures by computing the Ediz eccentric connectivity index from a mathematical standpoint. The results rely mainly on techniques of distance and degree computation and on mathematical derivation, and the conclusions have guiding significance for physical engineering.
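
    For reference, the Ediz eccentric connectivity index of a graph G is usually defined as below (stated here from the general literature rather than from this particular article), where S(v) is the sum of the degrees of the vertices adjacent to v and \varepsilon(v) is the eccentricity of v:

      % Ediz eccentric connectivity index of a graph G = (V, E)
      {}^{E}\xi^{c}(G) \;=\; \sum_{v \in V(G)} \frac{S(v)}{\varepsilon(v)},
      \qquad S(v) = \sum_{u \in N(v)} \deg(u),
      \qquad \varepsilon(v) = \max_{u \in V(G)} d(u, v).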

  16. An Elementary Introduction to Recently Developed Computational Methods for Solving Singularly Perturbed Partial Differential Equations Arising in Science and Engineering

    NASA Astrophysics Data System (ADS)

    Kumar, Manoj; Srivastava, Akanksha

    2013-01-01

    This paper presents a survey of innovative approaches among the most effective computational techniques for solving singularly perturbed partial differential equations, which are useful because of their numerical and computational realizations. Many applied problems appearing in semiconductor theory, biochemistry, kinetics, the theory of electrical chains, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other areas can be modelled as singularly perturbed systems. Here, we summarize a wide range of research articles published by numerous researchers during the last ten years to give a better view of the present state of this area of research.
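
    A standard model problem of the kind surveyed is the one-dimensional convection-diffusion equation below (a textbook example, not drawn from the paper itself); for small \varepsilon its solution develops a boundary layer of width O(\varepsilon) near x = 1, which is what defeats uniform-mesh methods and motivates fitted meshes and fitted-operator schemes:

      % Singularly perturbed convection-diffusion model problem
      -\varepsilon\, u''(x) + a(x)\, u'(x) + b(x)\, u(x) = f(x), \quad x \in (0,1),
      \qquad u(0) = u(1) = 0,
      \qquad 0 < \varepsilon \ll 1, \quad a(x) \ge \alpha > 0.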

  17. Algorithmic psychometrics and the scalable subject.

    PubMed

    Stark, Luke

    2018-04-01

    Recent public controversies, ranging from the 2014 Facebook 'emotional contagion' study to psychographic data profiling by Cambridge Analytica in the 2016 American presidential election, Brexit referendum and elsewhere, signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern. The entangled history of these two fields grounds the application of applied psychological techniques to digital technologies, and an investment in applying calculability to human subjectivity. Today, a quantifiable psychological subject position has been translated, via 'big data' sets and algorithmic analysis, into a model subject amenable to classification through digital media platforms. I term this position the 'scalable subject', arguing it has been shaped and made legible by algorithmic psychometrics - a broad set of affordances in digital platforms shaped by psychology and the behavioral sciences. In describing the contours of this 'scalable subject', this paper highlights the urgent need for renewed attention from STS scholars on the psy sciences, and on a computational politics attentive to psychology, emotional expression, and sociality via digital media.

  18. International Symposium on Grids and Clouds (ISGC) 2014

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2014 will be held at Academia Sinica in Taipei, Taiwan from 23-28 March 2014, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). “Bringing the data scientist to global e-Infrastructures” is the theme of ISGC 2014. The last decade has seen phenomenal growth in the production of data in all forms by all research communities, producing a deluge of data from which information and knowledge need to be extracted. Key to this success will be the data scientist, educated to use advanced algorithms, applications and infrastructures, collaborating internationally to tackle society’s challenges. ISGC 2014 will bring together researchers working in all aspects of data science from different disciplines around the world to collaborate and educate themselves in the latest achievements and techniques being used to tackle the data deluge. In addition to the regular workshops, technical presentations and plenary keynotes, ISGC this year will focus on how to grow the data science community by considering the educational foundation needed for tomorrow’s data scientist. Topics of discussion include Physics (including HEP) and Engineering Applications, Biomedicine & Life Sciences Applications, Earth & Environmental Sciences & Biodiversity Applications, Humanities & Social Sciences Applications, Virtual Research Environments (including middleware, tools, services, workflows, etc.), Data Management, Big Data, Infrastructure & Operations Management, Infrastructure Clouds and Virtualisation, Interoperability, Business Models & Sustainability, Highly Distributed Computing Systems, and High Performance & Technical Computing (HPTC).

  19. Porting Ordinary Applications to Blue Gene/Q Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maheshwari, Ketan C.; Wozniak, Justin M.; Armstrong, Timothy

    2015-08-31

    Efficiently porting ordinary applications to Blue Gene/Q supercomputers is a significant challenge. Codes are often originally developed without considering advanced architectures and related tool chains. Science needs frequently lead users to want to run large numbers of relatively small jobs (often called many-task computing, an ensemble, or a workflow), which can conflict with supercomputer configurations. In this paper, we discuss techniques developed to execute ordinary applications over leadership class supercomputers. We use the high-performance Swift parallel scripting framework and build two workflow execution techniques: sub-jobs and main-wrap. The sub-jobs technique, built on top of the IBM Blue Gene/Q resource manager Cobalt's sub-block jobs, lets users submit multiple, independent, repeated smaller jobs within a single larger resource block. The main-wrap technique is a scheme that enables C/C++ programs to be defined as functions that are wrapped by a high-performance Swift wrapper and that are invoked as a Swift script. We discuss the needs, benefits, technicalities, and current limitations of these techniques. We further discuss the real-world science enabled by these techniques and the results obtained.

  20. Critical Thinking of Young Citizens towards News Headlines in Chile

    ERIC Educational Resources Information Center

    Vernier, Matthieu; Cárcamo, Luis; Scheihing, Eliana

    2018-01-01

    Strengthening critical thinking abilities of citizens in the face of news published on the web represents a key challenge for education. Young citizens appear to be vulnerable in the face of poor quality news or those containing non-explicit ideologies. In the field of data science, computational and statistical techniques have been developed to…

  1. Development of a Computer System To Educate Students To Evaluate and Interpret Published Drug Studies.

    ERIC Educational Resources Information Center

    Abate, Marie A.

    The education of students in the techniques of critical appraisal of drug studies has been identified as a deficiency in many health sciences curricula. Errors in research design and inconsistencies in the reporting of study results persist in professional pharmacy and medical journals. Thus, thorough and accurate review and interpretation of…

  2. Gamification for Engaging Computer Science Students in Learning Activities: A Case Study

    ERIC Educational Resources Information Center

    Ibáñez, Maria-Blanca; Di-Serio, Ángela; Delgado-Kloos, Carlos

    2014-01-01

    Gamification is the use of game design elements in non-game settings to engage participants and encourage desired behaviors. It has been identified as a promising technique to improve students' engagement which could have a positive impact on learning. This study evaluated the learning effectiveness and engagement appeal of a gamified learning…

  3. Maximal aggregation of polynomial dynamical systems

    PubMed Central

    Cardelli, Luca; Tschaikowski, Max

    2017-01-01

    Ordinary differential equations (ODEs) with polynomial derivatives are a fundamental tool for understanding the dynamics of systems across many branches of science, but our ability to gain mechanistic insight and effectively conduct numerical evaluations is critically hindered when dealing with large models. Here we propose an aggregation technique that rests on two notions of equivalence relating ODE variables whenever they have the same solution (backward criterion) or if a self-consistent system can be written for describing the evolution of sums of variables in the same equivalence class (forward criterion). A key feature of our proposal is to encode a polynomial ODE system into a finitary structure akin to a formal chemical reaction network. This enables the development of a discrete algorithm to efficiently compute the largest equivalence, building on approaches rooted in computer science to minimize basic models of computation through iterative partition refinements. The physical interpretability of the aggregation is shown on polynomial ODE systems for biochemical reaction networks, gene regulatory networks, and evolutionary game theory. PMID:28878023
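
    A tiny worked instance of the forward criterion (an illustrative example, not one taken from the paper): in the system below, the sum y = x_1 + x_2 obeys a self-consistent equation, so the two variables can be aggregated without losing the dynamics of their sum.

      % Original polynomial ODE system
      \dot{x}_1 = -k\,x_1 + x_3^2, \qquad
      \dot{x}_2 = -k\,x_2 + x_3^2, \qquad
      \dot{x}_3 = -x_3.

      % Forward aggregation: with y = x_1 + x_2,
      \dot{y} = \dot{x}_1 + \dot{x}_2 = -k\,(x_1 + x_2) + 2\,x_3^2 = -k\,y + 2\,x_3^2,
      % so {x_1, x_2} form one equivalence class and the reduced system is (y, x_3).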

  4. Latent semantic analysis.

    PubMed

    Evangelopoulos, Nicholas E

    2013-11-01

    This article reviews latent semantic analysis (LSA), a theory of meaning as well as a method for extracting that meaning from passages of text, based on statistical computations over a collection of documents. LSA as a theory of meaning defines a latent semantic space where documents and individual words are represented as vectors. LSA as a computational technique uses linear algebra to extract dimensions that represent that space. This representation enables the computation of similarity among terms and documents, categorization of terms and documents, and summarization of large collections of documents using automated procedures that mimic the way humans perform similar cognitive tasks. We present some technical details, various illustrative examples, and discuss a number of applications from linguistics, psychology, cognitive science, education, information science, and analysis of textual data in general. WIREs Cogn Sci 2013, 4:683-692. doi: 10.1002/wcs.1254 CONFLICT OF INTEREST: The author has declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. © 2013 John Wiley & Sons, Ltd.
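
    The linear-algebra step that LSA relies on is a truncated singular value decomposition of a term-document matrix; a minimal sketch on a toy matrix (illustrative only) is:

      import numpy as np

      # Toy term-document count matrix: rows = terms, columns = documents.
      A = np.array([
          [2, 0, 1, 0],   # "galaxy"
          [1, 0, 2, 0],   # "telescope"
          [0, 3, 0, 1],   # "protein"
          [0, 1, 0, 2],   # "enzyme"
      ], dtype=float)

      # Truncated SVD: keep k latent dimensions of the semantic space.
      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      k = 2
      doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T      # documents represented in the latent space

      # Cosine similarity between documents in the reduced space.
      norm = np.linalg.norm(doc_vectors, axis=1, keepdims=True)
      sims = (doc_vectors / norm) @ (doc_vectors / norm).T
      print(np.round(sims, 2))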

  5. Final Report. Institute for Ultralscale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilitiesmore » in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.« less

  6. A computational approach to estimate postmortem interval using opacity development of eye for human subjects.

    PubMed

    Cantürk, İsmail; Özyılmaz, Lale

    2018-07-01

    This paper presents an approach to postmortem interval (PMI) estimation, which is a very debated and complicated area of forensic science. Most of the reported methods to determine PMI in the literature are not practical because of the need for skilled persons and significant amounts of time, and give unsatisfactory results. Additionally, the error margin of PMI estimation increases proportionally with elapsed time after death. It is crucial to develop practical PMI estimation methods for forensic science. In this study, a computational system is developed to determine the PMI of human subjects by investigating postmortem opacity development of the eye. Relevant features from the eye images were extracted using image processing techniques to reflect gradual opacity development. The features were then investigated to predict the time after death using machine learning methods. The experimental results prove that the development of opacity can be utilized as a practical computational tool to determine PMI for human subjects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Detangling complex relationships in forensic data: principles and use of causal networks and their application to clinical forensic science.

    PubMed

    Lefèvre, Thomas; Lepresle, Aude; Chariot, Patrick

    2015-09-01

    The search for complex, nonlinear relationships and causality in data is hindered by the limited availability of suitable techniques in many domains, including forensic science. Linear multivariable techniques are useful but present some shortcomings. In the past decade, Bayesian approaches have been introduced into forensic science. To date, authors have mainly focused on providing an alternative to classical techniques for quantifying effects and dealing with uncertainty. Causal networks, including Bayesian networks, can help detangle complex relationships in data. A Bayesian network estimates the joint probability distribution of the data and graphically displays the dependencies between variables and the circulation of information between these variables. In this study, we illustrate the value of Bayesian networks for dealing with complex data through an application in clinical forensic science. Evaluating the functional impairment of assault survivors is a complex task for which few determinants are known. As routinely estimated in France, the duration of this impairment can be quantified in days of 'Total Incapacity to Work' ('Incapacité totale de travail', ITT). In this study, we used a Bayesian network approach to identify the injury type, victim category, and time to evaluation as the main determinants of the 'Total Incapacity to Work' (TIW). We computed the conditional probabilities associated with the TIW node and its parents. We compared this approach with a multivariable analysis, and the results of the two techniques were convergent. Thus, Bayesian networks should be considered a reliable means of detangling complex relationships in data.
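
    A minimal sketch of the kind of conditional-probability query such a network supports, using a toy discrete network with hypothetical tables (not the distributions estimated in the study): here the TIW node has injury type as its single parent, and Bayes' rule is applied by enumeration over the joint distribution.

      # Toy Bayesian network: Injury -> TIW (days category). All probabilities are hypothetical.
      p_injury = {"blunt": 0.7, "sharp": 0.3}                      # P(Injury)
      p_tiw_given_injury = {                                       # P(TIW | Injury)
          "blunt": {"0-3 days": 0.6, "4-8 days": 0.3, ">8 days": 0.1},
          "sharp": {"0-3 days": 0.3, "4-8 days": 0.4, ">8 days": 0.3},
      }

      # Joint distribution P(Injury, TIW) obtained by multiplying along the network structure.
      joint = {(i, t): p_injury[i] * p_tiw_given_injury[i][t]
               for i in p_injury for t in p_tiw_given_injury[i]}

      # Query: posterior over injury type given an observed TIW category.
      evidence = ">8 days"
      norm = sum(p for (i, t), p in joint.items() if t == evidence)
      posterior = {i: joint[(i, evidence)] / norm for i in p_injury}
      print(posterior)   # e.g. {'blunt': ..., 'sharp': ...}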

  8. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory and; (10) performance analyzer for the pisces system.

  9. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  10. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  11. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were much less involved, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.

  12. eHealth research from the user's perspective.

    PubMed

    Hesse, Bradford W; Shneiderman, Ben

    2007-05-01

    The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.

  13. Ab initio density-functional calculations in materials science: from quasicrystals over microporous catalysts to spintronics.

    PubMed

    Hafner, Jürgen

    2010-09-29

    During the last 20 years computer simulations based on a quantum-mechanical description of the interactions between electrons and atomic nuclei have developed an increasingly important impact on materials science, not only in promoting a deeper understanding of the fundamental physical phenomena, but also enabling the computer-assisted design of materials for future technologies. The backbone of atomic-scale computational materials science is density-functional theory (DFT) which allows us to cast the intractable complexity of electron-electron interactions into the form of an effective single-particle equation determined by the exchange-correlation functional. Progress in DFT-based calculations of the properties of materials and of simulations of processes in materials depends on: (1) the development of improved exchange-correlation functionals and advanced post-DFT methods and their implementation in highly efficient computer codes, (2) the development of methods allowing us to bridge the gaps in the temperature, pressure, time and length scales between the ab initio calculations and real-world experiments and (3) the extension of the functionality of these codes, permitting us to treat additional properties and new processes. In this paper we discuss the current status of techniques for performing quantum-based simulations on materials and present some illustrative examples of applications to complex quasiperiodic alloys, cluster-support interactions in microporous acid catalysts and magnetic nanostructures.

  14. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
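
    As an illustration of the space-filling-curve idea mentioned above, the following minimal sketch (not the authors' code) assigns Morton (Z-order) keys to particle coordinates so that spatially nearby atoms end up contiguous in memory and on disk; the box size, bit depth, and random coordinates are assumptions made for the example.

    ```python
    import numpy as np

    def morton_key(ix, iy, iz, bits=10):
        """Interleave the bits of three integer grid coordinates into one Z-order key."""
        key = 0
        for b in range(bits):
            key |= ((ix >> b) & 1) << (3 * b)
            key |= ((iy >> b) & 1) << (3 * b + 1)
            key |= ((iz >> b) & 1) << (3 * b + 2)
        return key

    def order_particles(positions, box, bits=10):
        """Sort particles along a space-filling curve; neighbouring particles in space
        become neighbours in the array, which helps compression and parallel I/O."""
        cells = np.floor(positions / box * (2 ** bits)).astype(int)
        cells = np.clip(cells, 0, 2 ** bits - 1)
        keys = np.array([morton_key(x, y, z, bits) for x, y, z in cells])
        return np.argsort(keys)

    positions = np.random.rand(1000, 3) * 50.0   # toy coordinates in a 50 x 50 x 50 box
    idx = order_particles(positions, box=50.0)
    reordered = positions[idx]                   # curve-ordered particle array
    ```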

  15. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
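
    In its simplest mode described above, a parameter variation study amounts to sweeping model inputs over a grid and collecting responses. The sketch below shows that idea in plain Python with an assumed stand-in model; it does not use Dakota's actual interface or input syntax.

    ```python
    import itertools
    import numpy as np

    def simulator(x1, x2):
        """Stand-in for an external computational model (assumed for illustration)."""
        return np.sin(x1) * np.exp(-x2)

    # Full-factorial parameter study: vary each input over a grid and record outputs.
    x1_values = np.linspace(0.0, np.pi, 5)
    x2_values = np.linspace(0.0, 2.0, 4)

    results = []
    for x1, x2 in itertools.product(x1_values, x2_values):
        results.append({"x1": x1, "x2": x2, "response": simulator(x1, x2)})

    best = max(results, key=lambda r: r["response"])
    print(f"max response {best['response']:.3f} at x1={best['x1']:.2f}, x2={best['x2']:.2f}")
    ```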

  16. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid/easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  17. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  18. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  19. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    NASA Astrophysics Data System (ADS)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  20. Reverse Engineering and Software Products Reuse to Teach Collaborative Web Portals: A Case Study with Final-Year Computer Science Students

    ERIC Educational Resources Information Center

    Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio

    2010-01-01

    The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…

  1. The Effects of Animation Technique on Teaching of Acids and Bases Topics

    ERIC Educational Resources Information Center

    Dasdemir, Ikramettin; Doymus, Kemal; Simsek, Ümit; Karaçöp, Ataman

    2008-01-01

    This study has been carried out in order to determine the effect of computer animations in teaching acid and base topics in science and technology courses on the academic success of the primary school students and the opinions of students related to teaching with the animations. This research was conducted by the participation of 55 students from…

  2. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Only fragmented index excerpts are available for this record. Methodological topics include risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), and computer simulation; cited work includes Souder's evaluation of the effectiveness of mathematical models for R&D project selection (Management Science, April 1973); index terms include proficiency test scores, radiation effects on aircrew performance, and reaction time.

  3. CMT for soil science applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clausnitzer, V.; Hopmans, J.W.

    Today, x-ray computed microtomography provides us with the ability to noninvasively measure porous-media properties at a scale approaching 10 {mu}m. In contrast, traditional measurement techniques are either destructive or invasive, while still providing only limited information. Because the output from x-ray CT is directly related to density and atomic number, it is well suited for phase identification and concentration measurements.
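
    Because the CT grayscale is related to density and atomic number, phase identification can be sketched as simple thresholding of voxel values. The example below is illustrative only; the threshold values, the normalized toy volume, and the three-phase labelling are assumptions rather than part of the cited work.

    ```python
    import numpy as np

    def segment_phases(ct_volume, air_max=0.2, water_max=0.6):
        """Label each voxel as air, water, or solid using grayscale thresholds.
        The thresholds are illustrative; real scans require calibration."""
        phases = np.zeros(ct_volume.shape, dtype=np.uint8)                  # 0 = air
        phases[(ct_volume > air_max) & (ct_volume <= water_max)] = 1        # 1 = water
        phases[ct_volume > water_max] = 2                                   # 2 = solid grains
        return phases

    volume = np.random.rand(64, 64, 64)      # toy normalized attenuation values
    labels = segment_phases(volume)
    porosity = np.mean(labels != 2)          # fraction of non-solid voxels
    print(f"estimated porosity: {porosity:.2f}")
    ```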

  4. SBL-Online: Implementing Studio-Based Learning Techniques in an Online Introductory Programming Course to Address Common Programming Errors and Misconceptions

    ERIC Educational Resources Information Center

    Polo, Blanca J.

    2013-01-01

    Much research has been done in regards to student programming errors, online education and studio-based learning (SBL) in computer science education. This study furthers this area by bringing together this knowledge and applying it to proactively help students overcome impasses caused by common student programming errors. This project proposes a…

  5. Artificial Intelligence: An Analysis of the Technology for Training. Training and Development Research Center Project Number Fourteen.

    ERIC Educational Resources Information Center

    Sayre, Scott Alan

    The ultimate goal of the science of artificial intelligence (AI) is to establish programs that will use algorithmic computer techniques to imitate the heuristic thought processes of humans. Most AI programs, especially expert systems, organize their knowledge into three specific areas: data storage, a rule set, and a control structure. Limitations…

  6. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.

  7. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.

  8. A Procedure for Extending Input Selection Algorithms to Low Quality Data in Modelling Problems with Application to the Automatic Grading of Uploaded Assignments

    PubMed Central

    Otero, José; Palacios, Ana; Suárez, Rosario; Junco, Luis

    2014-01-01

    When selecting relevant inputs in modeling problems with low quality data, the ranking of the most informative inputs is also uncertain. In this paper, this issue is addressed through a new procedure that allows the extending of different crisp feature selection algorithms to vague data. The partial knowledge about the ordinal of each feature is modelled by means of a possibility distribution, and a ranking is hereby applied to sort these distributions. It will be shown that this technique makes the most use of the available information in some vague datasets. The approach is demonstrated in a real-world application. In the context of massive online computer science courses, methods are sought for automatically providing the student with a qualification through code metrics. Feature selection methods are used to find the metrics involved in the most meaningful predictions. In this study, 800 source code files, collected and revised by the authors in classroom Computer Science lectures taught between 2013 and 2014, are analyzed with the proposed technique, and the most relevant metrics for the automatic grading task are discussed. PMID:25114967
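
    A much simpler surrogate for the rank uncertainty that the authors model with possibility distributions is to rerun a crisp ranking on bootstrap resamples and examine the spread of each feature's rank. The sketch below does exactly that, using mutual information as the crisp criterion; the synthetic data and the bootstrap scheme are assumptions and are not the paper's possibilistic method.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.feature_selection import mutual_info_regression

    # Synthetic stand-in for a low-quality dataset.
    X, y = make_regression(n_samples=200, n_features=8, n_informative=3, noise=10.0)

    ranks = []
    rng = np.random.default_rng(0)
    for _ in range(50):
        idx = rng.integers(0, len(y), len(y))                 # bootstrap resample
        scores = mutual_info_regression(X[idx], y[idx])       # crisp ranking criterion
        ranks.append(np.argsort(np.argsort(-scores)))         # rank 0 = most informative

    ranks = np.array(ranks)
    for j in range(X.shape[1]):
        print(f"feature {j}: median rank {np.median(ranks[:, j]):.0f}, "
              f"spread {ranks[:, j].std():.2f}")
    ```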

  9. Planetary Radio Interferometry and Doppler Experiment (PRIDE) technique: A test case of the Mars Express Phobos Flyby. II. Doppler tracking: Formulation of observed and computed values, and noise budget

    NASA Astrophysics Data System (ADS)

    Bocanegra-Bahamón, T. M.; Molera Calvés, G.; Gurvits, L. I.; Duev, D. A.; Pogrebenko, S. V.; Cimò, G.; Dirkx, D.; Rosenblatt, P.

    2018-01-01

    Context. Closed-loop Doppler data obtained by deep space tracking networks, such as the NASA Deep Space Network (DSN) and the ESA tracking station network (Estrack), are routinely used for navigation and science applications. By shadow tracking the spacecraft signal, Earth-based radio telescopes involved in the Planetary Radio Interferometry and Doppler Experiment (PRIDE) can provide open-loop Doppler tracking data only when the dedicated deep space tracking facilities are operating in closed-loop mode. Aims: We explain the data processing pipeline in detail and discuss the capabilities of the technique and its potential applications in planetary science. Methods: We provide the formulation of the observed and computed values of the Doppler data in PRIDE tracking of spacecraft and demonstrate the quality of the results using an experiment with the ESA Mars Express spacecraft as a test case. Results: We find that the Doppler residuals and the corresponding noise budget of the open-loop Doppler detections obtained with the PRIDE stations compare to the closed-loop Doppler detections obtained with dedicated deep space tracking facilities.

  10. First-Principles Study of Superconductivity in Ultra- thin Pb Films

    NASA Astrophysics Data System (ADS)

    Noffsinger, Jesse; Cohen, Marvin L.

    2010-03-01

    Recently, superconductivity in ultrathin layered Pb has been confirmed in samples with as few as two atomic layers [S. Qin, J. Kim, Q. Niu, and C.-K. Shih, Science 2009]. Interestingly, the prototypical strong-coupling superconductor exhibits different Tc's for differing surface reconstructions in samples with only two monolayers. Additionally, Tc is seen to oscillate as the number of atomic layers is increased. Using first principles techniques based on Wannier functions, we analyze the electronic structure, lattice dynamics and electron-phonon coupling for varying thicknesses and surface reconstructions of layered Pb. We discuss results as they relate to superconductivity in the bulk, for which accurate calculations of superconducting properties can be compared to experiment [W. L. McMillan and J.M. Rowell, PRL 1965]. This work was supported by National Science Foundation Grant No. DMR07-05941, the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Computational resources have been provided by the Lawrencium computational cluster resource provided by the IT Division at the Lawrence Berkeley National Laboratory (Supported by the Director, Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231)

  11. Optical information processing at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reid, Max B.; Bualat, Maria G.; Cho, Young C.; Downie, John D.; Gary, Charles K.; Ma, Paul W.; Ozcan, Meric; Pryor, Anna H.; Spirkovska, Lilly

    1993-01-01

    The combination of analog optical processors with digital electronic systems offers the potential of tera-OPS computational performance, while often requiring less power and weight relative to all-digital systems. NASA is working to develop and demonstrate optical processing techniques for on-board, real time science and mission applications. Current research areas and applications under investigation include optical matrix processing for space structure vibration control and the analysis of Space Shuttle Main Engine plume spectra, optical correlation-based autonomous vision for robotic vehicles, analog computation for robotic path planning, free-space optical interconnections for information transfer within digital electronic computers, and multiplexed arrays of fiber optic interferometric sensors for acoustic and vibration measurements.

  12. An application for multi-person task synchronization

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee

    1990-01-01

    Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.

  13. Astro Data Science: The Next Generation

    NASA Astrophysics Data System (ADS)

    Mentzel, Chris

    2018-01-01

    Astronomers have been at the forefront of data-driven discovery since before the days of Kepler. Using data in the scientific inquiry into the workings of the universe is the lifeblood of the field. This said, data science is considered a new thing, and researchers from every discipline are rushing to learn data science techniques, train themselves on data science tools, and even leave academia to become data scientists. It is undeniable that our ability to harness new computational and statistical methods to make sense of today’s unprecedented size, complexity, and fast streaming data is helping scientists make new discoveries. The question now is how to ensure that researchers can employ these tools and use them appropriately. This talk will cover the state of data science as it relates to scientific research and the role astronomers play in its development, use, and training the next generation of astro-data scientists.

  14. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  15. Future computing platforms for science in a power constrained era

    DOE PAGES

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; ...

    2015-12-23

    Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. In conclusion, we evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG).

  16. Computer Simulated Development of Improved Command to Line-of-Sight Missile Guidance Techniques

    DTIC Science & Technology

    1979-03-01

    Only garbled OCR fragments are available for this record: a thesis from the Naval Postgraduate School, Monterey, CA, submitted by a United States Navy officer (B.S., United States Naval Academy, 1967) in partial fulfillment of the requirements for a Master of Science degree; the bibliography cites the U.S. Army Foreign Science and Technology Center.

  17. [Imaging and the new fabric of the human body].

    PubMed

    Moulin, Anne-Marie; Baulieu, Jean-Louis

    2010-11-01

    A short historical survey recalls the main techniques of medical imaging, based on modern physico-chemistry and computer science. Imagery has provided novel visions of the inside of the body, which are not self-obvious but require a training of the gaze. Yet, these new images have permeated the contemporary mind and inspired esthetic ventures. The popularity of these images may be related to their ambiguous status, between real and virtual. The images, reminiscent of Vesalius' De humani corporis fabrica, crosslink art, science and society in a specific way: which role will they play in the "empowerment" of the tomorrow patient?

  18. Computational materials design of crystalline solids.

    PubMed

    Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron

    2016-11-07

    The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.

  19. New Developments and Geoscience Applications of Synchrotron Computed Microtomography (Invited)

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Wang, Y.; Newville, M.; Sutton, S. R.; Yu, T.; Lanzirotti, A.

    2013-12-01

    Computed microtomography is the extension to micron spatial resolution of the CAT scanning technique developed for medical imaging. Synchrotron sources are ideal for the method, since they provide a monochromatic, parallel beam with high intensity. High energy storage rings such as the Advanced Photon Source at Argonne National Laboratory produce x-rays with high energy, high brilliance, and high coherence. All of these factors combine to produce an extremely powerful imaging tool for earth science research. Techniques that have been developed include: absorption and phase contrast computed tomography with spatial resolution below one micron; differential contrast computed tomography, imaging above and below the absorption edge of a particular element; high-pressure tomography, imaging inside a pressure cell at pressures above 10 GPa; high speed radiography and tomography, with 100 microsecond temporal resolution; fluorescence tomography, imaging the 3-D distribution of elements present at ppm concentrations; and radiographic strain measurements during deformation at high confining pressure, combined with precise x-ray diffraction measurements to determine stress. These techniques have been applied to important problems in earth and environmental sciences, including: the 3-D distribution of aqueous and organic liquids in porous media, with applications in contaminated groundwater and petroleum recovery; the kinetics of bubble formation in magma chambers, which control explosive volcanism; studies of the evolution of the early solar system from 3-D textures in meteorites; accurate crystal size distributions in volcanic systems, important for understanding the evolution of magma chambers; the equation of state of amorphous materials at high pressure, using both direct measurements of volume as a function of pressure and measurements of the change in x-ray absorption coefficient as a function of pressure; the location and chemical speciation of toxic elements such as arsenic and nickel in soils and in plant tissues at contaminated Superfund sites; and the strength of earth materials under the pressure and temperature conditions of the Earth's mantle, providing insights into plate tectonics and the generation of earthquakes.

  20. Target recognition based on the moment functions of radar signatures

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Tae; Kim, Hyo-Tae

    2002-03-01

    In this paper, we present the results of target recognition research based on the moment functions of various radar signatures, such as time-frequency signatures, range profiles, and scattering centers. The proposed approach utilizes geometrical moments or central moments of the obtained radar signatures. In particular, we derived exact and closed form expressions of the geometrical moments of the adaptive Gaussian representation (AGR), which is one of the adaptive joint time-frequency techniques, and also computed the central moments of range profiles and one-dimensional (1-D) scattering centers on a target, which are obtained by various super-resolution techniques. The obtained moment functions are further processed to provide small dimensional and redundancy-free feature vectors, and classified via a neural network approach or a Bayes classifier. The performances of the proposed technique are demonstrated using a simulated radar cross section (RCS) data set, or a measured RCS data set of various scaled aircraft models, obtained at the Pohang University of Science and Technology (POSTECH) compact range facility. Results show that the techniques in this paper can not only provide reliable classification accuracy, but also save computational resources.
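
    The central moments of a range profile can be computed directly by treating the magnitude profile as a distribution over range bins. The following is a minimal sketch under that reading; the toy profile and the chosen moment orders are assumptions, not the authors' data or exact feature set.

    ```python
    import numpy as np

    def central_moments(profile, orders=(1, 2, 3, 4)):
        """Normalized central moments of a 1-D range profile, treating the
        magnitude profile as a probability distribution over range bins."""
        p = np.abs(profile).astype(float)
        p /= p.sum()
        bins = np.arange(len(p))
        mean = np.sum(bins * p)
        return np.array([np.sum(((bins - mean) ** k) * p) for k in orders])

    # Toy profile: two dominant scattering centres plus noise.
    rng = np.random.default_rng(1)
    profile = rng.normal(0.0, 0.05, 128)
    profile[40] += 1.0
    profile[85] += 0.6

    features = central_moments(profile)   # compact, low-dimensional feature vector
    print(features)
    ```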

  1. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  2. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  3. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  4. Image steganalysis using Artificial Bee Colony algorithm

    NASA Astrophysics Data System (ADS)

    Sajedi, Hedieh

    2017-09-01

    Steganography is the science of secure communication where the presence of the communication cannot be detected while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information takes extensive execution time and computational sources most of the time. As a result, it is needed to employ a phase of preprocessing, which can moderate the execution time and computational sources. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images from the cover (clean) images with JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC). ABC algorithm is inspired by honeybees' social behaviour in their search for perfect food sources. In the proposed method, classifier performance and the dimension of the selected feature vector depend on using wrapper-based methods. The experiments are performed using two large data-sets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to the other existing techniques.
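
    The wrapper idea, in which classifier performance scores each candidate feature subset, can be sketched with a simplified bee-colony-style neighbourhood search. The example below is a rough stand-in for the improved ABC used in the paper; the synthetic data, SVM classifier, and search parameters are assumptions rather than the authors' image features or algorithm details.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=40, n_informative=8)
    rng = np.random.default_rng(0)

    def fitness(mask):
        """Wrapper evaluation: classifier accuracy on the selected feature subset."""
        if mask.sum() == 0:
            return 0.0
        return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

    # Each "food source" is a boolean feature mask; employed bees explore a
    # neighbouring mask by flipping one feature and keep it if accuracy improves.
    sources = [rng.random(X.shape[1]) < 0.3 for _ in range(10)]
    scores = [fitness(m) for m in sources]

    for _ in range(20):
        for i, mask in enumerate(sources):
            neighbor = mask.copy()
            neighbor[rng.integers(X.shape[1])] ^= True      # flip one feature
            s = fitness(neighbor)
            if s > scores[i]:
                sources[i], scores[i] = neighbor, s

    best = int(np.argmax(scores))
    print(f"best accuracy {scores[best]:.3f} with {sources[best].sum()} features")
    ```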

  5. DataView: a computational visualisation system for multidisciplinary design and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting underlining information from huge volumes of multivariate data become essential to all decision-making processes in sectors like finance, government, medical care, climate analysis, industries, science, etc. Remarkably, visualisation is recognised as a fundamental technology that props up human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of multiphysics problem-solvers widely used in engineering fields. The DataView is functionally composed of techniques for table/diagram representation, and graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapts to disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate effectiveness of the visualisation techniques.
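
    For the scalar-field side of such a system, a minimal illustration is to sample a field on a regular grid and render filled contours; the analytic field below is an assumption for demonstration and is far simpler than the linear and non-linear meshes DataView supports.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Sample an assumed analytic scalar field on a regular grid.
    x, y = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))
    field = np.exp(-(x ** 2 + y ** 2)) * np.cos(3 * x)

    # Filled-contour rendering of the scalar field.
    plt.contourf(x, y, field, levels=20, cmap="viridis")
    plt.colorbar(label="scalar value")
    plt.title("Scalar field on a regular grid")
    plt.savefig("scalar_field.png", dpi=150)
    ```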

  6. A Financial Technology Entrepreneurship Program for Computer Science Students

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  7. Why interdisciplinary research enriches the study of crime. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Donnay, Karsten

    2015-03-01

    The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].

  8. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    ERIC Educational Resources Information Center

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  9. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translator. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  10. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  11. Cheyney University Curriculum and Infrastructure Enhamcement in STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eva, Sakkar Ara

    Cheyney University is the oldest historically Black educational institution in America. Initially established as a “normal” school emphasizing the matriculation of educators, Cheyney has become a comprehensive university, one of 14 state universities comprising the Pennsylvania State System of Higher Education (PASSHE). Cheyney University graduates still become teachers, but they also enter such fields as journalism, medicine, science, mathematics, law, communication and government. Cheyney University is a small state-owned HBCU with very limited resources. At present the university has about a thousand students, with 15% in STEM. The CUCIES II grant made a significant contribution to saving the computer science program from being discontinued at the university. The grant enabled the university to hire temporary faculty to teach in and update the computer science program. The program is enhanced with three tracks: cyber security, human-computer interaction, and general. The updated and enhanced computer science program will prepare professionals in the area of computer science with the knowledge, skills, and professional ethics needed for the current market. The new curriculum was developed for a professional profile focused on the technologies and techniques currently used in industry. With faculty on board, the university worked with the department to bring back the computer science program from moratorium. Once on the path to being discontinued and losing students, the program is now growing; the number of students has increased from 12 to 30. The university is currently in the process of hiring tenure-track faculty in the computer science program. Another product of the grant is the proposal for an introductory course in nanotechnology, intended to generate interest in the nanotechnology field. The Natural and Applied Science department, which houses all of the STEM programs at Cheyney University, is currently working to bring back the environmental science program from moratorium. The university has been working to improve minority participation in STEM and has made significant strides in progressing students toward graduate programs and into the professoriate track. This success is due to faculty mentors who work closely with students to guide them through the application processes for research internships and graduate programs; it is also due to the university forming collaborative agreements with research-intensive institutions, federal and state agencies, and industry. The grant assisted in recruiting and retaining students in STEM by offering tuition scholarships, research scholarships, and travel awards. Faculty professional development was supported by the grant by funding travel to conferences, meetings, and webinars. Like many HBCUs, Cheyney University is trying to do more with less. Because the STEM programs are inherently expensive, they are the ones that suffer most when resources are scarce. One of the goals of Cheyney University's strategic plan is to strengthen STEM programs coherent with the critical skill needs of the Department of Energy. All of the Cheyney University STEM programs are now located in the new science building funded by the state of Pennsylvania.

  12. Contemporary machine learning: techniques for practitioners in the physical sciences

    NASA Astrophysics Data System (ADS)

    Spears, Brian

    2017-10-01

    Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
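
    A compact sketch of the supervised and unsupervised workflow outlined above, using generic scikit-learn tools on synthetic data (the dataset, model choices, and cluster count are assumptions for illustration): fit a regression model with a held-out test set to check generalization, then reduce dimensionality and cluster.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_regression
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    # Supervised learning: model a dependent output from complicated inputs,
    # holding out test data to check that the model generalizes beyond training.
    X, y = make_regression(n_samples=500, n_features=6, noise=5.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("held-out R^2:", model.score(X_test, y_test))

    # Unsupervised learning: reduce the input dimensionality, then cluster the samples.
    X_low = PCA(n_components=2).fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_low)
    print("cluster sizes:", np.bincount(labels))
    ```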

  13. IAIMS development at Harvard Medical School.

    PubMed Central

    Barnett, G O; Greenes, R A; Zielstorff, R D

    1988-01-01

    The long-range goal of this IAIMS development project is to achieve an Integrated Academic Information Management System for the Harvard Medical School, the Francis A. Countway Library of Medicine, and Harvard's affiliated institutions and their respective libraries. An "opportunistic, incremental" approach to planning has been devised. The projects selected for the initial phase are to implement an increasingly powerful electronic communications network, to encourage the use of a variety of bibliographic and information access techniques, and to begin an ambitious program of faculty and student education in computer science and its applications to medical education, medical care, and research. In addition, we will explore means to promote better collaboration among the separate computer science units in the various schools and hospitals. We believe that our planning approach will have relevance to other educational institutions where lack of strong central organizational control prevents a "top-down" approach to planning. PMID:3416098

  14. Critical branching neural networks.

    PubMed

    Kello, Christopher T

    2013-01-01

    It is now well-established that intrinsic variations in human neural and behavioral activity tend to exhibit scaling laws in their fluctuations and distributions. The meaning of these scaling laws is an ongoing matter of debate between isolable causes versus pervasive causes. A spiking neural network model is presented that self-tunes to critical branching and, in doing so, simulates observed scaling laws as pervasive to neural and behavioral activity. These scaling laws are related to neural and cognitive functions, in that critical branching is shown to yield spiking activity with maximal memory and encoding capacities when analyzed using reservoir computing techniques. The model is also shown to account for findings of pervasive 1/f scaling in speech and cued response behaviors that are difficult to explain by isolable causes. Issues and questions raised by the model and its results are discussed from the perspectives of physics, neuroscience, computer and information sciences, and psychological and cognitive sciences.
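
    The notion of self-tuning toward critical branching can be sketched with a toy process that estimates the branching ratio (descendant spikes per ancestor spike) and nudges a transmission probability toward a ratio of one. The sketch below is a simplified stand-in, not the paper's spiking network model; all parameters are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons, fan_out = 200, 5
    p_transmit = 0.05            # probability a spike propagates along a connection

    # Nudge the transmission probability so that, on average, each ancestor spike
    # triggers about one descendant spike (branching ratio near 1 = critical).
    for step in range(2000):
        ancestors = rng.random(n_neurons) < 0.02                  # spontaneous spikes
        descendants = rng.random((ancestors.sum(), fan_out)) < p_transmit
        sigma = descendants.sum() / max(ancestors.sum(), 1)       # estimated branching ratio
        p_transmit += 0.001 * (1.0 - sigma)                       # simple local adjustment
        p_transmit = float(np.clip(p_transmit, 0.0, 1.0))

    print(f"tuned transmission probability {p_transmit:.3f}, "
          f"expected branching ratio {p_transmit * fan_out:.2f}")
    ```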

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Azevedo, Eduardo; Abbott, Stephen; Koskela, Tuomas

    The XGC fusion gyrokinetic code combines state-of-the-art, portable computational and algorithmic technologies to enable complicated multiscale simulations of turbulence and transport dynamics in ITER edge plasma on the largest US open-science computer, the CRAY XK7 Titan, at its maximal heterogeneous capability. Such simulations have not been possible before because of a more than tenfold shortfall in time-to-solution relative to the target of less than 5 days of wall-clock time for one physics case. Frontier techniques employed include nested OpenMP parallelism, adaptive parallel I/O, staging I/O and data reduction using dynamic and asynchronous application interactions, and dynamic repartitioning.

  16. Computational Material Modeling of Hydrated Cement Paste Calcium Silicate Hydrate (C-S-H) Chemistry Structure - Influence of Magnesium Exchange on Mechanical Stiffness: C-S-H Jennite

    DTIC Science & Technology

    2015-04-27

    MODELING OF C-S-H: Material chemistry-level modeling following the principles and techniques commonly grouped under Computational Material Science is... Henmi, C. and Kusachi, I. Monoclinic tobermorite from Fuka, Bitchu-cho, Okayama Prefecture, Japan. J. Min. Petr. Econ. Geol. (1989) 84:374-379. [22...31] Liu, Y. et al. First principles study of the stability and mechanical properties of MC (M = Ti, V, Zr, Nb, Hf and Ta) compounds. Journal of Alloys and Compounds (2014) 582:500-504.

  17. Computer Science | Classification | College of Engineering & Applied

    Science.gov Websites

    Faculty listed include Adrian Dumitrescu, Ph.D., Professor of Computer Science, (414) 229-4265; Hossein Hosseini, Ph.D., Professor of Computer Science, (414) 229-5184, hosseini@uwm.edu, Eng & Math Sciences 1091; and Amol Mali, Ph.D., Associate Professor of Computer Science.

  18. Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?

    ERIC Educational Resources Information Center

    Schrock, John Richard

    1984-01-01

    Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…

  19. Augmenting Sand Simulation Environments through Subdivision and Particle Refinement

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2012-12-01

    Recent advances in computer graphics and parallel processing hardware have provided disciplines with new methods to evaluate and visualize data. These advances have proven useful for earth and planetary scientists as many researchers are using this hardware to process large amounts of data for analysis. As such, this has provided opportunities for collaboration between computer graphics and the earth sciences. Through collaboration with the Oregon Space Grant and IGERT Ecosystem Informatics programs, we are investigating techniques for simulating the behavior of sand. We are also collaborating with the Jet Propulsion Laboratory's (JPL) DARTS Lab to exchange ideas and gain feedback on our research. The DARTS Lab specializes in simulation of planetary vehicles, such as the Mars rovers. Their simulations utilize a virtual "sand box" to test how a planetary vehicle responds to different environments. Our research builds upon this idea to create a sand simulation framework so that planetary environments, such as the harsh, sandy regions on Mars, are more fully realized. More specifically, we are focusing our research on the interaction between a planetary vehicle, such as a rover, and the sand beneath it, providing further insight into its performance. Unfortunately, this can be a computationally complex problem, especially when trying to represent the enormous quantities of sand particles interacting with each other. However, through the use of high-performance computing, we have developed a technique to subdivide areas of actively participating sand regions across a large landscape. Similar to a Level of Detail (LOD) technique, we only subdivide regions of a landscape where sand particles are actively interacting with another object. While the sand is within this subdivision window and moves closer to the surface of the interacting object, the sand region subdivides into smaller regions until individual sand particles are left at the surface. As an example, consider a planetary rover interacting with our sand simulation environment. Sand that is actively interacting with a rover wheel will be represented as individual particles, whereas sand that is further under the surface will be represented by larger regions of sand. This technique allows many particles to be represented without the full computational cost of simulating every particle individually. In developing this method, we have further generalized these subdivision regions to any volumetric area suitable for use in the simulation, an improvement that allows for more compact subdivision regions. This helps to fine tune the simulation so that more emphasis can be placed on regions of actively participating sand. We feel that through the generalization of our technique, our research can provide other opportunities within the earth and planetary sciences. Through collaboration with our academic colleagues, we continue to refine our technique and look for other opportunities to utilize our research.
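
    The adaptive subdivision idea described above can be sketched compactly. The following toy example, in 2-D quadtree form and with hypothetical function and variable names, keeps regions coarse far from the contact point and refines them down to particle size near it; it illustrates the concept only and is not the authors' implementation.

        # Sketch of the adaptive subdivision idea: regions near an interacting object
        # keep splitting until individual "particles" remain; distant regions stay coarse.
        # A 2-D quadtree stands in for the full 3-D implementation.
        def subdivide(region, contact_point, min_size):
            """region = (x, y, size); returns leaf regions, finest near contact_point."""
            x, y, size = region
            near = (x <= contact_point[0] <= x + size) and (y <= contact_point[1] <= y + size)
            if size <= min_size or not near:
                return [region]                     # coarse block or single particle
            half = size / 2.0
            children = [(x, y, half), (x + half, y, half),
                        (x, y + half, half), (x + half, y + half, half)]
            leaves = []
            for child in children:
                leaves.extend(subdivide(child, contact_point, min_size))
            return leaves

        leaves = subdivide((0.0, 0.0, 64.0), contact_point=(10.0, 20.0), min_size=1.0)
        print(len(leaves), "regions, finest size:", min(size for _, _, size in leaves))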

  20. Numerical Modeling of Ocean Circulation

    NASA Astrophysics Data System (ADS)

    Miller, Robert N.

    2007-01-01

    The modelling of ocean circulation is important not only for its own sake, but also in terms of the prediction of weather patterns and the effects of climate change. This book introduces the basic computational techniques necessary for all models of the ocean and atmosphere, and the conditions they must satisfy. It describes the workings of ocean models, the problems that must be solved in their construction, and how to evaluate computational results. Major emphasis is placed on examining ocean models critically, and determining what they do well and what they do poorly. Numerical analysis is introduced as needed, and exercises are included to illustrate major points. Developed from notes for a course taught in physical oceanography at the College of Oceanic and Atmospheric Sciences at Oregon State University, this book is ideal for graduate students of oceanography, geophysics, climatology and atmospheric science, and researchers in oceanography and atmospheric science. Key features: examples and critical examination of ocean modelling and results; demonstration of the strengths and weaknesses of different approaches; and exercises to illustrate major points and supplement mathematical and physical details.
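
    As a minimal example of the kind of basic computational technique such a text introduces, the sketch below advects a pulse with a first-order upwind finite-difference scheme; the CFL condition c*dt/dx <= 1 is the stability condition the scheme must satisfy. The example is ours and is not taken from the book.

        # First-order upwind finite differences for 1-D advection u_t + c u_x = 0,
        # with periodic boundaries. Stability requires the CFL number c*dt/dx <= 1.
        import numpy as np

        nx, c = 200, 1.0
        dx = 1.0 / nx
        dt = 0.8 * dx / c                      # CFL number 0.8 < 1
        x = np.linspace(0.0, 1.0, nx, endpoint=False)
        u = np.exp(-200.0 * (x - 0.3) ** 2)    # initial Gaussian pulse

        for _ in range(100):
            u = u - c * dt / dx * (u - np.roll(u, 1))   # periodic upwind update

        print("peak after 100 steps:", round(float(u.max()), 3))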

  1. Low-Level Graphics Cues For Solid Image Interpretation

    NASA Astrophysics Data System (ADS)

    McAnulty, Michael A.; Gemmill, Jill P.; Kegley, Kathleen A.; Chiu, Haw-Tsang

    1984-08-01

    Several straightforward techniques for displaying arbitrary solids of the sort encountered in the life sciences are presented, all variations of simple three-dimensional scatter plots. They are all targeted for a medium-cost raster display (an AED-512 has been used here). Practically any host computer may be used to implement them. All techniques are broadly applicable and were implemented as Master's degree projects. The major hardware constraint is data transmission speed, and this is met by minimizing the amount of graphical data, ignoring enhancement of the data, and using terminal scan-conversion and aspect firmware wherever possible. Three simple rendering techniques and the use of several graphics cues are described.

  2. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  3. Establishment of a Vaporous Hydrogen Peroxide Bio-Decontamination Capability

    DTIC Science & Technology

    2007-02-01

    of Colorado at Denver and Health Sciences Center. There he utilised mass spectrometry to investigate the biochemical pathways involved in lipid... techniques (NMR, GC). Since then she has worked in a variety of areas including: (a) computer simulation of vapour dispersion for early warning to...to inactivate biological agents such as B. anthracis and these include beta-propiolactone, chlorine dioxide, ethylene oxide, propylene oxide, ozone

  4. Novel Digital Signal Processing and Detection Techniques.

    DTIC Science & Technology

    1980-09-01

    decimation and interpolation [11, 12]. Submitted by: Bede Liu, Department of Electrical Engineering and Computer Science, Princeton University. ...on the use of recursive filters for decimation and interpolation. ...filter structure for realizing low-pass filters is developed [6, 7]. By employing decimation and interpolation, the filter uses only coefficients 0, +1, and
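
    The fragments above refer to decimation and interpolation filters. The sketch below, which assumes SciPy and NumPy are available, only illustrates those two sample-rate-changing operations; it does not reproduce the report's recursive filter structures or coefficient constraints.

        # Illustration of decimation and interpolation on a sampled signal using SciPy.
        import numpy as np
        from scipy import signal

        fs = 1000.0                                   # original sampling rate, Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 120 * t)

        x_dec = signal.decimate(x, q=4)               # low-pass filter + downsample by 4
        x_int = signal.resample_poly(x_dec, up=4, down=1)   # interpolate back up by 4

        print(len(x), len(x_dec), len(x_int))         # 1000 250 1000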

  5. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods to advance in science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide the scientific community with clear recommendations on which approaches are technically and scientifically feasible.
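
    As a toy illustration of the simple map-reduce pattern mentioned above, the sketch below runs a map stage (a per-record transform) and a reduce stage (aggregation by key) in plain Python; production systems would use Hadoop, Twister, or a similar framework, and the records shown are placeholders.

        # Toy map-reduce pattern on local data: map each record to a (key, value)
        # pair, then reduce by accumulating values per key.
        from functools import reduce

        records = [{"sensor": "A", "value": 2.0}, {"sensor": "B", "value": 3.5},
                   {"sensor": "A", "value": 1.5}, {"sensor": "B", "value": 0.5}]

        mapped = [(r["sensor"], r["value"]) for r in records]       # map stage

        def combine(acc, pair):                                     # reduce stage
            key, value = pair
            acc[key] = acc.get(key, 0.0) + value
            return acc

        totals = reduce(combine, mapped, {})
        print(totals)   # {'A': 3.5, 'B': 4.0}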

  6. Staff development and secondary science teachers: Factors that affect voluntary participation

    NASA Astrophysics Data System (ADS)

    Corley, Theresa Roebuck

    2000-10-01

    A researcher-designed survey assessed the perceptions of Alabama secondary science public school teachers toward the need for staff development and toward certain staff development strategies and programs. Factors that encouraged or discouraged attendance at voluntary staff development programs and opinions regarding effective and ineffective features of programs were identified. Data were analyzed using descriptive techniques. Percentages and frequencies were noted. Average rankings were computed for the staff development techniques considered most and least effective and for the preferred designs of future staff development offerings. Chi squares were computed to respond to each of the 4 research hypotheses. Narrative discussions and tables were utilized to report the data and provide clarification. This study related demographic information to the research hypotheses. Analysis of the research hypotheses revealed that experienced teachers agree more strongly about the features of staff development programs that they consider effective and about the factors that may affect participation in staff development programs. Analysis of the research questions revealed that secondary science teachers in Alabama agree that staff development is a personal responsibility but that the school systems are responsible for providing staff development opportunities. Teachers believe that staff development is needed annually in both science content and teaching strategies and favor lengthening the school year for staff development. Teachers identified interest level, graduate credit, ability to implement material, scheduling factors, and the reputation of the organizer as the most important factors in determining participation in voluntary staff development programs. Hands-on workshops were identified as the most effective type of voluntary staff development and teachers requested that future staff development experiences include hands-on workshops, networking, curriculum development, mentoring, support groups, training trainers, cooperative learning groups, coaching, implementing changes, and collecting resources.

  7. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  8. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  9. Molecular dynamics simulations through GPU video games technologies

    PubMed Central

    Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia

    2016-01-01

    Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the key techniques used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. MD has become a strategically important technique and now plays a key role in many areas of the exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations can require enormous amounts of computer memory and time, making their execution a major challenge. Despite the huge computational cost, molecular dynamics has traditionally been implemented on computers built around a central processing unit (CPU). Graphics processing unit (GPU) computing technology was first designed to improve video games by rapidly creating and displaying images in a frame buffer for output to a screen. The hybrid GPU-CPU implementation, combined with parallel computing, is a novel approach to performing a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations, including MD simulations. Herein, we describe the new methodologies, developed initially for video games, and how they are now applied in MD simulations. PMID:27525251
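
    To make the computational pattern concrete, the sketch below performs velocity-Verlet integration for independent harmonic oscillators in NumPy on the CPU; GPU molecular dynamics codes apply the same kind of update to millions of atoms in parallel. The example is illustrative only and is not taken from the review.

        # One velocity-Verlet MD loop for independent harmonic oscillators (CPU/NumPy).
        import numpy as np

        n, dt, k, m = 1000, 1e-3, 1.0, 1.0
        rng = np.random.default_rng(0)
        pos = rng.normal(size=(n, 3))
        vel = np.zeros((n, 3))

        def forces(r):
            return -k * r                          # harmonic restoring force

        f = forces(pos)
        for _ in range(1000):
            vel += 0.5 * dt * f / m                # half kick
            pos += dt * vel                        # drift
            f = forces(pos)
            vel += 0.5 * dt * f / m                # half kick

        energy = 0.5 * m * (vel ** 2).sum() + 0.5 * k * (pos ** 2).sum()
        print("total energy:", round(float(energy), 3))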

  10. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  11. 78 FR 10180 - Annual Computational Science Symposium; Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...

  12. eHealth Research from the User’s Perspective

    PubMed Central

    Hesse, Bradford W.; Shneiderman, Ben

    2007-01-01

    The application of Information Technology (IT) to issues of healthcare delivery has had a long and tortuous history in the U.S. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask “what can the computer do?” New advances in eHealth are prompting developers to ask “what can people do?” How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a health care system that is (a) safe, (b) effective (evidence-based), (c) patient-centered, and (d) timely. Relying on the eHealth researcher’s intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient/physician), group (family/staff), community, and broad environmental levels. PMID:17466825

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review for the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  14. Report from the MPP Working Group to the NASA Associate Administrator for Space Science and Applications

    NASA Technical Reports Server (NTRS)

    Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen

    1987-01-01

    NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the relationship to theory, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.

  15. On the importance of a rich embodiment in the grounding of concepts: perspectives from embodied cognitive science and computational linguistics.

    PubMed

    Thill, Serge; Padó, Sebastian; Ziemke, Tom

    2014-07-01

    The recent trend in cognitive robotics experiments on language learning, symbol grounding, and related issues necessarily entails a reduction of sensorimotor aspects from those provided by a human body to those that can be realized in machines, limiting robotic models of symbol grounding in this respect. Here, we argue that there is a need for modeling work in this domain to explicitly take into account the richer human embodiment even for concrete concepts that prima facie relate merely to simple actions, and illustrate this using distributional methods from computational linguistics which allow us to investigate grounding of concepts based on their actual usage. We also argue that these techniques have applications in theories and models of grounding, particularly in machine implementations thereof. Similarly, considering the grounding of concepts in human terms may be of benefit to future work in computational linguistics, in particular in going beyond "grounding" concepts in the textual modality alone. Overall, we highlight the overall potential for a mutually beneficial relationship between the two fields. Copyright © 2014 Cognitive Science Society, Inc.
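
    The distributional methods referred to above can be illustrated very simply: represent each word by its co-occurrence counts and compare words by cosine similarity. The toy corpus and window size below are placeholders, not the authors' data.

        # Toy distributional semantics: co-occurrence vectors + cosine similarity.
        import numpy as np
        from collections import Counter, defaultdict

        corpus = "we grasp the cup we grasp the handle we kick the ball we kick the can".split()
        window = 2
        cooc = defaultdict(Counter)
        for i, word in enumerate(corpus):
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    cooc[word][corpus[j]] += 1

        vocab = sorted(set(corpus))

        def vector(word):
            return np.array([cooc[word][v] for v in vocab], dtype=float)

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        print("grasp vs kick:", round(cosine(vector("grasp"), vector("kick")), 2))
        print("grasp vs cup: ", round(cosine(vector("grasp"), vector("cup")), 2))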

  16. Visions of the Future - the Changing Role of Actors in Data-Intensive Science

    NASA Astrophysics Data System (ADS)

    Schäfer, L.; Klump, J. F.

    2013-12-01

    Around the world scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology to telescope observations of distant galaxies in Astrophysics and data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently the traditional form of publication still carries greater weight - articles for conferences and journals. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. Thus, to what extent will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments - both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and consider steps that can achieve a positive development for the future.

  17. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  18. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  19. First-principles data-driven discovery of transition metal oxides for artificial photosynthesis

    NASA Astrophysics Data System (ADS)

    Yan, Qimin

    We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr, V, and Mn-based ternary TMOs in the database, we design a broadly-applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources were also provided by the Department of Energy through the National Energy Research Scientific Computing Center.
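
    A screening workflow of the kind described can be pictured as successive filters over computed properties. The sketch below is schematic only: the candidate records, field names, and thresholds are hypothetical, and a real workflow would query DFT results from a database such as the Materials Project rather than a hand-written list.

        # Schematic multiple-layer screening filter over hypothetical computed properties.
        candidates = [
            {"name": "oxide-1", "band_gap_eV": 1.8, "band_edges_ok": True,  "stable_pH_range": (4, 9)},
            {"name": "oxide-2", "band_gap_eV": 1.6, "band_edges_ok": True,  "stable_pH_range": (5, 8)},
            {"name": "oxide-3", "band_gap_eV": 3.4, "band_edges_ok": False, "stable_pH_range": (2, 12)},
        ]

        def passes(c, gap_window=(1.2, 2.5), required_pH=7.0):
            # Layer 1: band gap in the visible-absorption window.
            # Layer 2: band edges straddle the relevant redox potentials.
            # Layer 3: electrochemically stable at the operating pH (from a Pourbaix-style check).
            lo, hi = c["stable_pH_range"]
            return (gap_window[0] <= c["band_gap_eV"] <= gap_window[1]
                    and c["band_edges_ok"]
                    and lo <= required_pH <= hi)

        print([c["name"] for c in candidates if passes(c)])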

  20. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.

  1. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  2. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  3. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks in micro- to tens of nano-meter resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties as micron or sub-micron resolution can only be done on the sample-scale of millimeters or even less than a millimeter. We present here a recently developed computational workflow for the analysis of microstructures including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique on high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.
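
    A common first step in such pore-scale analyses can be sketched directly: compute the porosity of a segmented (binary) volume and check whether the pore phase percolates between opposite faces using connected-component labeling. The example below uses a random synthetic volume and SciPy; it is not the authors' renormalization workflow.

        # Porosity and a simple percolation check on a segmented micro-CT-like volume.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        pores = rng.random((64, 64, 64)) < 0.3          # True = pore voxel (synthetic)
        porosity = pores.mean()

        labels, _ = ndimage.label(pores)                # 6-connected components by default
        spanning = set(np.unique(labels[0, :, :])) & set(np.unique(labels[-1, :, :]))
        spanning.discard(0)                             # 0 is the solid background

        print(f"porosity = {porosity:.2f}, percolating along x: {bool(spanning)}")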

  4. High school computer science education paves the way for higher education: the Israeli case

    NASA Astrophysics Data System (ADS)

    Armoni, Michal; Gal-Ezer, Judith

    2014-07-01

    The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.

  5. Advanced Architectures for Astrophysical Supercomputing

    NASA Astrophysics Data System (ADS)

    Barsdell, B. R.; Barnes, D. G.; Fluke, C. J.

    2010-12-01

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). The hardware originally designed for speeding-up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation - performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem-types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.

  6. BEYOND THE PRINT—VIRTUAL PALEONTOLOGY IN SCIENCE PUBLISHING, OUTREACH, AND EDUCATION

    PubMed Central

    LAUTENSCHLAGER, STEPHAN; RÜCKLIN, MARTIN

    2015-01-01

    Virtual paleontology unites a variety of computational techniques and methods for the visualization and analysis of fossils. Due to their great potential and increasing availability, these methods have become immensely popular in the last decade. However, communicating the wealth of digital information and results produced by the various techniques is still hampered by traditional methods of publication. Transferring and processing three-dimensional information, such as interactive models or animations, into scientific publications still poses a challenge. Here, we present different methods and applications to communicate digital data in academia, outreach and education. Three-dimensional PDFs, QR codes, anaglyph stereo imaging, and rapid prototyping—methods routinely used in the engineering, entertainment, or medical industries—are outlined and evaluated for their potential in science publishing and public engagement. Although limitations remain, these are simple, mostly cost-effective, and powerful tools to create novel and innovative resources for education, public engagement, or outreach. PMID:26306051
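
    As a small example of one of the outreach tools mentioned, the snippet below generates a QR code that points readers to an interactive 3-D model. It assumes the third-party Python package qrcode (with Pillow) is installed, and the URL is a placeholder.

        # Generate a QR code linking to an interactive 3-D model (placeholder URL).
        import qrcode

        img = qrcode.make("https://example.org/specimen/interactive-model")
        img.save("specimen_qr.png")
        print("wrote specimen_qr.png")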

  7. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  8. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe; it is therefore one of the greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It focuses on various works that seek advances in Neuroscience, and on others that seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: Computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  9. Using conventional and in situ transmission electron microscopy techniques to understand nanoscale crystallography

    NASA Astrophysics Data System (ADS)

    Hudak, Bethany M.

    Science, technology, engineering, and mathematics (STEM) education has become an emphasized component of PreK-12 education in the United States. The US is struggling to produce enough science, mathematics, and technology experts to meet its national and global needs, and the mean scores of science and mathematics students are not meeting the expected levels desired by our leaders (Hossain & Robinson, 2011). In an effort to improve achievement scores in mathematics and science, school districts must consider many components that can contribute to the development of a classroom where students are engaged and growing academically. Computer technology (CT) for student use is a popular avenue for school districts to pursue in their goal to attain higher achievement. The purpose of this study is to examine the use of iPads in a one-to-one setting, where every student has his own device 24/7, to determine the effects, if any, on academic achievement in the areas of mathematics and science. This comparison study used hierarchical linear modeling (HLM) to examine three middle schools in a private school district. Two of the schools have implemented a one-to-one iPad program with their sixth through eighth grades and the third school uses computers on limited occasions in the classroom and in a computer lab setting. The questions addressed were what effect, if any, do the implementation of a one-to-one iPad program and a teacher's perception of his use of constructivist teaching strategies have on student academic achievement in the mathematics and science middle school classrooms. The research showed that although the program helped promote the use of constructivist activities through the use of technology, the one-to-one iPad initiative had no effect on academic achievement in the middle school mathematics and science classrooms.

  10. Nicholas Brunhart-Lupo | NREL

    Science.gov Websites

    Nicholas Brunhart-Lupo, Computational Science, Nicholas.Brunhart-Lupo@nrel.gov. Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines.

  11. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  12. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  13. Changing teaching techniques and adapting new technologies to improve student learning in an introductory meteorology and climate course

    NASA Astrophysics Data System (ADS)

    Cutrim, E. M.; Rudge, D.; Kits, K.; Mitchell, J.; Nogueira, R.

    2006-06-01

    Responding to the call for reform in science education, changes were made in an introductory meteorology and climate course offered at a large public university. These changes were a part of a larger project aimed at deepening and extending a program of science content courses that model effective teaching strategies for prospective middle school science teachers. Therefore, revisions were made to address misconceptions about meteorological phenomena, foster deeper understanding of key concepts, encourage engagement with the text, and promote inquiry-based learning. Techniques introduced include the use of flash cards, student reflection questionnaires, writing assignments, and interactive discussions on weather and forecast data using computer technology such as the Integrated Data Viewer (IDV). The revision process is described in a case study format. Preliminary results (self-reflection by the instructor, surveys of student opinion, and measurements of student achievement) suggest that student learning has been positively influenced. This study is supported by three grants: NSF grant No. 0202923, the Unidata Equipment Award, and the Lucia Harrison Endowment Fund.

  14. Standardized input for Hanford environmental impact statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.

    1981-05-01

    Models and computer programs for simulating the behavior of radionuclides in the environment and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.

  15. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).

  16. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no comparably close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases or the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method, initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe lattice at zero temperature and then we apply this formalism to the K-SAT problem defined in Chapter 1. The phase transition which physicists study often corresponds to a change in the computational complexity of the corresponding computer science problem. Chapter 3 presents phase transitions which are specific to the problems discussed in Chapter 1 and also known results for the K-SAT problem. We discuss the replica method and experimental evidence of replica symmetry breaking. The physics approach to hard problems is based on replica methods which are difficult to understand. In Chapter 4 we develop novel methods for studying hard problems using methods similar to the message passing techniques that were discussed in Chapter 2. Although we concentrated on the symmetric case, cavity methods show promise for generalizing our methods to the unsymmetric case. As has been highlighted by John Hopfield, several key features of biological systems are not shared by physical systems. Although living entities follow the laws of physics and chemistry, the fact that organisms adapt and reproduce introduces an essential ingredient that is missing in the physical sciences. In order to extract information from networks, many algorithms have been developed. In Chapter 5 we apply polynomial algorithms like minimum spanning tree in order to study and construct gene regulatory networks from experimental data. As future work we propose the use of algorithms like min-cut/max-flow and Dijkstra for understanding key properties of these networks.
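
    One of the polynomial algorithms mentioned above, the minimum spanning tree, can be illustrated with networkx on a small weighted graph; the gene names and edge weights below are synthetic placeholders, not experimental data.

        # Minimum spanning tree (and a Dijkstra shortest path) on a toy weighted graph.
        import networkx as nx

        G = nx.Graph()
        edges = [("geneA", "geneB", 0.2), ("geneA", "geneC", 0.9),
                 ("geneB", "geneC", 0.3), ("geneC", "geneD", 0.4),
                 ("geneB", "geneD", 0.8)]
        G.add_weighted_edges_from(edges)        # weight ~ dissimilarity between expression profiles

        mst = nx.minimum_spanning_tree(G, weight="weight")
        print(sorted(mst.edges(data="weight")))

        # Shortest path by accumulated weight, as also suggested in the text above.
        print(nx.shortest_path(G, "geneA", "geneD", weight="weight"))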

  17. Curricular Influences on Female Afterschool Facilitators' Computer Science Interests and Career Choices

    NASA Astrophysics Data System (ADS)

    Koch, Melissa; Gorges, Torie

    2016-10-01

    Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.

  18. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  19. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  20. ZTF Undergraduate Astronomy Institute at Caltech and Pomona College

    NASA Astrophysics Data System (ADS)

    Penprase, Bryan Edward; Bellm, Eric Christopher

    2017-01-01

    From the new Zwicky Transient Facility (ZTF), an NSF-funded project based at Caltech, comes a new initiative for undergraduate research known as the Summer Undergraduate Astronomy Institute. The Institute brings together 15-20 students from across the world for an immersive experience in astronomy techniques before they begin their summer research projects. The students are primarily based at Caltech through its SURF program, but the Institute also includes a large cohort of students enrolled in research internships at Pomona College in nearby Claremont, CA. The program is intended to introduce students to research techniques in astronomy, laboratory and computational technologies, and observational astronomy. Since many of the students are computer science or physics majors with little prior astronomy experience, this immersive experience has been extremely helpful in enabling students to learn the terminology, techniques, and technologies of astronomy. The field trips to the Mount Wilson and Palomar telescopes deepen their knowledge and excitement about astronomy. Lectures about astronomical research from Caltech staff scientists and graduate students also provide context for the student research. Perhaps more importantly, the creation of a cohort of like-minded students, and the chance to reflect about careers in astronomy and research, give these students opportunities to consider themselves as future research scientists and help them immensely as they move forward in their careers. We discuss some of the social and intercultural aspects of the experience as well, as our cohorts typically include international students from many countries and several students from under-represented groups in science.

  1. Use of the computational-informational web-GIS system for the development of climatology students' skills in modeling and understanding climate change

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Martynova, Yulia; Shulgina, Tamara

    2015-04-01

    The current situation with the training of specialists in environmental sciences is complicated by the fact that the scientific field itself is experiencing a period of rapid development. Global change has driven the development of measurement techniques and the modeling of environmental characteristics, accompanied by an expansion of the conceptual and mathematical apparatus. Understanding and forecasting processes in the Earth system requires extensive use of mathematical modeling and advanced computing technologies. As a rule, available training programs in the environmental sciences do not have time to adapt to such rapid changes in the domain content. As a result, graduates do not understand the processes and mechanisms of global change and have only superficial knowledge of the mathematical modeling of processes in the environment. They do not have the required skills in numerical modeling, data processing, and analysis of observations and computation outputs, and they are not prepared to work with meteorological data. For adequate training of future specialists in environmental sciences we propose the following approach, which reflects the new "research" paradigm in education. We believe that the training of such specialists should be done not in an artificial learning environment but on the basis of actual operating information-computational systems used in environmental studies, that is, in a virtual research environment, via the development of virtual research and learning laboratories. In this report the results of using the computational-informational web-GIS system "Climate" (http://climate.scert.ru/) as a prototype of such a laboratory are discussed. The approach is being realized at Tomsk State University to prepare bachelors in meteorology. A student survey shows that their knowledge became deeper and more systematic after training in the virtual learning laboratory. The scientific team plans to assist educators in utilizing the system in Earth science education. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.

  2. Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.

    2014-12-01

    In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes, which will increase the size of current data archives by a factor of 10-100. For example, the next Climate Model Inter-comparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 petabytes, while the upcoming next generation of NASA decadal Earth Observing instruments is expected to collect tens of gigabytes per day. In radio astronomy, the Square Kilometre Array (SKA) will collect data in the exabytes-per-day range, of which (after reduction and processing) around 1.5 exabytes per year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that allows system architects to model their expected data processing workflow and determine the network, computational, and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrarily complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for four different use cases from distinct science disciplines: climate science, astronomy, hydrology, and a generic cloud computing use case. This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data-intensive science.
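    To make the kind of tradeoff estimate described above concrete, the toy Python sketch below sums compute and transfer times for a linear chain of processing stages. All task sizes, processing rates, reduction factors, and the network bandwidth are invented for illustration; this is not the DAWN code itself.

    def elapsed_time(initial_gb, tasks, bandwidth_gbps):
        """Estimate compute and transfer time for a linear chain of tasks.

        tasks: list of dicts with a processing rate (GB/s) and the fraction
        of the input data volume that is passed on to the next stage.
        """
        compute_s, transfer_s, volume_gb = 0.0, 0.0, initial_gb
        for task in tasks:
            compute_s += volume_gb / task["rate_gb_per_s"]      # processing time
            volume_gb *= task["reduction"]                       # data kept for next stage
            transfer_s += volume_gb * 8.0 / bandwidth_gbps       # move reduced data onward
        return compute_s, transfer_s

    # Hypothetical three-stage pipeline: calibration, reduction, analysis.
    pipeline = [
        {"rate_gb_per_s": 2.0, "reduction": 0.10},
        {"rate_gb_per_s": 0.5, "reduction": 0.20},
        {"rate_gb_per_s": 0.1, "reduction": 1.00},
    ]
    compute_s, transfer_s = elapsed_time(5000.0, pipeline, bandwidth_gbps=10.0)
    print(f"compute: {compute_s / 3600:.1f} h, transfer: {transfer_s / 3600:.1f} h")

    Comparing two candidate architectures is then simply a matter of calling the estimator on each configuration and inspecting the resulting compute, transfer, and total times.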

  3. Data management and its role in delivering science at DOE BES user facilities - Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.; Jemian, Pete R.; Luitz, Steffen; Salnikov, Andrei A.; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Green, Mark L.

    2009-07-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research [1]. We trace back almost 30 years of history across selected user facilities illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility produced data. Trends indicate that this will continue to be the case for yet some time. Thus users face a quandary for how to manage today's data complexity and size as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing thereby providing users access to resources they need [2]. Portal based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next tier cross-instrument-cross facility scientific research fuelled by smart applications residing upon user computer resources. We can learn from the medical imaging community that has been working since the early 1990's to integrate data from across multiple modalities to achieve better diagnoses [3] - similarly, data fusion across BES facilities will lead to new scientific discoveries.

  4. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  5. Cellular Automata

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard

    1991-08-01

    Cellular automata, dynamic systems in which space and time are discrete, are yielding interesting applications in both the physical and natural sciences. The thirty-four contributions in this book cover many aspects of contemporary studies on cellular automata and include reviews, research reports, and guides to recent literature and available software. Chapters cover mathematical analysis; the structure of the space of cellular automata; learning rules with specified properties; cellular automata in biology, physics, chemistry, and computation theory; and generalizations of cellular automata in neural nets, Boolean nets, and coupled map lattices. Current work on cellular automata may be viewed as revolving around two central and closely related problems: the forward problem and the inverse problem. The forward problem concerns the description of properties of given cellular automata. Properties considered include reversibility, invariants, criticality, fractal dimension, and computational power. The role of cellular automata in computation theory is seen as a particularly exciting venue for exploring parallel computers as theoretical and practical tools in mathematical physics. The inverse problem, an area of study gaining prominence particularly in the natural sciences, involves designing rules that possess specified properties or perform specified tasks. A long-term goal is to develop a set of techniques that can find a rule or set of rules that can reproduce quantitative observations of a physical system. Studies of the inverse problem take up the organization and structure of the set of automata, in particular the parameterization of the space of cellular automata. Optimization and learning techniques, like the genetic algorithm and adaptive stochastic cellular automata, are applied to find cellular automaton rules that model such physical phenomena as crystal growth or perform such adaptive-learning tasks as balancing an inverted pole. Howard Gutowitz is Collaborateur in the Service de Physique du Solide et Résonance Magnétique, Commissariat à l'Energie Atomique, Saclay, France.
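    The forward problem described above is easy to demonstrate in a few lines of code. The Python sketch below iterates an elementary (two-state, radius-1) cellular automaton; rule 110 is used here as a standard example and is an assumption of this illustration, not an excerpt from the book.

    import numpy as np

    def step(cells, rule_number):
        """Apply one update of an elementary (two-state, radius-1) CA rule."""
        rule = [(rule_number >> k) & 1 for k in range(8)]     # 8-entry lookup table
        left, right = np.roll(cells, 1), np.roll(cells, -1)   # periodic boundary
        index = 4 * left + 2 * cells + right                  # neighborhood as a 3-bit number
        return np.array([rule[i] for i in index], dtype=int)

    cells = np.zeros(64, dtype=int)
    cells[32] = 1                         # single live cell in the middle
    for _ in range(20):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule_number=110)

    The inverse problem runs the other way: given a desired space-time pattern, search (for example with a genetic algorithm) over the 256 possible rule numbers, or over larger rule spaces, for one whose forward evolution reproduces it.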

  6. Computational Science News | Computational Science | NREL

    Science.gov Websites

    News items from the NREL Computational Science Center website include a feature on high-performance computing cooling technology at the ESIF and an announcement (February 28, 2018) that NREL has launched a revamped website for users of the lab's high-performance computing (HPC) systems.

  7. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems, and Problems in Applied and Computational Matrix Theory

    DTIC Science & Technology

    1988-07-08

    Only fragments of the report's publication list are recoverable, including An Introduction to Pascal and Precalculus, Computer Science Press, Rockville, Maryland, 1986.

  8. Empirical Determination of Competence Areas to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia

    2014-01-01

    The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…

  9. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  10. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  11. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still, and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  12. Research Experiences for 14 Year Olds: preliminary report on the `Sky Explorer' pilot program at Springfield (MA) High School of Science and Technology

    NASA Astrophysics Data System (ADS)

    Tucker, G. E.

    1997-05-01

    This NSF-supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14-year-old students is using a small (21 cm) computer-controlled telescope and CCD camera to: (1) conduct a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them, and accumulating a portfolio of images; (2) perform photometry of variable stars, reducing the data to get a light curve; and (3) learn modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest to astronomy and science in general and have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents, and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general and astronomy in particular in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.

  13. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351
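    As a minimal, self-contained example of the kind of model the article surveys, the Python sketch below implements a Schelling-style segregation model, a classic toy agent-based model. The grid size, tolerance threshold, and movement rule are illustrative assumptions and are not drawn from the article's case studies.

    import numpy as np

    rng = np.random.default_rng(1)
    size, threshold = 30, 0.4                 # grid side length, similarity threshold
    grid = rng.choice([0, 1, 2], size=(size, size), p=[0.2, 0.4, 0.4])  # 0 = empty cell

    def unhappy(grid, r, c):
        """An agent is unhappy if fewer than `threshold` of its occupied neighbors match it."""
        agent = grid[r, c]
        neigh = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].ravel()
        same = np.sum(neigh == agent) - 1     # exclude the agent itself
        occupied = np.sum(neigh > 0) - 1
        return occupied > 0 and same / occupied < threshold

    for sweep in range(20):
        movers = [(r, c) for r in range(size) for c in range(size)
                  if grid[r, c] > 0 and unhappy(grid, r, c)]
        empties = list(zip(*np.where(grid == 0)))
        for i in rng.permutation(len(movers)):    # move unhappy agents in random order
            r, c = movers[i]
            if not empties:
                break
            er, ec = empties.pop(rng.integers(len(empties)))
            grid[er, ec], grid[r, c] = grid[r, c], 0
            empties.append((r, c))
        print(f"sweep {sweep}: {len(movers)} unhappy agents")

    The count of unhappy agents typically drops over successive sweeps, illustrating how simple local rules generate population-level structure; calibrating such rules against behavioral and population data is exactly the step the article's guidelines address.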

  14. Soil moisture needs in earth sciences

    NASA Technical Reports Server (NTRS)

    Engman, Edwin T.

    1992-01-01

    The author reviews the development of passive and active microwave techniques for measuring soil moisture with respect to how the data may be used. New science programs such as the EOS, the GEWEX Continental-Scale International Project (GCIP), and STORM, a mesoscale meteorology and hydrology project, will have to account for soil moisture either as a storage term in water balance computations or as a state variable in process modeling. The author discusses future soil moisture needs such as frequency of measurement, accuracy, depth, and spatial resolution, as well as the concomitant model development that must proceed concurrently if the development in microwave technology is to have a major impact in these areas.

  15. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  16. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  17. Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender

    NASA Astrophysics Data System (ADS)

    Larsen, Elizabeth A.; Stubbs, Margaret L.

    Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.

  18. Multidisciplinary Aerospace Systems Optimization: Computational AeroSciences (CAS) Project

    NASA Technical Reports Server (NTRS)

    Kodiyalam, S.; Sobieski, Jaroslaw S. (Technical Monitor)

    2001-01-01

    The report describes a method for performing optimization of a system whose analysis is so expensive that it is impractical to let the optimization code invoke it directly, because excessive computational cost and elapsed time might result. In such a situation it is imperative to let the user control the number of times the analysis is invoked. The reported method achieves that by two techniques in the Design of Experiments category: a uniform dispersal of the trial design points over an n-dimensional hypersphere combined with response surface fitting, and the technique of kriging. Analyses of all the trial designs, whose number may be set by the user, are performed before activation of the optimization code and the results are stored as a database. The optimization code is then executed against this database. Two applications, one to an airborne laser system and one to an aircraft optimization, illustrate the application of the method.
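    The Python sketch below illustrates the general surrogate-based pattern described above: sample trial points on an n-dimensional hypersphere, run the expensive analysis once per point, fit a quadratic response surface by least squares, and let the optimizer query the cheap surrogate instead. The objective function, dimensions, and sample counts are invented for illustration; this is not the report's code, and kriging is omitted in favor of a simple polynomial fit.

    import numpy as np

    rng = np.random.default_rng(0)
    n_dim, n_points, radius = 4, 60, 1.0

    def expensive_analysis(x):            # stand-in for the costly system analysis
        return np.sum((x - 0.3) ** 2) + 0.01 * rng.normal()

    # Roughly uniform points on the hypersphere surface: normalize Gaussian samples.
    pts = rng.normal(size=(n_points, n_dim))
    pts = radius * pts / np.linalg.norm(pts, axis=1, keepdims=True)
    y = np.array([expensive_analysis(p) for p in pts])       # the precomputed database

    # Quadratic response surface: features 1, x_i, x_i*x_j (i <= j), fit by least squares.
    def features(x):
        quad = [x[i] * x[j] for i in range(n_dim) for j in range(i, n_dim)]
        return np.concatenate(([1.0], x, quad))

    A = np.array([features(p) for p in pts])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surrogate(x):                     # what the optimizer would actually call
        return features(x) @ coef

    best = min(pts, key=surrogate)
    print("best sampled point:", np.round(best, 3), "surrogate value:", round(float(surrogate(best)), 3))

    The expensive analysis is evaluated exactly n_points times, a number the user controls, while the optimizer is free to query the surrogate as often as it likes.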

  19. Portable color multimedia training systems based on monochrome laptop computers (CBT-in-a-briefcase), with spinoff implications for video uplink and downlink in spaceflight operations

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1994-01-01

    This report describes efforts to use digital motion video compression technology to develop a highly portable device that would convert 1990-91 era IBM-compatible and/or Macintosh notebook computers into full-color, motion-video capable multimedia training systems. An architecture was conceived that would permit direct conversion of existing laser-disk-based multimedia courses with little or no reauthoring. The project did not physically demonstrate certain critical video keying techniques, but their implementation should be feasible. This investigation of digital motion video has spawned two significant spaceflight projects at MSFC: one to downlink multiple high-quality video signals from Spacelab, and the other to uplink videoconference-quality video in real time and high-quality video off-line, plus investigate interactive, multimedia-based techniques for enhancing onboard science operations. Other airborne or spaceborne spinoffs are possible.

  20. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
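    As a small taste of the kind of model the book develops, the Python sketch below solves the linear hillslope diffusion equation dz/dt = D d2z/dx2 with an explicit finite-difference scheme. The diffusivity, grid, time step, and initial scarp profile are invented for illustration and are not taken from the book or its online codes.

    import numpy as np

    D = 0.01             # hillslope diffusivity, m^2/yr (assumed)
    dx, dt = 1.0, 10.0   # grid spacing (m) and time step (yr); note dt < dx**2 / (2 * D)
    nx, nsteps = 101, 5000

    x = np.arange(nx) * dx
    z = np.where(x < 50.0, 1.0, 0.0)      # initial 1 m scarp at x = 50 m

    for _ in range(nsteps):
        curvature = (z[:-2] - 2.0 * z[1:-1] + z[2:]) / dx**2
        z[1:-1] += D * dt * curvature     # interior update; end nodes held fixed

    print("max elevation after", nsteps * dt, "years:", round(float(z.max()), 3))

    The scarp gradually rounds off as material diffuses downslope; swapping in nonlinear transport laws or adding uplift terms follows the same update pattern.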

  21. More details...
  1. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  2. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  3. An Ada Object Oriented Missile Flight Simulation

    DTIC Science & Technology

    1991-09-01

    This thesis uses the Ada programming language in the design and development of an air-to-air missile flight simulation with object-oriented techniques and sound software engineering principles. The simulation is designed to be more understandable, modifiable, and efficient.

  4. Development of Improved Modeling and Analysis Techniques for Dynamics of Shell Structures

    DTIC Science & Technology

    1991-07-24

    Engineering Sciences and Center for Space Structures and Control, University of Colorado, Campus Box 429, Boulder, Colorado 80309. Only fragments of the abstract are recoverable: "...system architecture; third, to implement a decomposition/mapping procedure that matches as far as possible the layout of the processors to the ... element computations. In particular, we address issues that are related to the processor memory size, to the SIMD architecture and to the fast..."

  5. Structural ceramics

    NASA Technical Reports Server (NTRS)

    Craig, Douglas F.

    1992-01-01

    This presentation gives a brief history of the field of materials science and goes on to expound the advantages of the fastest growing area in that field, namely ceramics. Since ceramics are moving to fill the demand for lighter, stronger, more corrosion-resistant materials, advancements will rely more on processing and modeling from the atomic scale up, which is made possible by advanced analytical, computer, and processing techniques. All information is presented in viewgraph format.

  6. Multi-Frame Convolutional Neural Networks for Object Detection in Temporal Data

    DTIC Science & Technology

    2017-03-01

    Given the problem of detecting objects in video, existing neural-network solutions rely on a post-processing step to combine information across frames and strengthen conclusions. This technique has been successful for videos with simple, dominant objects, but it cannot detect objects...
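    One common way to let a network use temporal context directly, rather than post-processing per-frame detections, is to stack several consecutive frames along the channel axis of a convolutional network. The PyTorch sketch below illustrates that idea only; the layer sizes, frame count, and classifier head are assumptions for illustration and do not reproduce the thesis architecture.

    import torch
    import torch.nn as nn

    num_frames = 3                  # frames per sample (assumed)

    class MultiFrameDetector(nn.Module):
        def __init__(self, num_frames, num_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3 * num_frames, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(32, num_classes)    # object present / absent

        def forward(self, frames):
            # frames: (batch, num_frames, 3, H, W) -> merge frames into channels
            b, t, c, h, w = frames.shape
            x = frames.reshape(b, t * c, h, w)
            return self.classifier(self.features(x).flatten(1))

    model = MultiFrameDetector(num_frames)
    clip = torch.randn(4, num_frames, 3, 64, 64)            # a random batch of short clips
    print(model(clip).shape)                                 # torch.Size([4, 2])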

  7. Computer Science and the Liberal Arts

    ERIC Educational Resources Information Center

    Shannon, Christine

    2010-01-01

    Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…

  8. Marrying Content and Process in Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  9. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    ERIC Educational Resources Information Center

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  10. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  11. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  12. Exploring Space Physics Concepts Using Simulation Results

    NASA Astrophysics Data System (ADS)

    Gross, N. A.

    2008-05-01

    The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the National Science Foundation, has the goal of developing a suite of integrated physics-based computer models of the space environment that can follow the evolution of a space weather event from the Sun to the Earth. In addition to the research goals, CISM is also committed to training the next generation of space weather professionals who are imbued with a system view of space weather. This view should include an understanding of both heliospheric and geospace phenomena. To this end, CISM offers a yearly Space Weather Summer School targeted to first-year graduate students, although advanced undergraduates and space weather professionals have also attended. This summer school uses a number of innovative pedagogical techniques, including devoting each afternoon to a computer lab exercise that uses results from research-quality simulations and visualization techniques, along with ground-based and satellite data, to explore concepts introduced during the morning lectures. These labs are suitable for use in a wide variety of educational settings, from formal classroom instruction to outreach programs. The goal of this poster is to outline the goals and content of the lab materials so that instructors may evaluate their potential use in the classroom or other settings.

  13. Sculpting in cyberspace: Parallel processing the development of new software

    NASA Technical Reports Server (NTRS)

    Fisher, Rob

    1993-01-01

    Stimulating creativity in problem solving, particularly where software development is involved, is applicable to many disciplines. Metaphorical thinking keeps the problem in focus but in a different light, jarring people out of their mental ruts and sparking fresh insights. It forces the mind to stretch to find patterns between dissimilar concepts, in the hope of discovering unusual ideas in odd associations (Technology Review January 1993, p. 37). With a background in Engineering and Visual Design from MIT, I have for the past 30 years pursued a career as a sculptor of interdisciplinary monumental artworks that bridge the fields of science, engineering and art. Since 1979, I have pioneered the application of computer simulation to solve the complex problems associated with these projects. A recent project for the roof of the Carnegie Science Center in Pittsburgh made particular use of the metaphoric creativity technique described above. The problem-solving process led to the creation of hybrid software combining scientific, architectural and engineering visualization techniques. David Steich, a Doctoral Candidate in Electrical Engineering at Penn State, was commissioned to develop special software that enabled me to create innovative free-form sculpture. This paper explores the process of inventing the software through a detailed analysis of the interaction between an artist and a computer programmer.

  14. African-American males in computer science---Examining the pipeline for clogs

    NASA Astrophysics Data System (ADS)

    Stone, Daryl Bryant

    The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists/technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities, there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominantly African-American elementary, middle, and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle, and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between the five groups of students. ANOVA analysis by question and by total self-efficacy score provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest a 'clog' may exist at the middle school level, and students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending lots of time on social networking websites and emailing, but are 'dis-aware' of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.

  15. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever-larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
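    To convey the core numerical idea behind a finite-difference wave propagation code, the Python sketch below advances a 1D acoustic wave equation with a second-order explicit scheme. All parameters are invented; the actual TS-AWP code is three-dimensional, anelastic, and parallelized, so this is only a conceptual miniature.

    import numpy as np

    nx, nt = 400, 800
    dx, dt, c = 10.0, 0.001, 3000.0        # m, s, m/s; CFL number c*dt/dx = 0.3

    u_prev = np.zeros(nx)                  # displacement at the previous time step
    u = np.zeros(nx)
    u[nx // 2] = 1.0                       # initial displacement pulse at the center

    cfl2 = (c * dt / dx) ** 2
    for _ in range(nt):
        u_next = np.zeros(nx)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + cfl2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next              # fixed (reflecting) ends

    print("peak amplitude after", nt * dt, "s:", round(float(np.abs(u).max()), 4))

    Scaling this kernel to three dimensions, realistic velocity models, attenuation, and tens of thousands of processors is precisely the engineering effort the chapter describes.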

  16. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available more than 10 PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental state.

  17. Identifying the relationship between feedback provided in computer-assisted instructional modules, science self-efficacy, and academic achievement

    NASA Astrophysics Data System (ADS)

    Mazingo, Diann Etsuko

    Feedback has been identified as a key variable in developing academic self-efficacy. The types of feedback can vary from a traditional, objectivist approach that focuses on minimizing learner errors to a more constructivist approach that focuses on facilitating understanding. The influx of computer-based courses, whether online or through a series of computer-assisted instruction (CAI) modules, requires that current research on effective feedback techniques in the classroom be extended to computer environments in order to inform their instructional design. In this study, exposure to different types of feedback during a chemistry CAI module was studied in relation to science self-efficacy (SSE) and performance on an objective-driven assessment (ODA) of the chemistry concepts covered in the unit. The quantitative analysis consisted of two separate ANCOVAs on the dependent variables, using pretest as the covariate and group as the fixed factor. No significant differences were found between the three groups on adjusted posttest means for either the ODA or the SSE measure (F(2, 106) = 1.311, p = 0.274 and F(2, 106) = 1.080, p = 0.344, respectively). However, a mixed methods approach yielded valuable qualitative insights into why only one overall quantitative effect was observed. These findings are discussed in relation to the need to further refine the instruments and methods used in order to more fully explore the possibility that type of feedback might play a role in developing SSE and, consequently, improve academic performance in science. Future research building on this study may reveal significance that could impact instructional design practices for developing online and computer-based instruction.
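    The analysis design described above (posttest as the dependent variable, pretest as covariate, feedback group as the fixed factor) can be sketched in a few lines. The Python example below runs the equivalent ANCOVA on simulated data; the group labels, sample sizes, and effect structure are invented and are not the study's data.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    groups = np.repeat(["objectivist", "constructivist", "control"], 37)   # assumed labels
    pretest = rng.normal(50, 10, size=groups.size)
    posttest = 5 + 0.8 * pretest + rng.normal(0, 8, size=groups.size)      # no group effect built in

    data = pd.DataFrame({"group": groups, "pretest": pretest, "posttest": posttest})
    model = smf.ols("posttest ~ pretest + C(group)", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))    # ANCOVA table: group effect adjusted for pretest

    With no built-in group effect, the group term should be non-significant, mirroring the pattern of results reported in the study.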

  18. Constructing a patient-specific computer model of the upper airway in sleep apnea patients.

    PubMed

    Dhaliwal, Sandeep S; Hesabgar, Seyyed M; Haddad, Seyyed M H; Ladak, Hanif; Samani, Abbas; Rotenberg, Brian W

    2018-01-01

    The use of computer simulation to develop a high-fidelity model has been proposed as a novel and cost-effective alternative to help guide therapeutic intervention in sleep apnea surgery. We describe a computer model based on patient-specific anatomy of obstructive sleep apnea (OSA) subjects wherein the percentage and sites of upper airway collapse are compared to findings on drug-induced sleep endoscopy (DISE). Basic science computer model generation. Three-dimensional finite element techniques were undertaken for model development in a pilot study of four OSA patients. Magnetic resonance imaging was used to capture patient anatomy and software employed to outline critical anatomical structures. A finite-element mesh was applied to the volume enclosed by each structure. Linear and hyperelastic soft-tissue properties for various subsites (tonsils, uvula, soft palate, and tongue base) were derived using an inverse finite-element technique from surgical specimens. Each model underwent computer simulation to determine the degree of displacement on various structures within the upper airway, and these findings were compared to DISE exams performed on the four study patients. Computer simulation predictions for percentage of airway collapse and site of maximal collapse show agreement with observed results seen on endoscopic visualization. Modeling the upper airway in OSA patients is feasible and holds promise in aiding patient-specific surgical treatment. NA. Laryngoscope, 128:277-282, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
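    The inverse-modeling idea mentioned above can be shown in miniature: choose the material parameter that makes a forward model reproduce measured force-displacement data. In the Python sketch below, the forward model is a single nonlinear spring and the "measurements" are synthetic; the actual study fits linear and hyperelastic tissue models to surgical specimen data, so this is only a conceptual analogue.

    import numpy as np
    from scipy.optimize import least_squares

    def forward_model(stiffness, displacement):
        # Toy nonlinear force-displacement law standing in for an FE simulation.
        return stiffness * displacement + 0.5 * stiffness * displacement ** 3

    true_stiffness = 2.4                                    # value to be recovered
    displacement = np.linspace(0.0, 1.0, 15)
    noise = np.random.default_rng(0).normal(0, 0.05, displacement.size)
    measured = forward_model(true_stiffness, displacement) + noise

    residual = lambda k: forward_model(k[0], displacement) - measured
    fit = least_squares(residual, x0=[1.0])                 # inverse step: fit the parameter
    print("recovered stiffness:", round(float(fit.x[0]), 3))  # should be close to 2.4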

  19. Bringing computational science to the public.

    PubMed

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  20. Computer Science and Telecommunications Board summary of activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, M.S.

    1992-03-27

    The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.

  1. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations, as women and as Hispanics, played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field but also their decision to persist. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.

  2. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  3. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  4. NeuroPhysics: Studying how neurons create the perception of space-time using Physics' tools and techniques

    NASA Astrophysics Data System (ADS)

    Dhingra, Shonali; Sandler, Roman; Rios, Rodrigo; Vuong, Cliff; Mehta, Mayank

    All animals naturally perceive the abstract concept of space-time. A brain region called the Hippocampus is known to be important in creating these perceptions, but the underlying mechanisms are unknown. In our lab we employ several experimental and computational techniques from Physics to tackle this fundamental puzzle. Experimentally, we use ideas from Nanoscience and Materials Science to develop techniques to measure the activity of hippocampal neurons in freely-behaving animals. Computationally, we develop models to study neuronal activity patterns, which are point processes that are highly stochastic and multidimensional. We then apply these techniques to collect and analyze neuronal signals from rodents while they are exploring space in the Real World or Virtual Reality with various stimuli. Our findings show that under these conditions neuronal activity depends on various parameters, such as sensory cues, including visual and auditory, and behavioral cues, including linear and angular position and velocity. Further, neuronal networks create internally-generated rhythms, which influence perception of space and time. In totality, these results further our understanding of how the brain develops a cognitive map of our surrounding space and keeps track of time.
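
    The record above describes modelling neuronal activity as stochastic point processes. The sketch below is a generic illustration of that idea, not the authors' model: a "place cell" is simulated as an inhomogeneous Poisson process whose firing rate depends on the animal's position on a linear track; the peak rate, field centre and width are assumed values.

```python
import numpy as np

# Generic illustration (not the authors' model): a hippocampal "place cell"
# treated as an inhomogeneous Poisson point process whose firing rate peaks
# when the simulated animal is near a preferred spot on a 1 m linear track.
rng = np.random.default_rng(0)

dt = 0.001                                         # time step (s)
t = np.arange(0.0, 60.0, dt)                       # 60 s of exploration
position = 0.5 + 0.5 * np.sin(2 * np.pi * t / 20)  # back-and-forth trajectory (m)

centre, width, peak_rate = 0.7, 0.1, 20.0          # assumed place-field parameters
rate = peak_rate * np.exp(-(position - centre) ** 2 / (2 * width ** 2))  # Hz

# Bernoulli approximation of the Poisson process: P(spike in dt) = rate * dt
spikes = rng.random(t.size) < rate * dt
spike_times = t[spikes]
print(f"{spike_times.size} spikes in 60 s (mean rate {spike_times.size / 60:.1f} Hz)")
```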

  5. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  6. 3D light scanning macrography.

    PubMed

    Huber, D; Keller, M; Robert, D

    2001-08-01

    The technique of 3D light scanning macrography permits the non-invasive surface scanning of small specimens at magnifications up to 200x. Obviating both the problem of limited depth of field inherent to conventional close-up macrophotography and the metallic coating required by scanning electron microscopy, 3D light scanning macrography provides three-dimensional digital images of intact specimens without the loss of colour, texture and transparency information. This newly developed technique offers a versatile, portable and cost-efficient method for the non-invasive digital and photographic documentation of small objects. Computer controlled device operation and digital image acquisition facilitate fast and accurate quantitative morphometric investigations, and the technique offers a broad field of research and educational applications in biological, medical and materials sciences.

  7. Advances in natural language processing.

    PubMed

    Hirschberg, Julia; Manning, Christopher D

    2015-07-17

    Natural language processing employs computational techniques for the purpose of learning, understanding, and producing human language content. Early computational approaches to language research focused on automating the analysis of the linguistic structure of language and developing basic technologies such as machine translation, speech recognition, and speech synthesis. Today's researchers refine and make use of such tools in real-world applications, creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services. We describe successes and challenges in this rapidly advancing area. Copyright © 2015, American Association for the Advancement of Science.

  8. Distributed databases for materials study of thermo-kinetic properties

    NASA Astrophysics Data System (ADS)

    Toher, Cormac

    2015-03-01

    High-throughput computational materials science provides researchers with the opportunity to rapidly generate large databases of materials properties. To rapidly add thermal properties to the AFLOWLIB consortium and Materials Project repositories, we have implemented an automated quasi-harmonic Debye model, the Automatic GIBBS Library (AGL). This enables us to screen thousands of materials for thermal conductivity, bulk modulus, thermal expansion and related properties. The search and sort functions of the online database can then be used to identify suitable materials for more in-depth study using more precise computational or experimental techniques. AFLOW-AGL source code is public domain and will soon be released under the GNU-GPL license.
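
    The quasi-harmonic Debye approach mentioned above can be illustrated with a deliberately simplified sketch: a toy static energy curve E(V), a Debye temperature scaled by a constant Grüneisen parameter, and the standard Debye vibrational free energy minimised over volume at each temperature. All numerical parameters are assumed, and this is not the AGL implementation.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

kB = 8.617333e-5  # Boltzmann constant (eV/K)

def debye_function(x):
    """D(x) = 3/x^3 * integral_0^x t^3 / (exp(t) - 1) dt."""
    if x < 1e-8:
        return 1.0
    integral, _ = quad(lambda t: t**3 / np.expm1(t), 0.0, x)
    return 3.0 * integral / x**3

def f_vib(theta, T, n_atoms):
    """Debye vibrational free energy (eV) for n_atoms atoms."""
    if T < 1e-8:
        return n_atoms * kB * 9.0 * theta / 8.0  # zero-point term only
    x = theta / T
    return n_atoms * kB * T * (9.0 * x / 8.0
                               + 3.0 * np.log1p(-np.exp(-x))
                               - debye_function(x))

# Toy static energy curve and Grueneisen scaling of the Debye temperature
# (all numbers assumed for illustration).
V0, E0, curv = 20.0, -10.0, 1.0            # A^3, eV, eV/A^6
theta0, gruneisen, n_atoms = 400.0, 2.0, 2

def free_energy(V, T):
    E_static = E0 + 0.5 * curv * (V - V0) ** 2
    theta = theta0 * (V0 / V) ** gruneisen
    return E_static + f_vib(theta, T, n_atoms)

for T in (0.0, 300.0, 900.0):
    res = minimize_scalar(lambda V: free_energy(V, T),
                          bounds=(0.8 * V0, 1.3 * V0), method="bounded")
    print(f"T = {T:5.0f} K  ->  equilibrium volume {res.x:.3f} A^3")
```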

  9. Opportunities for Computational Discovery in Basic Energy Sciences

    NASA Astrophysics Data System (ADS)

    Pederson, Mark

    2011-03-01

    An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Science (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.

  10. Research | Computational Science | NREL

    Science.gov Websites

    NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. NREL's computational science capabilities enable high-impact research. Some recent examples…

  11. A Combined Experimental and Computational Study on the Stability of Nanofluids Containing Metal Organic Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, Harsha Vardhan Reddy; Nune, Satish K.; Motkuri, Radha K.

    2015-01-08

    Computational studies on nanofluids composed of metal organic frameworks (MOFs) were performed using molecular modeling techniques. Grand Canonical Monte Carlo (GCMC) simulations were used to study adsorption behavior of 1,1,1,3,3-pentafluoropropane (R-245fa) in a MIL-101 MOF at various temperatures. To understand the stability of the nanofluid composed of MIL-101 particles, we performed molecular dynamics simulations to compute potentials of mean force between hypothetical MIL-101 fragments terminated with two different kinds of modulators in R-245fa and water. Our computed potentials of mean force results indicate that the MOF particles tend to disperse better in water than in R-245fa. The reasons for this observation were analyzed and discussed. Our results agree with experimental results, indicating that the employed potential models and modeling approaches provide a good and reliable description of molecular interactions. Work performed by LXD was supported by the U.S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. Work performed by HVRA, SKN, RKM, and PBM was supported by the Office of Energy Efficiency and Renewable Energy, Geothermal Technologies Program. Pacific Northwest National Laboratory is a multiprogram national laboratory operated for DOE by Battelle.
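
    For readers unfamiliar with Grand Canonical Monte Carlo, the sketch below shows the standard insertion/deletion acceptance rules applied to the simplest possible system (an ideal gas, zero interaction energy), where the result can be checked against ⟨N⟩ = zV. It is purely illustrative and unrelated to the MOF simulations of the study; the box volume and activity are assumed values.

```python
import numpy as np

# Standard grand-canonical insertion/deletion acceptance rules, applied to an
# ideal gas (zero interaction energy) so the answer can be checked against
# <N> = z * V, with activity z = exp(beta*mu) / Lambda^3. Illustrative only.
rng = np.random.default_rng(1)

V = 1000.0      # box volume (arbitrary units)
z = 0.05        # assumed activity, exp(beta*mu)/Lambda^3
N = 0
samples = []

for step in range(200_000):
    if rng.random() < 0.5:
        # insertion: acc = min(1, z*V/(N+1) * exp(-beta*dU)), with dU = 0 here
        if rng.random() < min(1.0, z * V / (N + 1)):
            N += 1
    else:
        # deletion: acc = min(1, N/(z*V) * exp(-beta*dU)), with dU = 0 here
        if N > 0 and rng.random() < min(1.0, N / (z * V)):
            N -= 1
    if step > 50_000:
        samples.append(N)

print(f"<N> from GCMC: {np.mean(samples):.1f}   expected z*V = {z * V:.1f}")
```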

  12. [Impact of digital technology on clinical practices: perspectives from surgery].

    PubMed

    Zhang, Y; Liu, X J

    2016-04-09

    Digital medical technologies or computer aided medical procedures, refer to imaging, 3D reconstruction, virtual design, 3D printing, navigation guided surgery and robotic assisted surgery techniques. These techniques are integrated into conventional surgical procedures to create new clinical protocols that are known as "digital surgical techniques". Conventional health care is characterized by subjective experiences, while digital medical technologies bring quantifiable information, transferable data, repeatable methods and predictable outcomes into clinical practices. Being integrated into clinical practice, digital techniques facilitate surgical care by improving outcomes and reducing risks. Digital techniques are becoming increasingly popular in trauma surgery, orthopedics, neurosurgery, plastic and reconstructive surgery, imaging and anatomic sciences. Robotic assisted surgery is also evolving and being applied in general surgery, cardiovascular surgery and orthopedic surgery. Rapid development of digital medical technologies is changing healthcare and clinical practices. It is therefore important for all clinicians to purposefully adapt to these technologies and improve their clinical outcomes.

  13. The soft computing-based approach to investigate allergic diseases: a systematic review.

    PubMed

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic to investigate their performances in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within the PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period starting from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performances. We observed promising results with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.
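
    As a generic illustration of the soft-computing methods surveyed (and not a re-analysis of any reviewed study), the sketch below trains a support vector machine on synthetic data standing in for marker profiles and reports cross-validated accuracy; scikit-learn is assumed to be available.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for "marker profile -> diagnosis" data; none of the
# reviewed studies or their datasets are reproduced here.
X, y = make_classification(n_samples=300, n_features=12, n_informative=5,
                           random_state=0)

# Radial-basis-function SVM with 5-fold cross-validated accuracy
scores = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```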

  14. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papka, M.; Messina, P.; Coffey, R.

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.

  15. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  16. A synthetic design environment for ship design

    NASA Technical Reports Server (NTRS)

    Chipman, Richard R.

    1995-01-01

    Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the process of design. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state-of-the-art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people with an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.

  17. Girls Save the World through Computer Science

    ERIC Educational Resources Information Center

    Murakami, Christine

    2011-01-01

    It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

  18. The Assessment of Taiwanese College Students' Conceptions of and Approaches to Learning Computer Science and Their Relationships

    ERIC Educational Resources Information Center

    Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2015-01-01

    The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…

  19. Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study

    ERIC Educational Resources Information Center

    Herling, Lourdes

    2011-01-01

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…

  20. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  1. Architecture and settings optimization procedure of a TES frequency domain multiplexed readout firmware

    NASA Astrophysics Data System (ADS)

    Clenet, A.; Ravera, L.; Bertrand, B.; den Hartog, R.; Jackson, B.; van Leeuwen, B.-J.; van Loon, D.; Parot, Y.; Pointecouteau, E.; Sournac, A.

    2014-11-01

    IRAP is developing the readout electronics of SPICA-SAFARI's TES bolometer arrays. Based on the frequency domain multiplexing technique, the readout electronics provides the AC signals to voltage-bias the detectors; it demodulates the data; and it computes a feedback to linearize the detection chain. The feedback is computed with a specific technique, the so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e. several μs) and with fast signals (i.e. frequency carriers of the order of 5 MHz). To optimize the power consumption we took advantage of the reduced science signal bandwidth to decouple the signal sampling frequency and the data processing rate. This technique allowed a reduction of the power consumption of the circuit by a factor of 10. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of the optimal parameters. Indeed, to operate a TES array one has to properly define about 21000 parameters. We defined a set of procedures to automatically characterize these parameters and find out the optimal settings.
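
    The decoupling of sampling rate and processing rate mentioned above can be illustrated with a much-simplified software sketch (not the IRAP firmware; the carrier frequency, sampling rate and science signal are assumed): a single amplitude-modulated carrier is demodulated by mixing with I/Q references and block-averaging, so that downstream processing runs at the decimated rate.

```python
import numpy as np

# Much-simplified software model of one readout channel (not the IRAP
# firmware): demodulate an amplitude-modulated carrier with I/Q references,
# then block-average ("decimate") so that downstream processing runs at
# fs/decim instead of fs. Carrier and sampling frequencies are assumed.
fs = 20e6        # raw sampling rate (Hz)
fc = 5e6         # carrier frequency (Hz)
decim = 64       # processing rate = fs / decim
t = np.arange(0, 2e-3, 1 / fs)

science = 1.0 + 0.2 * np.sin(2 * np.pi * 200.0 * t)   # slow "science" signal
adc = science * np.cos(2 * np.pi * fc * t)            # modulated carrier at the ADC

i_mix = adc * np.cos(2 * np.pi * fc * t)              # mix with I reference
q_mix = adc * np.sin(2 * np.pi * fc * t)              # mix with Q reference
n = (len(t) // decim) * decim
i_bb = 2.0 * i_mix[:n].reshape(-1, decim).mean(axis=1)  # crude low-pass + decimation
q_bb = 2.0 * q_mix[:n].reshape(-1, decim).mean(axis=1)
amplitude = np.hypot(i_bb, q_bb)                      # recovered science signal

print(f"processing rate: {fs / decim / 1e3:.0f} kS/s instead of {fs / 1e6:.0f} MS/s")
print(f"recovered amplitude range: {amplitude.min():.2f} .. {amplitude.max():.2f}")
```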

  2. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  3. An Investigation of Primary School Science Teachers' Use of Computer Applications

    ERIC Educational Resources Information Center

    Ocak, Mehmet Akif; Akdemir, Omur

    2008-01-01

    This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…

  4. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  5. Nondestructive Evaluation for Aerospace Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara; Cramer, Elliott; Perey, Daniel

    2015-01-01

    Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure safety, reliability and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.

  6. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of space information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have evolved from statistical analysis of the urban-scale thermal environment based on historical weather-station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques were also summarized, aiming to provide fundamental knowledge for understanding urban thermal environment assessment and optimization.

  7. Kenny Gruchalla | NREL

    Science.gov Websites

    Research interests include feature extraction, human-computer interaction, and physics-based modeling. Education: Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology.

  8. NDE scanning and imaging of aircraft structure

    NASA Astrophysics Data System (ADS)

    Bailey, Donald; Kepler, Carl; Le, Cuong

    1995-07-01

    The Science and Engineering Lab at McClellan Air Force Base, Sacramento, Calif., has been involved in the development and use of computer-based scanning systems for NDE (nondestructive evaluation) since 1985. This paper describes the history leading up to our current applications, which employ eddy current and ultrasonic scanning of aircraft structures that contain both metallics and advanced composites. The scanning is performed using industrialized computers interfaced to proprietary acquisition equipment and software. Examples are shown that image several types of damage such as exfoliation and fuselage lap joint corrosion in aluminum, impact damage, embedded foreign material, and porosity in Kevlar and graphite epoxy composites. Image analysis techniques are reported that are performed using consumer-oriented computer hardware and software that are not NDE-specific and not expensive.

  9. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware which exploits task-based parallelism. Two bioinformatics benchmark applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  10. Computer-aided design of polymers and composites

    NASA Technical Reports Server (NTRS)

    Kaelble, D. H.

    1985-01-01

    This book on computer-aided design of polymers and composites introduces and discusses the subject from the viewpoint of atomic and molecular models. Thus, the origins of stiffness, strength, extensibility, and fracture toughness in composite materials can be analyzed directly in terms of chemical composition and molecular structure. Aspects of polymer composite reliability are considered along with characterization techniques for composite reliability, relations between atomic and molecular properties, computer aided design and manufacture, polymer CAD/CAM models, and composite CAD/CAM models. Attention is given to multiphase structural adhesives, fibrous composite reliability, metal joint reliability, polymer physical states and transitions, chemical quality assurance, processability testing, cure monitoring and management, nondestructive evaluation (NDE), surface NDE, elementary properties, ionic-covalent bonding, molecular analysis, acid-base interactions, the manufacturing science, and peel mechanics.

  11. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  12. Meteor Observations as Big Data Citizen Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.

    2016-12-01

    Meteor science represents an excellent example of the citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.

  13. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  14. Wind energy prospecting: socio-economic value of a new wind resource assessment technique based on a NASA Earth science dataset

    NASA Astrophysics Data System (ADS)

    Vanvyve, E.; Magontier, P.; Vandenberghe, F. C.; Delle Monache, L.; Dickinson, K.

    2012-12-01

    Wind energy is amongst the fastest growing sources of renewable energy in the U.S. and could supply up to 20% of the U.S. power production by 2030. An accurate and reliable wind resource assessment for prospective wind farm sites is a challenging task, yet is crucial for evaluating the long-term profitability and feasibility of a potential development. We have developed an accurate and computationally efficient wind resource assessment technique for prospective wind farm sites, which incorporates innovative statistical techniques and the new NASA Earth science dataset MERRA. This technique produces a wind resource estimate that is more accurate than that obtained by the wind energy industry's standard technique, while providing a reliable quantification of its uncertainty. The focus now is on evaluating the socio-economic value of this new technique relative to the industry's standard technique. Would it yield lower financing costs? Could it result in lower electricity prices? Are there further down-the-line positive consequences, e.g. job creation, time saved, greenhouse gas decrease? Ultimately, we expect our results will inform efforts to refine and disseminate the new technique to support the development of the U.S. renewable energy infrastructure. In order to address the above questions, we are carrying out a cost-benefit analysis based on the net present worth of the technique. We will describe this approach, including the cash-flow process of wind farm financing, how the wind resource assessment factors in, and will present current results for various hypothetical candidate wind farm sites.
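
    A net-present-worth comparison of the kind described above can be sketched in a few lines; the investment, revenue and discount-rate figures below are invented for illustration and are not results from the study.

```python
# Illustrative net-present-worth comparison; all monetary figures, rates and
# lifetimes below are invented and do not come from the study.
def net_present_value(cash_flows, discount_rate):
    """NPV of yearly cash flows; year 0 holds the initial investment."""
    return sum(cf / (1.0 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

capex = -150e6                  # hypothetical wind-farm investment ($)
lifetime = 20                   # years of operation
discount_rate = 0.08

scenarios = {
    "standard assessment": 18.0e6,   # assumed annual revenue ($/yr)
    "improved assessment": 18.5e6,   # assumed revenue with reduced uncertainty
}

for label, revenue in scenarios.items():
    flows = [capex] + [revenue] * lifetime
    print(f"{label}: NPV = {net_present_value(flows, discount_rate) / 1e6:.1f} M$")
```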

  15. A survey of visual preprocessing and shape representation techniques

    NASA Technical Reports Server (NTRS)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  16. A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models

    PubMed Central

    Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung

    2015-01-01

    Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, Cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
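
    The sketch below shows a plain (unmodified) particle swarm optimiser on a simple test function; it illustrates the velocity and position updates that ProjPSO builds on, but it is not the projection-based variant proposed in the paper, and the swarm parameters are assumed.

```python
import numpy as np

# Vanilla particle swarm optimisation on a simple test function (sphere);
# this is the generic update rule, not the ProjPSO variant of the paper.
rng = np.random.default_rng(2)

def objective(x):
    return np.sum(x ** 2, axis=-1)

n_particles, dim, iterations = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration coefficients

x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # positions
v = np.zeros_like(x)                             # velocities
pbest, pbest_val = x.copy(), objective(x)        # personal bests
gbest = pbest[np.argmin(pbest_val)].copy()       # global best

for _ in range(iterations):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = objective(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best objective value found:", objective(gbest))
```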

  17. Study on the tumor-induced angiogenesis using mathematical models.

    PubMed

    Suzuki, Takashi; Minerva, Dhisa; Nishiyama, Koichi; Koshikawa, Naohiko; Chaplain, Mark Andrew Joseph

    2018-01-01

    We studied angiogenesis using mathematical models describing the dynamics of tip cells. We reviewed the basic ideas of angiogenesis models and its numerical simulation technique to produce realistic computer graphics images of sprouting angiogenesis. We examined the classical model of Anderson-Chaplain using fundamental concepts of mass transport and chemical reaction with ECM degradation included. We then constructed two types of numerical schemes, model-faithful and model-driven ones, where new techniques of numerical simulation are introduced, such as transient probability, particle velocity, and Boolean variables. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
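
    In the spirit of the tip-cell dynamics discussed above, the toy sketch below moves a few tip cells by a random walk biased up a fixed chemoattractant gradient; it is far simpler than the Anderson-Chaplain model (no ECM degradation, branching or anastomosis), and all parameters are assumed.

```python
import numpy as np

# Toy tip-cell model: a biased random walk up a fixed chemoattractant (TAF)
# gradient on a 1-D grid, with the tumour at the right-hand boundary. Far
# simpler than the Anderson-Chaplain model; all parameters are assumed.
rng = np.random.default_rng(3)

nx = 50
taf = np.linspace(0.0, 1.0, nx)      # chemoattractant concentration along x
chi = 4.0                            # chemotactic sensitivity

def move_probabilities(i):
    """Probabilities of stepping left / staying / stepping right from cell i."""
    weights = np.array([np.exp(chi * taf[max(i - 1, 0)]),
                        np.exp(chi * taf[i]),
                        np.exp(chi * taf[min(i + 1, nx - 1)])])
    return weights / weights.sum()

tips = [0, 0, 0]                     # three sprouts start at the parent vessel
for _ in range(200):
    tips = [int(np.clip(i + rng.choice([-1, 0, 1], p=move_probabilities(i)), 0, nx - 1))
            for i in tips]

print("final tip positions (0 = parent vessel, 49 = tumour):", tips)
```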

  18. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    NASA Astrophysics Data System (ADS)

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.

    2018-01-01

    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  19. Interests diffusion on a semantic multiplex. Comparing Computer Science and American Physical Society communities

    NASA Astrophysics Data System (ADS)

    D'Agostino, Gregorio; De Nicola, Antonio

    2016-10-01

    Exploiting the information about members of a Social Network (SN) represents one of the most attractive and dwelling subjects for both academic and applied scientists. The community of Complexity Science, and especially those researchers working on multiplex social systems, is devoting increasing efforts to outlining general laws, models, and theories, for the purpose of predicting emergent phenomena in SNs (e.g. the success of a product). On the other side, the semantic web community aims at engineering a new generation of advanced services tailored to specific people's needs. This implies defining constructs, models and methods for handling the semantic layer of SNs. We combined models and techniques from both of the former fields to provide a hybrid approach to understanding a basic (yet complex) phenomenon: the propagation of individual interests along social networks. Since information may move along different social networks, one should take into account a multiplex structure. Therefore we introduced the notion of "Semantic Multiplex". In this paper we analyse two different semantic social networks, represented by authors publishing in Computer Science and those publishing in the American Physical Society journals. The comparison allows us to outline common and specific features.

  20. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  1. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
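
    A minimal, purely illustrative agent-based sketch of the user level described above (not the Argonne model) is given below: user agents adapt their job submission rate to the queue waits they experience on a shared cluster, producing the kind of feedback between user behaviour and system state that motivates multi-scale modelling. All capacities, rates and thresholds are assumed.

```python
import random
from collections import deque

# Illustrative user-level sketch (not the Argonne model): user agents adapt
# how many jobs they submit based on the queue wait of their last completed
# job, while a shared cluster drains a FIFO queue of node-hour requests.
random.seed(4)

CAPACITY = 64        # node-hours the cluster can deliver per simulated hour
HOURS = 500

class User:
    def __init__(self):
        self.rate = 2            # jobs submitted per hour
        self.last_wait = 0.0     # wait (hours) of the last completed job

    def adapt(self):
        if self.last_wait > 4:
            self.rate = max(1, self.rate - 1)   # back off when waits are long
        elif self.last_wait < 1:
            self.rate += 1                      # ramp up when the queue is short

users = [User() for _ in range(10)]
queue = deque()                  # entries: (user, node_hours, submit_hour)
waits = []

for hour in range(HOURS):
    for u in users:
        u.adapt()
        for _ in range(u.rate):
            queue.append((u, random.randint(1, 8), hour))
    budget = CAPACITY
    while queue and queue[0][1] <= budget:      # FIFO service within this hour
        u, cost, submitted = queue.popleft()
        budget -= cost
        u.last_wait = hour - submitted
        waits.append(hour - submitted)

print(f"mean queue wait: {sum(waits) / len(waits):.1f} h; "
      f"mean submission rate: {sum(u.rate for u in users) / len(users):.1f} jobs/h")
```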

  2. A Study of Computer Center Management

    DTIC Science & Technology

    1988-06-01

    …the United States and the rest of the western world and do not take into consideration the various economic and cultural factors in developing countries. This thesis seeks to present a number of new techniques in…

  3. A Mathematical Theory of System Information Flow

    DTIC Science & Technology

    2016-06-27

    AFRL-AFOSR-VA-TR-2016-0232, Michael Mislove, Administrators of the Tulane Educational Fund. Final report, covering 27 Mar 2013 - 31 Mar 2016 (report dated 17-06-2016). …systems using techniques from information theory, domain theory and other areas of mathematics and computer science. Over time, the focus shifted…

  4. The Effect of Bilingual Term List Size on Dictionary-Based Cross-Language Information Retrieval

    DTIC Science & Technology

    2006-01-01

    The Effect of Bilingual Term List Size on Dictionary-Based Cross-Language Information Retrieval. Dina Demner-Fushman, Department of Computer Science… dictionary-based Cross-Language Information Retrieval (CLIR), in which the goal is to find documents written in one natural language based on queries that…in which the documents are written. In dictionary-based CLIR techniques, the principal source of translation knowledge is a translation lexicon…
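
    The core idea of dictionary-based CLIR can be sketched in a few lines: translate each query term with a bilingual term list and score documents by overlap with the translations. The lexicon, query and documents below are made up for illustration and have no connection to the experiments in the report.

```python
# Toy dictionary-based CLIR: translate query terms with a bilingual term list,
# then rank documents by overlap with the translations. The lexicon, query and
# documents are invented for illustration.
lexicon = {                       # hypothetical Spanish -> English term list
    "recuperación": ["retrieval", "recovery"],
    "información": ["information"],
    "idioma": ["language"],
}

documents = {
    "d1": "cross language information retrieval with bilingual dictionaries",
    "d2": "speech recognition and synthesis techniques",
}

def clir_score(query_terms, doc_text):
    doc_tokens = set(doc_text.split())
    translations = [t for term in query_terms for t in lexicon.get(term, [])]
    return sum(t in doc_tokens for t in translations)

query = ["recuperación", "información"]
ranking = sorted(documents, key=lambda d: clir_score(query, documents[d]), reverse=True)
print("ranking:", ranking)        # d1 should outrank d2
```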

  5. An inside look at NASA planetology

    NASA Technical Reports Server (NTRS)

    Dwornik, S. E.

    1976-01-01

    Staffing, financing and budget controls, and research grant allocations of NASA are reviewed with emphasis on NASA-supported research in planetary geological sciences: studies of the composition, structure, and history of solar system planets. Programs, techniques, and research grants for studies of Mars photographs acquired through Mariner 6-10 flights are discussed at length, and particularly the handling of computer-enhanced photographic data. Scheduled future NASA-sponsored planet exploration missions (to Mars, Jupiter, Saturn, Uranus) are mentioned.

  6. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: (1) Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. (2) Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. (3) Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. (4) Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. (5) Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. (6) Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  7. Geoscience Applications of Synchrotron X-ray Computed Microtomography

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.

    2009-05-01

    Computed microtomography is the extension to micron spatial resolution of the CAT scanning technique developed for medical imaging. Synchrotron sources are ideal for the method, since they provide a monochromatic, parallel beam with high intensity. High energy storage rings such as the Advanced Photon Source at Argonne National Laboratory produce x-rays with high energy, high brilliance, and high coherence. All of these factors combine to produce an extremely powerful imaging tool for earth science research. Techniques that have been developed include: - Absorption and phase contrast computed tomography with spatial resolution approaching one micron - Differential contrast computed tomography, imaging above and below the absorption edge of a particular element - High-pressure tomography, imaging inside a pressure cell at pressures above 10GPa - High speed radiography, with 100 microsecond temporal resolution - Fluorescence tomography, imaging the 3-D distribution of elements present at ppm concentrations. - Radiographic strain measurements during deformation at high confining pressure, combined with precise x- ray diffraction measurements to determine stress. These techniques have been applied to important problems in earth and environmental sciences, including: - The 3-D distribution of aqueous and organic liquids in porous media, with applications in contaminated groundwater and petroleum recovery. - The kinetics of bubble formation in magma chambers, which control explosive volcanism. - Accurate crystal size distributions in volcanic systems, important for understanding the evolution of magma chambers. - The equation-of-state of amorphous materials at high pressure using both direct measurements of volume as a function of pressure and also by measuring the change x-ray absorption coefficient as a function of pressure. - The formation of frost flowers on Arctic sea-ice, which is important in controlling the atmospheric chemistry of mercury. - The distribution of cracks in rocks at potential nuclear waste repositories. - The location and chemical speciation of toxic elements such as arsenic and nickel in soils and in plant tissues in contaminated Superfund sites. - The strength of earth materials under the pressure and temperature conditions of the Earth's mantle, providing insights into plate tectonics and the generation of earthquakes.
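
    The basic reconstruction step behind absorption computed microtomography can be illustrated with a short sketch using scikit-image (assumed to be available): simulate parallel-beam projections of a test phantom and reconstruct them by filtered back-projection. This is a generic illustration, not beamline software.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Simulate parallel-beam absorption projections of a test phantom and
# reconstruct them by filtered back-projection (iradon's default ramp filter).
image = rescale(shepp_logan_phantom(), 0.25)            # ~100x100 test object
angles = np.linspace(0.0, 180.0, 180, endpoint=False)   # projection angles (deg)

sinogram = radon(image, theta=angles)
reconstruction = iradon(sinogram, theta=angles)

rms_error = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"RMS reconstruction error: {rms_error:.4f}")
```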

  8. Visualizing functional motions of membrane transporters with molecular dynamics simulations.

    PubMed

    Shaikh, Saher A; Li, Jing; Enkavi, Giray; Wen, Po-Chao; Huang, Zhijian; Tajkhorshid, Emad

    2013-01-29

    Computational modeling and molecular simulation techniques have become an integral part of modern molecular research. Various areas of molecular sciences continue to benefit from, indeed rely on, the unparalleled spatial and temporal resolutions offered by these technologies, to provide a more complete picture of the molecular problems at hand. Because of the continuous development of more efficient algorithms harvesting ever-expanding computational resources, and the emergence of more advanced and novel theories and methodologies, the scope of computational studies has expanded significantly over the past decade, now including much larger molecular systems and far more complex molecular phenomena. Among the various computer modeling techniques, the application of molecular dynamics (MD) simulation and related techniques has particularly drawn attention in biomolecular research, because of the ability of the method to describe the dynamical nature of the molecular systems and thereby to provide a more realistic representation, which is often needed for understanding fundamental molecular properties. The method has proven to be remarkably successful in capturing molecular events and structural transitions highly relevant to the function and/or physicochemical properties of biomolecular systems. Herein, after a brief introduction to the method of MD, we use a number of membrane transport proteins studied in our laboratory as examples to showcase the scope and applicability of the method and its power in characterizing molecular motions of various magnitudes and time scales that are involved in the function of this important class of membrane proteins.

  9. Visualizing Functional Motions of Membrane Transporters with Molecular Dynamics Simulations

    PubMed Central

    2013-01-01

    Computational modeling and molecular simulation techniques have become an integral part of modern molecular research. Various areas of molecular sciences continue to benefit from, indeed rely on, the unparalleled spatial and temporal resolutions offered by these technologies, to provide a more complete picture of the molecular problems at hand. Because of the continuous development of more efficient algorithms harvesting ever-expanding computational resources, and the emergence of more advanced and novel theories and methodologies, the scope of computational studies has expanded significantly over the past decade, now including much larger molecular systems and far more complex molecular phenomena. Among the various computer modeling techniques, the application of molecular dynamics (MD) simulation and related techniques has particularly drawn attention in biomolecular research, because of the ability of the method to describe the dynamical nature of the molecular systems and thereby to provide a more realistic representation, which is often needed for understanding fundamental molecular properties. The method has proven to be remarkably successful in capturing molecular events and structural transitions highly relevant to the function and/or physicochemical properties of biomolecular systems. Herein, after a brief introduction to the method of MD, we use a number of membrane transport proteins studied in our laboratory as examples to showcase the scope and applicability of the method and its power in characterizing molecular motions of various magnitudes and time scales that are involved in the function of this important class of membrane proteins. PMID:23298176
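
    As a rough illustration of the MD propagation step described in the two records above, the sketch below integrates two Lennard-Jones particles with the velocity-Verlet scheme in reduced units. All parameters are arbitrary toy values; production membrane-transporter simulations involve full force fields, explicit solvent and lipids, thermostats, and specialized MD engines.

```python
# Toy molecular dynamics: velocity-Verlet integration of two particles
# interacting through a Lennard-Jones potential (reduced units, unit mass).
import numpy as np

EPS, SIGMA, DT, STEPS = 1.0, 1.0, 0.002, 1000

def lj_force(r_vec):
    """Force on particle 0 from particle 1 for the LJ 12-6 potential."""
    r = np.linalg.norm(r_vec)
    mag = 24 * EPS * (2 * (SIGMA / r)**12 - (SIGMA / r)**6) / r
    return mag * r_vec / r

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros_like(pos)
f = lj_force(pos[0] - pos[1])
forces = np.array([f, -f])

for step in range(STEPS):
    # velocity Verlet: half-kick, drift, recompute forces, half-kick
    vel += 0.5 * DT * forces
    pos += DT * vel
    f = lj_force(pos[0] - pos[1])
    forces = np.array([f, -f])
    vel += 0.5 * DT * forces

print("final separation:", np.linalg.norm(pos[0] - pos[1]))
```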

  10. A vision for Water Resources Research

    NASA Astrophysics Data System (ADS)

    Clark, M. P.

    2017-12-01

    Water Resources Research (WRR) plays a leading role in advancing hydrologic science. As AGU's hydrology journal, WRR has nurtured and published major breakthroughs in hydrologic process understanding and prediction capabilities, accomplished through innovative measurement campaigns, novel data analysis techniques, and elegant computational methods. Developing synergies between process-oriented and applications-oriented science is becoming more important as large changes in coupled human-natural systems impose new stresses on hydrologic systems and create new needs for hydrologic process understanding and prediction. In this presentation I will summarize some major opportunities for WRR, such as the growth of interdisciplinary science and the need for greater international cooperation through sharing of data and model source codes. I will discuss these opportunities in the context of major external trends, especially (1) changes in the perceived value of science to address societal problems, (2) the explosive global growth in science over the past decade, and (3) the transition to a more diffuse publishing landscape. This presentation is intended to foster discussion on ways that WRR can enhance the quality and impact of hydrologic science.

  11. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  12. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  13. Intelligent Systems: Terrestrial Observation and Prediction Using Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph C.

    2005-01-01

    NASA has made science and technology investments to better utilize its large space-borne remote sensing data holdings of the Earth. With the launch of Terra, NASA created a data-rich environment where the challenge is to fully utilize the data collected from EOS; however, despite unprecedented amounts of observed data, there is a need for increasing the frequency, resolution, and diversity of observations. Current terrestrial models that use remote sensing data were constructed in a relatively data- and compute-limited era and do not take full advantage of on-line learning methods and assimilation techniques that can exploit these data. NASA has invested in visualization, data mining and knowledge discovery methods, which have facilitated data exploitation, but these methods are insufficient for improving Earth science models that have extensive background knowledge, nor do they refine understanding of complex processes. Investing in interdisciplinary teams that include computational scientists can lead to new models and systems for online operation and analysis of data that can autonomously improve in prediction skill over time.

  14. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) at Oak Ridge National Laboratory (ORNL).

  15. With Great Measurements Come Great Results

    NASA Astrophysics Data System (ADS)

    Williams, Carl

    Measurements are the foundation for science and modern life. Technologies we take for granted every day depend on them - cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can make reliably and what it cannot. At the National Institute of Standards and Technology (NIST) we specialize in making world-class measurements that an incredibly wide range of industries use to continually improve their products - computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. Think of all the key technologies developed over the last 100 years: better measurements, standards, or analysis techniques played a role in making them possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices - it's all happening in our labs now. This talk will focus on how metrology creates the future.

  16. An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Mary C.

    1985-01-01

    There exists a large number of large-scale bibliographic Information Storage and Retrieval Systems containing large amounts of valuable data of interest in a wide variety of research applications. These systems are not used to capacity because the end users, i.e., the researchers, have not been trained in the techniques of accessing such systems. This thesis describes the development of a transportable, university-level course in methods of querying on-line interactive Information Storage and Retrieval systems as a solution to this problem. This course was designed to instruct upper division science and engineering students to enable these end users to directly access such systems. The course is designed to be taught by instructors who are not specialists in either computer science or research skills. It is independent of any particular IS and R system or computer hardware. The project is sponsored by NASA and conducted by the University of Southwestern Louisiana and Southern University.

  17. Meta-heuristic algorithms as tools for hydrological science

    NASA Astrophysics Data System (ADS)

    Yoo, Do Guen; Kim, Joong Hoon

    2014-12-01

    In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly in hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions and require limited computation time and memory use without requiring complex derivatives. Simulation-based meta-heuristic methods such as genetic algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome several drawbacks of traditional mathematical methods. For example, HS algorithms can be conceptualized from a musical performance process and used to achieve better harmony; such optimization algorithms seek a near-global optimum determined by the value of an objective function, providing a more robust determination of musical performance than can be achieved through typical aesthetic estimation. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of existing literature in the field. Then, recent trends in optimization are presented and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. As a result, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
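
    To make the Harmony Search idea concrete, the sketch below implements its basic loop: improvise a new candidate from a harmony memory, occasionally pitch-adjust it, and replace the worst stored harmony if the new one is better. The objective function, bounds, and parameter values are illustrative placeholders; in hydrological applications the objective would be a model-calibration or design cost.

```python
# Minimal Harmony Search sketch: minimize a simple objective by repeatedly
# improvising new candidate solutions from a "harmony memory".
import random

def objective(x):
    """Placeholder objective (sphere function); a hydrological model's
    calibration error would take its place in practice."""
    return sum(v * v for v in x)

def harmony_search(dim=3, bounds=(-5.0, 5.0), hms=10, hmcr=0.9,
                   par=0.3, bw=0.2, iters=2000):
    lo, hi = bounds
    # harmony memory: a small pool of candidate solutions
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                 # draw from memory
                value = random.choice(memory)[d]
                if random.random() < par:              # pitch adjustment
                    value += random.uniform(-bw, bw)
            else:                                      # random selection
                value = random.uniform(lo, hi)
            new.append(min(hi, max(lo, value)))
        worst = max(memory, key=objective)
        if objective(new) < objective(worst):          # keep improvements
            memory[memory.index(worst)] = new
    return min(memory, key=objective)

best = harmony_search()
print("best solution:", best, "objective:", objective(best))
```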

  18. A Cognitive Model for Problem Solving in Computer Science

    ERIC Educational Resources Information Center

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  19. Defining Computational Thinking for Mathematics and Science Classrooms

    ERIC Educational Resources Information Center

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-01-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…

  20. Computer finds ore

    NASA Astrophysics Data System (ADS)

    Bell, Peter M.

    Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation, a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982). The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km².
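
    The actual Prospector rule base is not reproduced in the record above; as a hedged illustration of how an inference network can propagate confidence from field observations to a prospecting hypothesis, the toy sketch below uses invented evidence, rules, and weights.

```python
# Toy "inference network" in the spirit of rule-based exploration systems:
# evidence propagates through weighted rules to score a hypothesis.
# Observations, rules, and weights are invented for illustration.
evidence = {
    "porphyry_texture_observed": 0.9,
    "hydrothermal_alteration": 0.7,
    "molybdenite_in_samples": 0.4,
    "favorable_geochem_anomaly": 0.6,
}

# Each rule: (required evidence keys, weight, derived hypothesis)
rules = [
    (("porphyry_texture_observed", "hydrothermal_alteration"), 0.8,
     "porphyry_system_present"),
    (("molybdenite_in_samples", "favorable_geochem_anomaly"), 0.9,
     "molybdenum_mineralization"),
]

def fire_rules(evidence, rules):
    """Confidence of a conclusion = rule weight * min(confidence of premises)."""
    derived = dict(evidence)
    for premises, weight, conclusion in rules:
        conf = weight * min(derived.get(p, 0.0) for p in premises)
        derived[conclusion] = max(derived.get(conclusion, 0.0), conf)
    return derived

derived = fire_rules(evidence, rules)
target = min(derived["porphyry_system_present"],
             derived["molybdenum_mineralization"])
print("confidence in ore-grade porphyry Mo deposit: %.2f" % target)
```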

  1. A novel method for NDT applications using NXCT system at the Missouri University of Science & Technology

    NASA Astrophysics Data System (ADS)

    Sinha, Vaibhav; Srivastava, Anjali; Koo Lee, Hyoung

    2014-06-01

    A novel method for non-destructive analysis has been developed using a neutron/X-ray combined computed tomography (NXCT) system at the Missouri University of Science and Technology Reactor (MSTR). This imaging system takes advantage of the fact that neutrons and X-rays have characteristically different interactions with the same materials. NXCT fuses the imaging capabilities of both systems at one location and allows instant evaluation for nondestructive testing (NDT) applications. This technique promises viable advances in the field of NDT. In this paper, the complete design criteria and procedures are provided. The described design criteria and procedures can effectively be utilized to design and develop advanced combined computed tomography systems. The successful operation of the high resolution X-ray and neutron computed tomography has been demonstrated in this paper. The utility and importance of the NXCT system have been shown by nondestructive evaluation of various phantoms constituting different materials, geometrical, structural and compositional information. The concept of NXCT can be useful for concealed material detection, material characterization, investigation of complex geometries involving different atomic number materials and real time imaging for in-situ studies.
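
    The value of combining the two modalities can be illustrated with a small sketch: because neutron and X-ray attenuation differ for the same material, pairing the two reconstructed coefficients voxel by voxel separates materials that either modality alone would confuse. The reference coefficients and materials below are rough illustrative numbers, not measured MSTR values.

```python
# Sketch of dual-modality material discrimination: classify each voxel by
# its (X-ray, neutron) attenuation pair using nearest reference values.
import numpy as np

# (x-ray mu, neutron mu) reference pairs -- illustrative numbers only
references = {
    "lead":         (60.0, 0.3),   # strong X-ray, weak neutron attenuator
    "polyethylene": (0.2, 3.5),    # weak X-ray, strong neutron attenuator
    "aluminum":     (0.6, 0.1),
}

def classify(mu_x, mu_n):
    """Assign each voxel to the nearest reference material in (mu_x, mu_n)."""
    names = list(references)
    ref = np.array([references[n] for n in names])      # (M, 2)
    voxels = np.stack([mu_x.ravel(), mu_n.ravel()], 1)  # (N, 2)
    dist = np.linalg.norm(voxels[:, None, :] - ref[None, :, :], axis=2)
    labels = dist.argmin(axis=1).reshape(mu_x.shape)
    return labels, names

# Two tiny co-registered "reconstructions" (2x2 voxels) for demonstration
mu_xray = np.array([[59.0, 0.25], [0.55, 0.3]])
mu_neutron = np.array([[0.35, 3.4], [0.12, 3.6]])
labels, names = classify(mu_xray, mu_neutron)
print([[names[i] for i in row] for row in labels])
```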

  2. Radar Model of Asteroid 216 Kleopatra

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These images show several views from a radar-based computer model of asteroid 216 Kleopatra. The object, located in the main asteroid belt between Mars and Jupiter, is about 217 kilometers (135 miles) long and about 94 kilometers (58 miles) wide, or about the size of New Jersey.

    This dog bone-shaped asteroid is an apparent leftover from an ancient, violent cosmic collision. Kleopatra is one of several dozen asteroids whose coloring suggests they contain metal.

    A team of astronomers observing Kleopatra used the 305-meter (1,000-foot) telescope of the Arecibo Observatory in Puerto Rico to bounce encoded radio signals off Kleopatra. Using sophisticated computer analysis techniques, they decoded the echoes, transformed them into images, and assembled a computer model of the asteroid's shape.

    The images were obtained when Kleopatra was about 171 million kilometers (106 million miles) from Earth. This model is accurate to within about 15 kilometers (9 miles).

    The Arecibo Observatory is part of the National Astronomy and Ionosphere Center, operated by Cornell University, Ithaca, N.Y., for the National Science Foundation. The Kleopatra radar observations were supported by NASA's Office of Space Science, Washington, DC. JPL is managed for NASA by the California Institute of Technology in Pasadena.

  3. Arthur L. Schawlow Prize in Laser Science Talk: Trapped Ion Quantum Networks with Light

    NASA Astrophysics Data System (ADS)

    Monroe, Christopher

    2015-05-01

    Laser-cooled atomic ions are standards for quantum information science, acting as qubit memories with unsurpassed levels of quantum coherence while also allowing near-perfect measurement. When qubit state-dependent optical dipole forces are applied to a collection of trapped ions, their Coulomb interaction is modulated in a way that allows the entanglement of the qubits through quantum gates that can form the basis of a quantum computer. Similar optical forces allow the simulation of quantum many-body physics, where recent experiments are approaching a level of complexity that cannot be modelled with conventional computers. Scaling to much larger numbers of qubits can be accomplished by coupling trapped ion qubits through optical photons, where entanglement over remote distances can be used for quantum communication and large-scale distributed quantum computers. Laser sources and quantum optical techniques are the workhorse for such quantum networks, and will continue to lead the way as future quantum hardware is developed. This work is supported by the ARO with funding from the IARPA MQCO program, the DARPA Quiness Program, the ARO MURI on Hybrid Quantum Circuits, the AFOSR MURIs on Quantum Transduction and Quantum Verification, and the NSF Physics Frontier Center at JQI.

  4. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  5. Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  6. Applying physical science techniques and CERN technology to an unsolved problem in radiation treatment for cancer: the multidisciplinary ‘VoxTox’ research programme

    PubMed Central

    Burnet, Neil G; Scaife, Jessica E; Romanchikova, Marina; Thomas, Simon J; Bates, Amy M; Wong, Emma; Noble, David J; Shelley, Leila EA; Bond, Simon J; Forman, Julia R; Hoole, Andrew CF; Barnett, Gillian C; Brochu, Frederic M; Simmons, Michael PD; Jena, Raj; Harrison, Karl; Yeap, Ping Lin; Drew, Amelia; Silvester, Emma; Elwood, Patrick; Pullen, Hannah; Sultana, Andrew; Seah, Shannon YK; Wilson, Megan Z; Russell, Simon G; Benson, Richard J; Rimmer, Yvonne L; Jefferies, Sarah J; Taku, Nicolette; Gurnell, Mark; Powlson, Andrew S; Schönlieb, Carola-Bibiane; Cai, Xiaohao; Sutcliffe, Michael PF; Parker, Michael A

    2017-01-01

    The VoxTox research programme has applied expertise from the physical sciences to the problem of radiotherapy toxicity, bringing together expertise from engineering, mathematics, high energy physics (including the Large Hadron Collider), medical physics and radiation oncology. In our initial cohort of 109 men treated with curative radiotherapy for prostate cancer, daily image guidance computed tomography (CT) scans have been used to calculate delivered dose to the rectum, as distinct from planned dose, using an automated approach. Clinical toxicity data have been collected, allowing us to address the hypothesis that delivered dose provides a better predictor of toxicity than planned dose. PMID:29177202

  7. Applying physical science techniques and CERN technology to an unsolved problem in radiation treatment for cancer: the multidisciplinary 'VoxTox' research programme.

    PubMed

    Burnet, Neil G; Scaife, Jessica E; Romanchikova, Marina; Thomas, Simon J; Bates, Amy M; Wong, Emma; Noble, David J; Shelley, Leila Ea; Bond, Simon J; Forman, Julia R; Hoole, Andrew Cf; Barnett, Gillian C; Brochu, Frederic M; Simmons, Michael Pd; Jena, Raj; Harrison, Karl; Yeap, Ping Lin; Drew, Amelia; Silvester, Emma; Elwood, Patrick; Pullen, Hannah; Sultana, Andrew; Seah, Shannon Yk; Wilson, Megan Z; Russell, Simon G; Benson, Richard J; Rimmer, Yvonne L; Jefferies, Sarah J; Taku, Nicolette; Gurnell, Mark; Powlson, Andrew S; Schönlieb, Carola-Bibiane; Cai, Xiaohao; Sutcliffe, Michael Pf; Parker, Michael A

    2017-06-01

    The VoxTox research programme has applied expertise from the physical sciences to the problem of radiotherapy toxicity, bringing together expertise from engineering, mathematics, high energy physics (including the Large Hadron Collider), medical physics and radiation oncology. In our initial cohort of 109 men treated with curative radiotherapy for prostate cancer, daily image guidance computed tomography (CT) scans have been used to calculate delivered dose to the rectum, as distinct from planned dose, using an automated approach. Clinical toxicity data have been collected, allowing us to address the hypothesis that delivered dose provides a better predictor of toxicity than planned dose.
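
    A minimal sketch of the planned-versus-delivered comparison idea from the two VoxTox records above: accumulate synthetic per-fraction dose samples for an organ, then compare simple dose-volume summaries against the plan. The dose values, fraction count, and metrics are illustrative assumptions, not the VoxTox pipeline, which recomputes dose on daily image-guidance CT.

```python
# Sketch: compare planned vs. accumulated delivered dose with a simple
# dose-volume histogram (DVH); doses here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_fractions = 5000, 37

# synthetic planned dose to organ voxels (Gy); placeholder values only
planned = rng.normal(loc=50.0, scale=8.0, size=n_voxels).clip(min=0)
# delivered dose drifts slightly from plan each fraction (e.g. organ motion)
per_fraction = planned / n_fractions + rng.normal(0.0, 0.15, (n_fractions, n_voxels))
delivered = per_fraction.sum(axis=0).clip(min=0)

def dvh(dose, bins):
    """Cumulative DVH: fraction of voxels receiving at least each dose level."""
    return np.array([(dose >= b).mean() for b in bins])

bins = np.arange(0.0, 80.0, 1.0)
for label, dose in (("planned", planned), ("delivered", delivered)):
    curve = dvh(dose, bins)
    v60 = curve[bins.searchsorted(60.0)] * 100.0
    print(f"{label}: mean {dose.mean():.1f} Gy, V60Gy = {v60:.1f}% of volume")
```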

  8. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) learning systems; (4) high performance networks and technology; and (5) graphics, visualization, and virtual environments. In the past year, parallel compiler techniques and adaptive numerical methods for flows in complicated geometries were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade. We concluded a summer student visitor program during these six months. We had six visiting graduate students who worked on projects over the summer and presented seminars on their work at the conclusion of their visits. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period July 1, 1992 through December 31, 1992 is provided.

  9. 78 FR 64255 - Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...

  10. Exemplary Science Teachers' Use of Technology

    ERIC Educational Resources Information Center

    Hakverdi-Can, Meral; Dana, Thomas M.

    2012-01-01

    The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…

  12. Democratizing data science through data science training.

    PubMed

    Van Horn, John Darrell; Fierro, Lily; Kamdar, Jeana; Gordon, Jonathan; Stewart, Crystal; Bhattrai, Avnish; Abe, Sumiko; Lei, Xiaoxiao; O'Driscoll, Caroline; Sinha, Aakanchha; Jain, Priyambada; Burns, Gully; Lerman, Kristina; Ambite, José Luis

    2018-01-01

    The biomedical sciences have experienced an explosion of data which promises to overwhelm many current practitioners. Without easy access to data science training resources, biomedical researchers may find themselves unable to wrangle their own datasets. In 2014, to address the challenges posed by such a data onslaught, the National Institutes of Health (NIH) launched the Big Data to Knowledge (BD2K) initiative. To this end, the BD2K Training Coordinating Center (TCC; bigdatau.org) was funded to facilitate both in-person and online learning, and open up the concepts of data science to the widest possible audience. Here, we describe the activities of the BD2K TCC and its focus on the construction of the Educational Resource Discovery Index (ERuDIte), which identifies, collects, describes, and organizes online data science materials from BD2K awardees, open online courses, and videos from scientific lectures and tutorials. ERuDIte now indexes over 9,500 resources. Given the richness of online training materials and the constant evolution of biomedical data science, computational methods applying information retrieval, natural language processing, and machine learning techniques are required: in effect, using data science to inform training in data science. In so doing, the TCC seeks to democratize novel insights and discoveries brought forth via large-scale data science training.

  13. Democratizing data science through data science training

    PubMed Central

    Van Horn, John Darrell; Fierro, Lily; Kamdar, Jeana; Gordon, Jonathan; Stewart, Crystal; Bhattrai, Avnish; Abe, Sumiko; Lei, Xiaoxiao; O’Driscoll, Caroline; Sinha, Aakanchha; Jain, Priyambada; Burns, Gully; Lerman, Kristina; Ambite, José Luis

    2017-01-01

    The biomedical sciences have experienced an explosion of data which promises to overwhelm many current practitioners. Without easy access to data science training resources, biomedical researchers may find themselves unable to wrangle their own datasets. In 2014, to address the challenges posed by such a data onslaught, the National Institutes of Health (NIH) launched the Big Data to Knowledge (BD2K) initiative. To this end, the BD2K Training Coordinating Center (TCC; bigdatau.org) was funded to facilitate both in-person and online learning, and open up the concepts of data science to the widest possible audience. Here, we describe the activities of the BD2K TCC and its focus on the construction of the Educational Resource Discovery Index (ERuDIte), which identifies, collects, describes, and organizes online data science materials from BD2K awardees, open online courses, and videos from scientific lectures and tutorials. ERuDIte now indexes over 9,500 resources. Given the richness of online training materials and the constant evolution of biomedical data science, computational methods applying information retrieval, natural language processing, and machine learning techniques are required: in effect, using data science to inform training in data science. In so doing, the TCC seeks to democratize novel insights and discoveries brought forth via large-scale data science training. PMID:29218890
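
    One hedged reading of "using data science to inform training in data science" in code: a tiny TF-IDF retrieval index over resource descriptions, so that a learner's query surfaces relevant materials. The resource titles and query are invented placeholders; ERuDIte's actual indexing pipeline is far richer.

```python
# Tiny information-retrieval sketch: index short resource descriptions with
# TF-IDF and rank them against a learner's query by cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented placeholder resource descriptions
resources = [
    "Introduction to machine learning for biomedical data",
    "Statistical methods and reproducibility in genomics",
    "Natural language processing for clinical text",
    "Data wrangling and visualization with Python",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(resources)

query = "machine learning methods for clinical notes"
scores = cosine_similarity(vectorizer.transform([query]), matrix).ravel()

# Print resources ranked by similarity to the query
for score, title in sorted(zip(scores, resources), reverse=True):
    print(f"{score:.2f}  {title}")
```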

  14. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  15. An Overview of NASA's Intelligent Systems Program

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA and the Computer Science Research community are poised to enter a critical era. An era in which - it seems - that each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place Computer Science in a relatively novel position. These forces impact how research is done, and could, in worst case, drive the field away from significant innovation opting instead for incremental advances that result in greater stability in the market place. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.

  16. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  17. A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…

  18. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  19. ASCR Workshop on Quantum Computing for Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  20. BIOCOMPUTATION: some history and prospects.

    PubMed

    Cull, Paul

    2013-06-01

    At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. A Case Study of the Introduction of Computer Science in NZ Schools

    ERIC Educational Resources Information Center

    Bell, Tim; Andreae, Peter; Robins, Anthony

    2014-01-01

    For many years computing in New Zealand schools was focused on teaching students how to use computers, and there was little opportunity for students to learn about programming and computer science as formal subjects. In this article we review a series of initiatives that occurred from 2007 to 2009 that led to programming and computer science being…

  2. Identifying Key Features, Cutting Edge Cloud Resources, and Artificial Intelligence Tools to Achieve User-Friendly Water Science in the Cloud

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.

    2017-12-01

    Decision making for groundwater systems is becoming increasingly important, as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and augment the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure to enable interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture and increase the usability and access for broader audiences to advanced compute environments. The result enables dexterous configurations and opens up opportunities for IWRM modelers to expand the reach of analyses, number of case studies, and quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions paired with advanced computational resources refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud using integrated groundwater management case studies to highlight how Cloud CI streamlines the process for setting up an interactive decision support system. Moreover, advances in artificial intelligence offer new techniques for old problems, from integrating data to adaptive sensing or from interactive dashboards to optimizing multi-attribute problems. The combination of scientific expertise, flexible cloud computing solutions, and intelligent systems opens new research horizons.

  3. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  4. [Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  5. Activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.

  6. Use of non-adiabatic geometric phase for quantum computing by NMR.

    PubMed

    Das, Ranabir; Kumar, S K Karthick; Kumar, Anil

    2005-12-01

    Geometric phases have stimulated researchers for their potential applications in many areas of science. One of them is fault-tolerant quantum computation. A preliminary requisite of quantum computation is the implementation of controlled dynamics of qubits. In controlled dynamics, one qubit undergoes coherent evolution and acquires appropriate phase, depending on the state of other qubits. If the evolution is geometric, then the phase acquired depends only on the geometry of the path executed, and is robust against certain types of error. This phenomenon leads to an inherently fault-tolerant quantum computation. Here we suggest a technique of using non-adiabatic geometric phase for quantum computation, using selective excitation. In a two-qubit system, we selectively evolve a suitable subsystem where the control qubit is in state |1⟩, through a closed circuit. By this evolution, the target qubit gains a phase controlled by the state of the control qubit. Using the non-adiabatic geometric phase we demonstrate implementation of the Deutsch-Jozsa algorithm and Grover's search algorithm in a two-qubit system.
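
    The NMR pulse sequences that realize the geometric phase are not given in the record above; as a stand-in, the sketch below simulates the two-qubit Deutsch-Jozsa circuit with plain matrix algebra, so the expected outcomes (the first qubit measuring 0 for constant oracles and 1 for balanced ones) can be checked.

```python
# Two-qubit Deutsch-Jozsa simulation with plain linear algebra: one oracle
# query distinguishes constant from balanced one-bit functions.
# This simulates the circuit, not the NMR pulse-sequence implementation.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def oracle(f):
    """U_f |x>|y> = |x>|y XOR f(x)> for a one-bit function f."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    return U

def deutsch_jozsa(f):
    state = np.kron([1, 0], [0, 1]).astype(float)   # |0>|1>
    state = np.kron(H, H) @ state                   # create superposition
    state = oracle(f) @ state                       # single oracle query
    state = np.kron(H, I) @ state                   # interfere
    p_first_is_1 = np.abs(state[2])**2 + np.abs(state[3])**2
    return "balanced" if p_first_is_1 > 0.5 else "constant"

for name, f in [("f(x)=0", lambda x: 0), ("f(x)=1", lambda x: 1),
                ("f(x)=x", lambda x: x), ("f(x)=1-x", lambda x: 1 - x)]:
    print(name, "->", deutsch_jozsa(f))
```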

  7. An adaptive technique to maximize lossless image data compression of satellite images

    NASA Technical Reports Server (NTRS)

    Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe

    1994-01-01

    Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower entropy state. Several techniques were tested on satellite images including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties, then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques for regions of different entropy. A rule-base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost effectively and at acceptable performance rates with a combination of techniques which are selected based on image contextual information.
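
    A small sketch of the entropy-reduction step described above: compute the zero-order entropy of a synthetic smoothly varying band before and after a DPCM-style remapping (differencing against the left neighbor). The synthetic image is an assumption made for illustration; the segmentation and arithmetic-coding stages of the paper are omitted here.

```python
# Sketch of entropy reduction by image remapping: differencing a smooth
# image (DPCM-style prediction from the left neighbor) concentrates the
# histogram and lowers the zero-order entropy that a lossless coder sees.
import numpy as np

def entropy_bits(values):
    """Zero-order entropy in bits/sample from the value histogram."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Synthetic smoothly varying "satellite" band with mild noise
rng = np.random.default_rng(1)
x = np.linspace(0, 4 * np.pi, 256)
img = (96 + 64 * np.sin(x)[None, :] + 32 * np.cos(x)[:, None]
       + rng.normal(0, 2, (256, 256))).astype(np.int32)

# DPCM remap: keep the first column, replace the rest by horizontal differences
residual = img.copy()
residual[:, 1:] = img[:, 1:] - img[:, :-1]

print("raw image entropy:      %.2f bits/pixel" % entropy_bits(img))
print("DPCM residual entropy:  %.2f bits/pixel" % entropy_bits(residual))
```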

  8. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

    No abstract is available in the DTIC record; the indexed excerpt lists the dissertation committee (Arnold Buss, Research Associate Professor of MOVES; LtCol Jeff Boleng, PhD, Associate Professor of Computer Science, U.S. Air Force Academy; Rudy Darken, Professor of Computer Science, Dissertation Supervisor; Ted Lewis, Professor of Computer Science; Richard Riehle, Professor of Practice) and cited references, including Henry, S., & Kafura, D. (1984), The evaluation of software...

  9. K-16 Computationally Rich Science Education: A Ten-Year Review of the "Journal of Science Education and Technology" (1998-2008)

    ERIC Educational Resources Information Center

    Wofford, Jennifer

    2009-01-01

    Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more true with computing than with math and science, we are not preparing our students for this new reality. It is…

  10. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.

  11. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    NASA Astrophysics Data System (ADS)

    Plane, Jandelyn

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder advancement to degree to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations while only graduating on average 25% women in undergraduate computer science programs. Representation of women in computer science in the US is 50% below the university rate, but at KU, it is 50% above the university rate. This mixed methods study of KU was conducted in the following three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth year CS students. The analysis of the data collected and its comparison to literature on university/department retention in Science, Technology, Engineering and Mathematics gender representation and on women's education in underdeveloped Islamic countries illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from countries often studied in available literature. Few Afghan students have computers in their home and few have training beyond secretarial applications before considering studying CS at university. University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and financially supported throughout their academic career, resulting in a low attrition rate from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.

  12. Environmental information acquisition and maintenance techniques: reference guide. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riggins, R.E.; Young, V.T.; Goran, W.D.

    1980-08-01

    This report provides a guide to techniques for collecting, using and maintaining data about each of the 13 environmental technical specialties in the Environmental Impact Computer System (EICS). The technical specialties are: (1) ecology, (2) environmental health, (3) air, (4) surface water, (5) ground water, (6) sociology, (7) economics, (8) earth science, (9) land use, (10) noise, (11) transportation, (12) aesthetics, and (13) energy and resource conservation. Acquisition techniques are classified by the following general categories: (1) secondary data, (2) remote sensing, (3) mathematical modeling, (4) field work, (5) mapping/maps and (6) expert opinion. A matrix identifies the most appropriate techniques for collecting information on the EICS technical specialties. After selecting a method, the user may read an abstract of the report explaining that technique, and may also wish to obtain the original document for detailed information about applying the technique. Finally, this report offers guidelines on storing environmental information for future use, and on presenting that information effectively in environmental documents.

  13. Science-Technology Coupling: The Case of Mathematical Logic and Computer Science.

    ERIC Educational Resources Information Center

    Wagner-Dobler, Roland

    1997-01-01

    In the history of science, there have often been periods of sudden rapprochements between pure science and technology-oriented branches of science. Mathematical logic as pure science and computer science as technology-oriented science have experienced such a rapprochement, which is studied in this article in a bibliometric manner. (Author)

  14. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    PubMed

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  15. 78 FR 61870 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-04

    ... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In accordance with Federal Advisory Committee Act (Pub. L. 92-463, as amended... Committee for Computer and Information Science and Engineering (1115). Date/Time: Oct 31, 2013: 12:30 p.m...

  16. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1984 through March 31, 1985 is summarized.

  17. [Research Conducted at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.

  18. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 2, 1987 through March 31, 1988.

  19. [Activities of Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1999 through September 30, 1999.

  20. Practical Measurement of Complexity In Dynamic Systems

    DTIC Science & Technology

    2012-01-01

    ... policies that produce highly complex behaviors, yet yield no benefit. (Jason B. Clark and David R. Jacques, Procedia Computer Science 8 (2012) 14-21, doi:10.1016/j.procs.2012.01.008)

  1. The role of physicality in rich programming environments

    NASA Astrophysics Data System (ADS)

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-12-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.

  2. The revolution in risk assessment and disease detection made possible with non-invasive imaging: implications for population science.

    PubMed

    Carr, J Jeffrey

    2012-01-01

    The ability to quantify subclinical disease to assess cardiovascular disease is greatly enhanced by modern medical imaging techniques that incorporate concepts from biomedical engineering and computer science. These techniques' numerical results, known as quantitative phenotypes, can be used to help us better understand both health and disease states. In this report, we describe our efforts in using the latest imaging technologies to assess cardiovascular disease risk by quantifying subclinical disease in participants in the Jackson Heart Study. The CT and MRI exams of the Jackson Heart Study have collected detailed information from approximately 3,000 participants. Analyses of the images from these exams provide information on several measures, including the amount of plaque in the coronary arteries and the ability of the heart to pump blood. These measures can then be added to the wealth of information on JHS participants to understand how these conditions, as well as clinical events such as heart attacks and heart failure, occur in African Americans.

  3. Forensic facial comparison in South Africa: State of the science.

    PubMed

    Steyn, M; Pretorius, M; Briers, N; Bacci, N; Johnson, A; Houlton, T M R

    2018-06-01

    Forensic facial comparison (FFC) is a scientific technique used to link suspects to a crime scene based on the analysis of photos or video recordings from that scene. While basic guidelines on practice and training are provided by the Facial Identification Scientific Working Group, details of how these are applied across the world are scarce. FFC is frequently used in South Africa, with more than 700 comparisons conducted in the last two years alone. In this paper the standards of practice are outlined, together with newly proposed levels of agreement/conclusions. We outline the three levels of training that were established; training in facial anatomy, terminology, principles of image comparison, image science, facial recognition, and computer skills is aimed at developing general competency. Training in generating court charts and understanding court case proceedings is being developed specifically for the South African context. Various shortcomings still exist, specifically with regard to knowledge of the reliability of the technique. These need to be addressed in future research. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Path Not Found: Disparities in Access to Computer Science Courses in California High Schools

    ERIC Educational Resources Information Center

    Martin, Alexis; McAlear, Frieda; Scott, Allison

    2015-01-01

    "Path Not Found: Disparities in Access to Computer Science Courses in California High Schools" exposes one of the foundational causes of underrepresentation in computing: disparities in access to computer science courses in California's public high schools. This report provides new, detailed data on these disparities by student body…

  5. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    ERIC Educational Resources Information Center

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements--is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  6. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  7. Applications of artificial intelligence to scientific research

    NASA Technical Reports Server (NTRS)

    Prince, Mary Ellen

    1986-01-01

    Artificial intelligence (AI) is a growing field which is just beginning to make an impact on disciplines other than computer science. While a number of military and commercial applications have been undertaken in recent years, few attempts have been made to apply AI techniques to basic scientific research. There is no inherent reason for the discrepancy: the characteristics of a problem, rather than its domain, determine whether or not it is suitable for an AI approach. Expert systems, intelligent tutoring systems, and learning programs are examples of theoretical topics which can be applied to certain areas of scientific research. Further research and experimentation should eventually make it possible for computers to act as intelligent assistants to scientists.

  8. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    NASA Astrophysics Data System (ADS)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in the study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens pursue careers in technology, including the computing sciences. Although computer science is a very lucrative field, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical, and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences, seem to affect women's decisions in choosing an area of study and a career. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided insight into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying the barriers and challenges faced by female students who choose to study computer science. It is hoped that the data collected from this study may inform recommendations for the recruiting, retention, and support of women in computer science departments of U.S. colleges and universities, and thereby increase the number of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  9. An Adaptive Property-Aware HW/SW Framework for DDDAS

    DTIC Science & Technology

    2014-10-21

    ...the sleep queue stores sleeping tasks until their activation time. The task with the earliest activation time is at the front of the sleep queue. ... queue) or activation time (sleep queue). [Figure 2: a high-level architecture diagram.] ...the more conservative will be the WCET estimation. Vestal et al. suggested the use of Audsley's priority assignment scheme [6] and the period transformation technique. (Excerpt from Chetan et al., Procedia Computer Science 00 (2012) 1-9.)
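
    The queue behaviour described in this excerpt can be illustrated with a short sketch. The following is a minimal, hypothetical illustration rather than the framework's actual implementation: the Task fields, the name "ready queue", and the priority function are assumptions introduced only to show how a sleep queue ordered by activation time can feed a priority-ordered run queue.

        import heapq
        from dataclasses import dataclass, field

        @dataclass(order=True)
        class Task:
            sort_key: float          # activation time (sleep queue) or priority (ready queue)
            name: str = field(compare=False)

        class SimpleScheduler:
            """Sleeping tasks are ordered by activation time, runnable tasks by priority."""

            def __init__(self):
                self.sleep_queue = []   # min-heap keyed on activation time
                self.ready_queue = []   # min-heap keyed on priority (lower value runs first)

            def sleep_until(self, name, activation_time):
                heapq.heappush(self.sleep_queue, Task(activation_time, name))

            def wake_ready(self, now, priority_of):
                # Move every task whose activation time has passed onto the ready queue.
                while self.sleep_queue and self.sleep_queue[0].sort_key <= now:
                    task = heapq.heappop(self.sleep_queue)
                    heapq.heappush(self.ready_queue, Task(priority_of(task.name), task.name))

            def next_task(self):
                # The task at the front of the ready queue runs next, if any are ready.
                return heapq.heappop(self.ready_queue).name if self.ready_queue else None

        # Example: two tasks sleep until t=2 and t=5; by t=6 both are ready to run.
        sched = SimpleScheduler()
        sched.sleep_until("telemetry", 5.0)
        sched.sleep_until("control", 2.0)
        sched.wake_ready(now=6.0, priority_of={"control": 1, "telemetry": 2}.get)
        print(sched.next_task())        # "control" runs first (higher priority)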

  10. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges arise in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. We are exploring how to process these large numbers of SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products, together with the challenges that arise in processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also present some of our findings from applying machine learning and data analytics to the processed SAR data streams, as well as lessons learned about easing the SAR community into working with these cloud-based SAR science data systems.
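
    The large-scale quality-screening question raised above can be made concrete with a small sketch. The following is a hypothetical, simplified illustration and not part of the ARIA pipeline: the tile size, the 0.4 coherence threshold, and the function names are assumptions chosen only to show how an interferogram's coherence raster could be screened tile by tile.

        import numpy as np

        def tile_views(array, tile=512):
            """Yield (offset, view) pairs for non-overlapping tiles of a 2-D array."""
            rows, cols = array.shape
            for r in range(0, rows, tile):
                for c in range(0, cols, tile):
                    yield (r, c), array[r:r + tile, c:c + tile]

        def screen_interferogram(coherence, threshold=0.4, min_good_fraction=0.2):
            """Flag tiles whose fraction of coherent pixels is too low to be useful."""
            failed = []
            for (r, c), block in tile_views(coherence):
                if np.mean(block >= threshold) < min_good_fraction:
                    failed.append((r, c))
            return failed

        # Synthetic coherence raster standing in for one real interferogram product.
        coherence = np.random.rand(2048, 2048).astype(np.float32)
        print(len(screen_interferogram(coherence)), "tiles failed the screen")

    Screening tile by tile rather than over the full raster keeps memory use bounded, which matters when the same check must run over thousands of interferograms in a cloud pipeline.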

  11. 77 FR 38630 - Open Internet Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ... Computer Science and Co-Founder of the Berkman Center for Internet and Society, Harvard University, is... of Technology Computer Science and Artificial Intelligence Laboratory, is appointed vice-chairperson... Jennifer Rexford, Professor of Computer Science, Princeton University Dennis Roberson, Vice Provost...

  12. Research in progress at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  13. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    PubMed

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  14. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
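
    As a minimal illustration of the kind of embarrassingly parallel workload such clusters target, the sketch below fans an independent per-tile computation out across worker processes on a single node. The tile list and the processing function are hypothetical; a real Beowulf deployment would distribute work across nodes (for example with MPI) rather than through one machine's process pool.

        from multiprocessing import Pool

        def process_tile(tile_id):
            """Stand-in for an independent per-tile computation (e.g. a raster statistic)."""
            return tile_id, sum(i * i for i in range(10_000))

        if __name__ == "__main__":
            tiles = list(range(64))            # hypothetical work units
            with Pool(processes=8) as pool:    # one worker per core on a single node
                results = dict(pool.map(process_tile, tiles))
            print(f"processed {len(results)} tiles")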

  15. Enabling Earth Science Through Cloud Computing

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  16. Simulation of the Effects of Cooling Techniques on Turbine Blade Heat Transfer

    NASA Astrophysics Data System (ADS)

    Shaw, Vince; Fatuzzo, Marco

    Increases in the performance demands of turbomachinery have stimulated the development of many new technologies over the last half century. With applications extending beyond marine, aviation, and power generation, improvements in gas turbine technologies have a broad impact. High temperatures within the combustion chamber of a gas turbine engine are known to increase the thermal efficiency and power produced by the engine. However, since operating temperatures in the turbine section of these engines exceed 1000 K, advances in materials science and cooling techniques are crucial to producing engines that function under these high thermal and dynamic stresses. As with all research and development, costs related to the production of prototypes can be reduced through the use of computational simulations. By making use of Ansys simulation software, the effects of turbine cooling techniques were analyzed.
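
    The stated link between combustion temperature and thermal efficiency follows from the standard upper bound on any heat engine operating between a hot source and a cold sink; the relation below is a textbook result (the Carnot limit), not a finding of this simulation study.

        \eta_{\mathrm{th}} \;\le\; 1 - \frac{T_{\min}}{T_{\max}}

    Raising the turbine inlet temperature T_max raises this ceiling, which is why advances in materials and blade cooling that permit hotter operation translate into efficiency and power gains.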

  17. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    ERIC Educational Resources Information Center

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  18. 76 FR 61118 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... Engineering; Notice of Meeting In accordance with the Federal Advisory Committee Act (Pub. L. 92- 463, as... Computer and Information Science and Engineering (1115). Date and Time: November 1, 2011 from 12 p.m.-5:30... Computer and Information Science and Engineering, National Science Foundation, 4201 Wilson Blvd., Suite...

  19. Computer Science in High School Graduation Requirements. ECS Education Trends (Updated)

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Allowing high school students to fulfill a math or science high school graduation requirement via a computer science credit may encourage more students to pursue computer science coursework. This Education Trends report is an update to the original report released in April 2015 and explores state policies that allow or require districts to apply…

  20. The Complexity of Primary Care Psychology: Theoretical Foundations.

    PubMed

    Smit, E H; Derksen, J J L

    2015-07-01

    How does primary care psychology deal with organized complexity? Has it escaped Newtonian science? Has it, as Weaver (1991) suggests, found a way to 'manage problems with many interrelated factors that cannot be dealt with by statistical techniques'? Computer simulations and mathematical models in psychology are ongoing positive developments in the study of complex systems. However, the theoretical development of complex systems in psychology lags behind these advances. In this article we use complexity science to develop a theory on experienced complexity in the daily practice of primary care psychologists. We briefly answer the ontological question of what we see (from the perspective of primary care psychology) as reality, the epistemological question of what we can know, the methodological question of how to act, and the ethical question of what is good care. Following our empirical study, we conclude that complexity science can describe the experienced complexity of the psychologist and offer room for personalized client-centered care. Complexity science is slowly filling the gap between the dominant reductionist theory and complex daily practice.
