Sample records for modern computer science

  1. Articles on Practical Cybernetics. Computer-Developed Computers; Heuristics and Modern Sciences; Linguistics and Practice; Cybernetics and Moral-Ethical Considerations; and Men and Machines at the Chessboard.

    ERIC Educational Resources Information Center

    Berg, A. I.; And Others

    Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)

  2. Evaluating Modern Defenses Against Control Flow Hijacking

    DTIC Science & Technology

    2015-09-01

    …unsound and could introduce false negatives (opening up another possible set of attacks). CFG Construction using DSA: we next evaluate the precision of CFG… Evaluating Modern Defenses Against Control Flow Hijacking, by Ulziibayar Otgonbaatar. Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science and Engineering at the Massachusetts…

  3. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    ERIC Educational Resources Information Center

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays the growing need for highly qualified computer science educators in modern educational environments is commonplace. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  4. Graphical User Interface Programming in Introductory Computer Science.

    ERIC Educational Resources Information Center

    Skolnick, Michael M.; Spooner, David L.

    Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…

  5. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkin, Adam; Bader, David C.; Coffey, Richard

    Understanding the fundamentals of genomic systems and the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes on high-performance computers. See http://exascaleage.org/ber/ for more information.

  6. Bioinformatics in high school biology curricula: a study of state science standards.

    PubMed

    Wefer, Stephen H; Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students.

  7. Bioinformatics in High School Biology Curricula: A Study of State Science Standards

    PubMed Central

    Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students. PMID:18316818

  8. Computational Science and Innovation

    NASA Astrophysics Data System (ADS)

    Dean, D. J.

    2011-09-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  9. New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo

    2012-11-01

    Data mining, or Knowledge Discovery in Databases (KDD), is the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), but it must tackle crucial problems: it has to orchestrate the complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data-avalanche era, the community needs to implement an infrastructure capable of performing data access, processing, and mining in a distributed but integrated context. The increasing complexity of modern technologies has produced huge volumes of data, and the associated warehouse management and the need to optimize analysis and mining procedures are changing how modern science is conceived. Classical data exploration, based on local storage of a user's own data and limited computing infrastructure, is no longer efficient for MDS, which are spread worldwide over inhomogeneous data centres and require teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e., of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing, and parallel programming need to become an essential part of scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and machine learning methodology: tools in which almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations are described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms are also introduced.

  10. Optimizing Engineering Tools Using Modern Ground Architectures

    DTIC Science & Technology

    2017-12-01

    …Considerations,” International Journal of Computer Science & Engineering Survey, vol. 5, no. 4, 2014. [10] R. Bell. (n.d.). A beginner’s guide to big O notation… scientific community. Traditional computing architectures were not capable of processing the data efficiently, or in some cases could not process the… This thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of

  11. The Computational Ecologist’s Toolbox

    EPA Science Inventory

    Computational ecology, nestled in the broader field of data science, is an interdisciplinary field that attempts to improve our understanding of complex ecological systems through the use of modern computational methods. Computational ecology is based on a union of competence in...

  12. Join the Center for Applied Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd; Bremer, Timo; Van Essen, Brian

    The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.

  13. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  14. How Computer Graphics Work.

    ERIC Educational Resources Information Center

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  15. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  16. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  17. Translations on Eastern Europe, Scientific Affairs, Number 542.

    DTIC Science & Technology

    1977-04-18

    …transplanting human tissue has not as yet been given final juridical approval, like euthanasia, artificial insemination, abortion, birth control, and others… and data teleprocessing. This computer may also be used as a satellite computer for complex systems. The IZOT 310 has a large instruction… a well-known truth that modern science is using the most modern and leading technical facilities, from bathyscaphes to satellites, from gigantic

  18. Computational Skills for Biology Students

    ERIC Educational Resources Information Center

    Gross, Louis J.

    2008-01-01

    This interview with Distinguished Science Award recipient Louis J. Gross highlights essential computational skills for modern biology, including: (1) teaching concepts listed in the Math & Bio 2010 report; (2) illustrating to students that jobs today require quantitative skills; and (3) resources and materials that focus on computational skills.

  19. Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures

    DTIC Science & Technology

    2015-09-01

    …soundness or completeness. An incomplete analysis will produce extra edges in the CFG that might allow an attacker to slip through. An unsound analysis… Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures, by Isaac Noah Evans. Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer
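
    To make the precision trade-off concrete, here is a minimal illustrative sketch (not the thesis' tooling; all names are hypothetical) of a control-flow integrity check against a statically computed CFG, showing how an over-approximate CFG leaves extra edges for an attacker to reuse:

    ```python
    # Hypothetical example: a CFI check against a statically computed CFG.
    # An over-approximate (complete but imprecise) CFG never breaks the
    # program, but its extra edges admit more attacker-chosen targets.

    # CFG: maps each indirect call site to the set of allowed targets.
    precise_cfg = {"call_site_A": {"parse", "render"}}
    imprecise_cfg = {"call_site_A": {"parse", "render", "system_exec"}}  # extra edge

    def cfi_check(cfg, site, target):
        """Allow an indirect branch only if the CFG has an edge site -> target."""
        return target in cfg.get(site, set())

    # A hijacked branch to 'system_exec' is blocked under the precise CFG
    # but slips through the imprecise one.
    assert not cfi_check(precise_cfg, "call_site_A", "system_exec")
    assert cfi_check(imprecise_cfg, "call_site_A", "system_exec")
    ```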

  20. Examining Key Factors that Contribute to African Americans' Pursuit of Computing Science Degrees: Implications for Cultivating Career Choice and Aspiration

    ERIC Educational Resources Information Center

    Charleston, LaVar Jovan

    2010-01-01

    As a result of decreasing degree attainment in science, technology, engineering, and mathematics (STEM) fields, the United States is undergoing a shortage in the STEM workforce that it has not encountered since the mid-1950s (ACT, 2006; Gilbert & Jackson, 2007). Moreover, as computer usage cuts across diverse aspects of modern culture, the…

  1. Introduction to the theory of machines and languages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidhaas, P. P.

    1976-04-01

    This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"
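
    As a concrete taste of the machine models such a text studies, here is a minimal Python sketch (an illustrative example, not taken from the text) of a deterministic finite automaton deciding a simple formal language:

    ```python
    # A DFA deciding the language of binary strings with an even number of 1s.
    dfa = {
        "start": "even",
        "accept": {"even"},
        "delta": {("even", "0"): "even", ("even", "1"): "odd",
                  ("odd", "0"): "odd", ("odd", "1"): "even"},
    }

    def accepts(dfa, word):
        """Run the DFA on the word and report whether it ends in an accept state."""
        state = dfa["start"]
        for symbol in word:
            state = dfa["delta"][(state, symbol)]
        return state in dfa["accept"]

    assert accepts(dfa, "1100")       # two 1s: accepted
    assert not accepts(dfa, "10")     # one 1: rejected
    ```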

  2. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  3. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  4. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  5. ExpoCast: Exposure Science for Prioritization and Toxicity Testing (T)

    EPA Science Inventory

    The US EPA National Center for Computational Toxicology (NCCT) has a mission to integrate modern computing and information technology with molecular biology to improve Agency prioritization of data requirements and risk assessment of chemicals. Recognizing the critical need for ...

  6. The State of the Art in Information Handling. Operation PEP/Executive Information Systems.

    ERIC Educational Resources Information Center

    Summers, J. K.; Sullivan, J. E.

    This document explains recent developments in computer science and information systems of interest to the educational manager. A brief history of computers is included, together with an examination of modern computers' capabilities. Various features of card, tape, and disk information storage systems are presented. The importance of time-sharing…

  7. The science in social science

    PubMed Central

    Bernard, H. Russell

    2012-01-01

    A recent poll showed that most people think of science as technology and engineering—life-saving drugs, computers, space exploration, and so on. This was, in fact, the promise of the founders of modern science in the 17th century. It is less commonly understood that social and behavioral sciences have also produced technologies and engineering that dominate our everyday lives. These include polling, marketing, management, insurance, and public health programs. PMID:23213222

  8. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    EPA Science Inventory

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  9. Merging the Machines of Modern Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Laura; Collins, Jim

    Two recent projects have harnessed supercomputing resources at the US Department of Energy’s Argonne National Laboratory in a novel way to support major fusion science and particle collider experiments. Using leadership computing resources, one team ran fine-grid analysis of real-time data to make near-real-time adjustments to an ongoing experiment, while a second team is working to integrate Argonne’s supercomputers into the Large Hadron Collider/ATLAS workflow. Together these efforts represent a new paradigm of the high-performance computing center as a partner in experimental science.

  10. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    ERIC Educational Resources Information Center

    Nagasinghe, Iranga

    2010-01-01

    This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute the essential technologies that account for the immense growth and…
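
    For readers unfamiliar with the linear-algebra connection: the core of PageRank is a power iteration toward the principal eigenvector of a stochastic link matrix. A minimal sketch on a toy graph (illustrative only; the thesis' acceleration techniques are not reproduced here):

    ```python
    import numpy as np

    # Toy web graph: page -> pages it links to.
    links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
    n, d = 4, 0.85                      # number of pages, damping factor

    # Column-stochastic transition matrix: M[j, i] = 1/outdeg(i) if i links to j.
    M = np.zeros((n, n))
    for i, outs in links.items():
        for j in outs:
            M[j, i] = 1.0 / len(outs)

    r = np.full(n, 1.0 / n)
    for _ in range(100):                # power iteration: r <- d*M*r + (1-d)/n
        r = d * M @ r + (1.0 - d) / n
    print(r)                            # stationary ranks; pages 0 and 2 dominate
    ```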

  11. Connecting Biology and Organic Chemistry Introductory Laboratory Courses through a Collaborative Research Project

    ERIC Educational Resources Information Center

    Boltax, Ariana L.; Armanious, Stephanie; Kosinski-Collins, Melissa S.; Pontrello, Jason K.

    2015-01-01

    Modern research often requires collaboration of experts in fields such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize the connections possible between the various disciplines. We designed an…

  12. Bernoulli's Principle: Science as a Human Endeavor

    ERIC Educational Resources Information Center

    McCarthy, Deborah

    2008-01-01

    What do the ideas of Daniel Bernoulli--an 18th-century Swiss mathematician, physicist, natural scientist, and professor--and your students' next landing of the space shuttle via computer simulation have in common? Because of his contribution, referred to in physical science as Bernoulli's principle, modern flight is possible. The mini learning-cycle…
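
    A minimal worked example of the principle (numbers are illustrative, not from the article): along a streamline of incompressible flow, p + ½ρv² is constant, so faster air over the wing implies lower pressure there, and the pressure difference produces lift:

    ```python
    # Bernoulli's principle: p + 0.5*rho*v**2 = constant along a streamline.
    rho = 1.225                     # air density at sea level, kg/m^3
    v_over, v_under = 70.0, 60.0    # airspeed over/under the wing, m/s

    # Equal total pressure on both streamlines => p_under - p_over:
    delta_p = 0.5 * rho * (v_over**2 - v_under**2)    # pascals per m^2 of wing
    print(f"pressure difference = {delta_p:.0f} Pa")  # about 796 Pa of lift
    ```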

  13. The Japan-China Amity Delegation Written Report Association for Science, Technology and Economics, Inc.

    DTIC Science & Technology

    1981-03-12

    …agriculture, raw materials, energy sources, computers, lasers, space and aeronautics, high energy physics, and genetics. The four modernizations will be… accomplished, and the strong socialist country that is born at the end of the century will be a keyhole for the promotion of science and technology… Process (FNP). Its purpose is to connect with the Kiautsu University computer (model 108) and then to connect a data terminal. This will make a

  14. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  15. First Encounters of the Close Kind: The Formation Process of Airline Flight Crews

    DTIC Science & Technology

    1987-01-01

    …process and aircrew performance, Foushee notes an interesting etymological parallel: "Webster's New Collegiate Dictionary (1961) defines cockpit as 'a… here combines applications from the physical science of chemistry and the modern science of computers. In chemistry, a shell is a space occupied by

  16. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  17. Translational Bounds for Factorial n and the Factorial Polynomial

    ERIC Educational Resources Information Center

    Mahmood, Munir; Edwards, Phillip

    2009-01-01

    During the period 1729-1826 Bernoulli, Euler, Goldbach and Legendre developed expressions for defining and evaluating "n"! and the related gamma function. Expressions related to "n"! and the gamma function are a common feature in computer science and engineering applications. In the modern computer age, two common tests to…
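
    As an illustration of bounds of this kind, the standard Stirling-type inequalities (a textbook result, not necessarily the authors' exact bounds) bracket n! and are easy to check numerically:

    ```python
    import math

    # Stirling-type bounds on n!:
    #   sqrt(2*pi*n)*(n/e)**n  <=  n!  <=  sqrt(2*pi*n)*(n/e)**n * e**(1/(12*n))
    def stirling_bounds(n):
        base = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
        return base, base * math.exp(1.0 / (12 * n))

    for n in (5, 10, 15):
        lo, hi = stirling_bounds(n)
        assert lo <= math.factorial(n) <= hi
        print(n, lo, math.factorial(n), hi)   # bounds tighten as n grows
    ```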

  18. Modernization (Selected Articles),

    DTIC Science & Technology

    1986-09-18

    …newly developed sciences such as control theory, artificial intelligence, model identification, computer and microelectronics technology, graphic… five "top guns" from around the country specializing in intelligence, mechanics, software and hardware as our technical advisors. In addition

  19. Scientific Visualization: The Modern Oscilloscope for "Seeing the Unseeable" (LBNL Summer Lecture Series)

    ScienceCinema

    Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group

    2018-05-07

    Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  20. Gravitational Many-Body Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makino, J.

    2008-04-29

    In this paper, we briefly review some aspects of the gravitational many-body problem, which is one of the oldest problems in modern mathematical science. We then review our GRAPE project to design computers specialized for this problem.
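
    The computational kernel that GRAPE-style special-purpose hardware accelerates is the direct O(N²) pairwise force summation. A minimal Python sketch (illustrative only, not the GRAPE implementation):

    ```python
    import numpy as np

    def accelerations(pos, mass, G=1.0, eps=1e-3):
        """Direct-summation gravitational accelerations.

        pos: (N, 3) positions, mass: (N,) masses; returns (N, 3) accelerations.
        """
        acc = np.zeros_like(pos)
        for i in range(len(pos)):
            r = pos - pos[i]                              # vectors to all bodies
            d3 = (np.sum(r * r, axis=1) + eps**2) ** 1.5  # softened |r|^3
            d3[i] = np.inf                                # exclude self-force
            acc[i] = G * np.sum((mass / d3)[:, None] * r, axis=0)
        return acc

    rng = np.random.default_rng(0)
    a = accelerations(rng.normal(size=(100, 3)), np.ones(100))
    print(a.shape)   # (100, 3); cost grows as N**2 pair interactions
    ```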

  1. Education through the prism of computation

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2014-03-01

    With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator, we need to add computation to the foundations of scientific education. We will discuss the paradigm shifts it brings to these foundations using the example of the Wolfram Science Summer School. It is one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.

  2. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    ERIC Educational Resources Information Center

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  3. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processor systems, on practically every system), have long been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP and MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to the scientific applications themselves and to understand shortcomings in Eclipse PTP from an application developer perspective, which in turn drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering courses. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse community development. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, in computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately encouraging development and maintenance of testing suites: things that have become commonplace in many software endeavors but have lagged in the development of science applications. We view that the challenge of the increased complexity of both HPC systems and science applications demands the use of better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.

  4. Basic energy sciences: Summary of accomplishments

    NASA Astrophysics Data System (ADS)

    1990-05-01

    For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique user facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.

  5. Basic Energy Sciences: Summary of Accomplishments

    DOE R&D Accomplishments Database

    1990-05-01

    For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy-related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique "user" facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.

  6. Using the Principles of BIO2010 to Develop an Introductory, Interdisciplinary Course for Biology Students

    PubMed Central

    Adams, Peter; Goos, Merrilyn

    2010-01-01

    Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate program. Inspired by the National Research Council's BIO2010 report, a new interdisciplinary first-year course (SCIE1000) was created, incorporating mathematics and computer programming in the context of modern science. In this study, the perceptions of biological science students enrolled in SCIE1000 in 2008 and 2009 are measured. Analysis indicates that, as a result of taking SCIE1000, biological science students gained a positive appreciation of the importance of mathematics in their discipline. However, the data revealed that SCIE1000 did not contribute positively to gains in appreciation for computing and only slightly influenced students' motivation to enroll in upper-level quantitative-based courses. Further comparisons between 2008 and 2009 demonstrated the positive effect of using genuine, real-world contexts to enhance student perceptions toward the relevance of mathematics. The results support the recommendation from BIO2010 that mathematics should be introduced to biology students in first-year courses using real-world examples, while challenging the benefits of introducing programming in first-year courses. PMID:20810961

  7. A High School-Collegiate Outreach Program in Chemistry and Biology Delivering Modern Technology in a Mobile Van

    NASA Astrophysics Data System (ADS)

    Craney, Chris; Mazzeo, April; Lord, Kaye

    1996-07-01

    During the past five years the nation's concern for science education has expanded from a discussion about the future supply of Ph.D. scientists and its impact on the nation's scientific competitiveness to the broader consideration of the science education available to all students. Efforts to improve science education have led many authors to suggest greater collaboration between high school science teachers and their college/university colleagues. This article reviews the experience and outcomes of the Teachers + Occidental = Partnership in Science (TOPS) van program operating in the Los Angeles Metropolitan area. The program emphasizes an extensive ongoing staff development, responsiveness to teachers' concerns, technical and on-site support, and sustained interaction between participants and program staff. Access to modern technology, including computer-driven instruments and commercial data analysis software, coupled with increased teacher content knowledge has led to empowerment of teachers and changes in student interest in science. Results of student and teacher questionnaires are reviewed.

  8. Influence of Computer-Aided Assessment on Ways of Working with Mathematics

    ERIC Educational Resources Information Center

    Rønning, Frode

    2017-01-01

    This paper is based on an on-going project for modernizing the basic education in mathematics for engineers at the Norwegian University of Science and Technology. One of the components in the project is using a computer-aided assessment system (Maple T.A.) for handling students' weekly hand-ins. Successful completion of a certain number of problem…

  9. JPRS Report, Science & Technology, USSR: Computers, Control Systems and Machines

    DTIC Science & Technology

    1989-03-14

    …optimizatsii slozhnykh sistem (Coding Theory and Complex System Optimization). Alma-Ata, Nauka Press, 1977, pp. 8-16. 11. Author's certificate number… Interpreter Specifics [O. I. Amvrosova], p. 141. Creation of Modern Computer Systems for Complex Ecological… processor can be designed to decrease degradation upon failure and assure more reliable processor operation, without requiring more complex software or

  10. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

    A year-long (Aug 2016 - May 2017) research program on ‘Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)’ is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way by each of these groups will be given. Overlaps among the various working groups will also be highlighted. How the wider astronomy community can both participate in and benefit from these activities will be briefly mentioned.

  11. Applications of modern statistical methods to analysis of data in physical science

    NASA Astrophysics Data System (ADS)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic-algorithm-based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We showcase how these algorithms can be used to process multivariate data from astronomical observations.
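
    To illustrate the flavor of information-based model evaluation the dissertation advocates (a minimal sketch on synthetic data, not the author's code), one can score regression models of increasing complexity with AIC and pick the minimum:

    ```python
    import numpy as np

    # Synthetic data from a quadratic model with small noise.
    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 50)
    y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.05, size=x.size)

    def aic(x, y, order):
        """AIC = n*ln(RSS/n) + 2k for a least-squares polynomial fit."""
        coef = np.polyfit(x, y, order)
        rss = np.sum((y - np.polyval(coef, x)) ** 2)
        k = order + 2                     # coefficients plus noise variance
        return x.size * np.log(rss / x.size) + 2 * k

    scores = {order: aic(x, y, order) for order in range(1, 6)}
    print(min(scores, key=scores.get))    # expect order 2, the true model
    ```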

  12. Structural biology computing: Lessons for the biomedical research sciences.

    PubMed

    Morin, Andrew; Sliz, Piotr

    2013-11-01

    The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.

  13. Non-parallel processing: Gendered attrition in academic computer science

    NASA Astrophysics Data System (ADS)

    Cohoon, Joanne Louise Mcgrath

    2000-10-01

    This dissertation addresses the issue of disproportionate female attrition from computer science as an instance of gender segregation in higher education. By adopting a theoretical framework from organizational sociology, it demonstrates that the characteristics and processes of computer science departments strongly influence female retention. The empirical data identifies conditions under which women are retained in the computer science major at comparable rates to men. The research for this dissertation began with interviews of students, faculty, and chairpersons from five computer science departments. These exploratory interviews led to a survey of faculty and chairpersons at computer science and biology departments in Virginia. The data from these surveys are used in comparisons of the computer science and biology disciplines, and for statistical analyses that identify which departmental characteristics promote equal attrition for male and female undergraduates in computer science. This three-pronged methodological approach of interviews, discipline comparisons, and statistical analyses shows that departmental variation in gendered attrition rates can be explained largely by access to opportunity, relative numbers, and other characteristics of the learning environment. Using these concepts, this research identifies nine factors that affect the differential attrition of women from CS departments. These factors are: (1) The gender composition of enrolled students and faculty; (2) Faculty turnover; (3) Institutional support for the department; (4) Preferential attitudes toward female students; (5) Mentoring and supervising by faculty; (6) The local job market, starting salaries, and competitiveness of graduates; (7) Emphasis on teaching; and (8) Joint efforts for student success. This work contributes to our understanding of the gender segregation process in higher education. In addition, it contributes information that can lead to effective solutions for an economically significant issue in modern American society---gender equality in computer science.

  14. Integrated environmental modeling: A vision and roadmap for the future

    EPA Science Inventory

    Integrated environmental modeling (IEM) is inspired by modern environmental problems, decisions, and policies and enabled by transdisciplinary science and computer capabilities that allow the environment to be considered in a holistic way. The problems are characterized by the ex...

  15. Formal methods in computer system design

    NASA Astrophysics Data System (ADS)

    Hoare, C. A. R.

    1989-12-01

    This note expounds a philosophy of engineering design which is stimulated, guided and checked by mathematical calculations and proofs. Its application to software engineering promises the same benefits as those derived from the use of mathematics in all other branches of modern science.

  16. JPRS Report, Science & Technology, USSR: Science & Technology Policy

    DTIC Science & Technology

    1988-09-23

    …number of library personnel for preparing survey-analytical references, but by equipping them with modern computer hardware for acquiring information… of manpower, material, technical, and financial resources and limits of capital investments and planning, surveying, and contractual work, which… USSR State Prize for the development and introduction of a technology for the production of shampoo from fish protein. During the period under review

  17. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, and integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete a LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.
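
    For illustration, here is a minimal sketch of the kind of transformation a parser like VERAIn performs, turning a block-structured text input deck into XML for downstream codes. The deck syntax and XML layout below are hypothetical stand-ins, not the actual VERA Input format:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical input deck: 'key value' lines grouped under [block] headers.
    deck = """\
    [CORE]
      power 3411.0
      rated_flow 131.7
    [ASSEMBLY]
      npin 17
    """

    root = ET.Element("ParameterList", name="case")
    block = None
    for line in deck.splitlines():
        line = line.strip()
        if line.startswith("["):                 # start a new block
            block = ET.SubElement(root, "ParameterList", name=line.strip("[]"))
        elif line and block is not None:         # 'key value' entry
            key, value = line.split(None, 1)
            ET.SubElement(block, "Parameter", name=key, value=value)

    print(ET.tostring(root, encoding="unicode"))
    ```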

  18. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuller, Ivan K.; Stevens, Rick; Pino, Robinson

    2015-10-29

    Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS-based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: the development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the roundtable considered: the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; the new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; the device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  19. Computational sciences in the upstream oil and gas industry

    PubMed Central

    Halsey, Thomas C.

    2016-01-01

    The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
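
    The inversion idea underlying full wavefield inversion can be sketched in a few lines: choose the model m that minimizes the misfit ||F(m) - d||^2 between simulated and recorded data. A toy sketch with a linear stand-in for the wave simulator (illustrative only; real FWI uses PDE solvers and adjoint methods):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    F = rng.normal(size=(80, 20))            # stand-in forward operator
    m_true = rng.normal(size=20)             # "true" subsurface model
    d = F @ m_true                           # recorded (noise-free) data

    m = np.zeros(20)
    step = 1.0 / np.linalg.norm(F, 2) ** 2   # stable gradient-descent step
    for _ in range(2000):
        m -= step * F.T @ (F @ m - d)        # gradient of 0.5*||F m - d||^2
    print(np.linalg.norm(m - m_true))        # model error shrinks toward ~0
    ```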

  20. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  1. A Multidisciplined Teaching Reform of Biomaterials Course for Undergraduate Students

    NASA Astrophysics Data System (ADS)

    Li, Xiaoming; Zhao, Feng; Pu, Fang; Liu, Haifeng; Niu, Xufeng; Zhou, Gang; Li, Deyu; Fan, Yubo; Feng, Qingling; Cui, Fu-zhai; Watari, Fumio

    2015-12-01

    Biomaterials science has advanced rapidly alongside global science and technology over recent decades, and experts predict that its role in medicine and health care will only grow. Although the three traditional subjects that scaffold biomaterials science (medical science, materials science, and biology) remain essential to its research and education, other subjects, such as mechanical engineering, mechanics, computer science, automation science, nanotechnology, and Bio-MEMS, are playing increasingly important roles in the field's modern development. The research and education of modern biomaterials science therefore require a logical integration of all of these disciplines, not only medical science, materials science, and biology. This article focuses on the multidisciplinary nature of biomaterials, awareness of which is currently lacking in education at the undergraduate stage. To meet this educational challenge, we presented a multidisciplinary course for senior biomaterials undergraduates that covers not only the traditional sciences but also frontier sciences, lasts a whole academic year, and aims to give students a better understanding of modern biomaterials science and to prepare them for future developments in the area. Questionnaires and specific "before and after" comments show that the course is well received by participants, and it has gained high recognition and persistent support from our university. The idea of this course might also suit the education and construction of some other disciplines.

  2. The modern library: lost and found.

    PubMed Central

    Lindberg, D A

    1996-01-01

    The modern library, a term that was heard frequently in the mid-twentieth century, has fallen into disuse. The over-promotion of computers and all that their enthusiasts promised probably hastened its demise. Today, networking is transforming how libraries provide--and users seek--information. Although the Internet is the natural environment for the health sciences librarian, it is going through growing pains as we face issues of censorship and standards. Today's "modern librarian" must not only be adept at using the Internet but must become familiar with digital information in all its forms--images, full text, and factual data banks. Most important, to stay "modern," today's librarians must embark on a program of lifelong learning that will enable them to make optimum use of the advantages offered by modern technology. PMID:8938334

  3. Astrobiology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Oliveira, C.

    2008-02-01

    We live in a scientific world. Science is all around us. We take scientific principles for granted every time we use a piece of technological apparatus, such as a car, a computer, or a cellphone. In today's world, citizens frequently have to make decisions that require them to have some basic scientific knowledge. To be a contributing citizen in a modern democracy, a person needs to understand the general principles of science.

  4. Clocks to Computers: A Machine-Based “Big Picture” of the History of Modern Science.

    PubMed

    van Lunteren, Frans

    2016-12-01

    Over the last few decades there have been several calls for a “big picture” of the history of science. There is a general need for a concise overview of the rise of modern science, with a clear structure allowing for a rough division into periods. This essay proposes such a scheme, one that is both elementary and comprehensive. It focuses on four machines, which can be seen to have mediated between science and society during successive periods of time: the clock, the balance, the steam engine, and the computer. Following an extended developmental phase, each of these machines came to play a highly visible role in Western societies, both socially and economically. Each of these machines, moreover, was used as a powerful resource for the understanding of both inorganic and organic nature. More specifically, their metaphorical use helped to construe and refine some key concepts that would play a prominent role in such understanding. In each case the key concept would at some point be considered to represent the ultimate building block of reality. Finally, in a refined form, each of these machines would eventually make its entry in scientific research, thereby strengthening the ties between these machines and nature.

  5. A Multidimensional Software Engineering Course

    ERIC Educational Resources Information Center

    Barzilay, O.; Hazzan, O.; Yehudai, A.

    2009-01-01

    Software engineering (SE) is a multidimensional field that involves activities in various areas and disciplines, such as computer science, project management, and system engineering. Though modern SE curricula include designated courses that address these various subjects, an advanced summary course that synthesizes them is still missing. Such a…

  6. Design and Implementation of a Modern Automatic Deformation Monitoring System

    NASA Astrophysics Data System (ADS)

    Engel, Philipp; Schweimler, Björn

    2016-03-01

    The deformation monitoring of structures and buildings is an important task of modern engineering surveying, ensuring the stability and reliability of the supervised objects over long periods. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition to them, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well-known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It will be discussed how the development effort of networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  7. Turbulence Model Effects on Cold-Gas Lateral Jet Interaction in a Supersonic Crossflow

    DTIC Science & Technology

    2014-06-01

    The authors acknowledge high-performance computing time from the U.S. Department of Defense (DOD) High Performance Computing Modernization Program at the U.S. Army Research Laboratory, and thank Dr. Ross Chaplin, Defence Science and Technology Laboratory, United Kingdom (UK), and Dr. David MacManus and Robert Christie, Cranfield University, UK.

  8. Cultural stereotypes as gatekeepers: increasing girls' interest in computer science and engineering by diversifying stereotypes.

    PubMed

    Cheryan, Sapna; Master, Allison; Meltzoff, Andrew N

    2015-01-01

    Despite having made significant inroads into many traditionally male-dominated fields (e.g., biology, chemistry), women continue to be underrepresented in computer science and engineering. We propose that students' stereotypes about the culture of these fields (including the kind of people, the work involved, and the values of the field) steer girls away from choosing to enter them. Computer science and engineering are stereotyped in modern American culture as male-oriented fields that involve social isolation, an intense focus on machinery, and inborn brilliance. These stereotypes are compatible with qualities that are typically more valued in men than women in American culture. As a result, when computer science and engineering stereotypes are salient, girls report less interest in these fields than their male peers. However, altering these stereotypes (by broadening the representation of the people who do this work, the work itself, and the environments in which it occurs) significantly increases girls' sense of belonging and interest in the field. Academic stereotypes thus serve as gatekeepers, driving girls away from certain fields and constraining their learning opportunities and career aspirations.

  10. Educational process in modern climatology within the web-GIS platform "Climate"

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    These days, the problem of training scientists, common to all scientific fields, is exacerbated in the environmental sciences by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for the information support of integrated geoscience research was created on the basis of modern information and computational technologies, and the software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to the direct analysis of geophysical data archives, the platform is aimed at teaching the basics of studying regional climate change. Its educational component includes a series of lectures on climate, environmental, and meteorological modeling, and cycles of laboratory work on the basics of analyzing current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated SB RAS project #131.

  11. Building a Data Science capability for USGS water research and communication

    NASA Astrophysics Data System (ADS)

    Appling, A.; Read, E. K.

    2015-12-01

    Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.

  12. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  13. Research Experiences for 14 Year Olds: preliminary report on the `Sky Explorer' pilot program at Springfield (MA) High School of Science and Technology

    NASA Astrophysics Data System (ADS)

    Tucker, G. E.

    1997-05-01

    This NSF-supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14 year old students are using a small (21 cm) computer-controlled telescope and CCD camera (1) to conduct a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them and accumulating a portfolio of images; (2) to perform photometry of variable stars, reducing the data to obtain light curves; and (3) to learn modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest to astronomy and science in general and have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents, and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general and astronomy in particular in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.

  14. Preparing Students for Careers in Science and Industry with Computational Physics

    NASA Astrophysics Data System (ADS)

    Florinski, V. A.

    2011-12-01

    Funded by an NSF CAREER grant, the University of Alabama in Huntsville (UAH) has launched a new graduate program in Computational Physics. It is universally accepted that today's physics is done on a computer. The program blurs the boundary between physics and computer science by teaching students modern, practical techniques for solving difficult physics problems on diverse computational platforms. Currently consisting of two courses first offered in the Fall of 2011, the program will eventually include 5 courses covering methods for fluid dynamics, particle transport via stochastic methods, and hybrid and PIC plasma simulations. UAH's unique location allows the courses to be shaped through discussions with faculty, NASA/MSFC researchers, and local R&D business representatives, i.e., the potential employers of the program's graduates. The students currently participating in the program have all begun their research careers in space and plasma physics; many are presenting their research at this meeting.

  15. Computational Physics in a Nutshell

    NASA Astrophysics Data System (ADS)

    Schillaci, Michael

    2001-11-01

    Too often, students of science are expected to "pick up" what they need to know about the art of science. A description of the two-semester Computational Physics course taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: problem solving, programming, and presentation. Using FORTRAN, LaTeX2e, MAPLE V, HTML, and JAVA, students learn the fundamentals of algorithm development, how to use classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.

  16. Hands-on approach to teaching Earth system sciences using a information-computational web-GIS portal "Climate"

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Gorbatenko, Valentina; Martynova, Yulia; Shulgina, Tamara

    2014-05-01

    Making education relevant to workplace tasks is a key problem of higher education, because old-school training programs are not keeping pace with the rapidly changing professional field of the environmental sciences. A joint group of specialists from Tomsk State University and the Siberian Center for Environmental Research and Training/IMCES SB RAS developed several new courses for students of the "Climatology" and "Meteorology" specialties that combine theoretical knowledge from up-to-date environmental sciences with practical tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which allows text and multimedia to be combined in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational platform "Climate" (http://climate.scert.ru/) using web-GIS tools. These trainings contain practical tasks on climate modeling and on the assessment and analysis of climate change, and they are performed with the same tools that scientists typically use for such research. Students are thus engaged in the use of modern tools of geophysical data analysis, which invigorates their professional learning. The hands-on approach helps to fill the gap between education and practice because it offers experience, increases student involvement, and advances the use of modern information and communication tools. The courses are implemented at Tomsk State University and help to form a modern curriculum in the Earth system science area. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.

  17. The application of computer image analysis in life sciences and environmental engineering

    NASA Astrophysics Data System (ADS)

    Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.

    2014-04-01

    The main aim of the article was to present research on the application of computer image analysis in life sciences and environmental engineering. The authors used different methods of computer image analysis in developing an innovative biotest for modern biomonitoring of water quality. The tools created were based on living organisms, the bioindicators Lemna minor L. and Hydra vulgaris Pallas, together with computer image analysis of the organisms' negative reactions during exposure to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but also in environmental engineering and agriculture, in studies of the adverse effects of various compounds used in agriculture and industry on water quality.
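
    A minimal sketch of the kind of image-analysis step such a biotest relies on, assuming the OpenCV library; the file name, HSV thresholds, and endpoint below are hypothetical placeholders, since the authors' actual pipeline is not specified in this abstract:

      import cv2
      import numpy as np

      # Hypothetical illustration: estimate the green frond area of Lemna minor
      # from a top-down photograph, a common growth-inhibition proxy in acute
      # toxicity tests. File name and HSV thresholds are placeholders.
      img = cv2.imread("lemna_day7.png")
      hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

      # Keep pixels whose hue/saturation/value fall in a "healthy green" band,
      # then remove speckle noise with a morphological opening.
      mask = cv2.inRange(hsv, np.array([35, 60, 60]), np.array([85, 255, 255]))
      mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

      area_px = int(cv2.countNonZero(mask))
      print(f"green frond area: {area_px} px")
      # Comparing areas between exposed and control cultures yields a simple
      # inhibition endpoint for the bioassay.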

  18. Provenance Challenges for Earth Science Dataset Publication

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2011-01-01

    Modern science is increasingly dependent on computational analysis of very large data sets. Organizing, referencing, publishing those data has become a complex problem. Published research that depends on such data often fails to cite the data in sufficient detail to allow an independent scientist to reproduce the original experiments and analyses. This paper explores some of the challenges related to data identification, equivalence and reproducibility in the domain of data intensive scientific processing. It will use the example of Earth Science satellite data, but the challenges also apply to other domains.

  19. Educational Assessment Using Intelligent Systems. Research Report. ETS RR-08-68

    ERIC Educational Resources Information Center

    Shute, Valerie J.; Zapata-Rivera, Diego

    2008-01-01

    Recent advances in educational assessment, cognitive science, and artificial intelligence have made it possible to integrate valid assessment and instruction in the form of modern computer-based intelligent systems. These intelligent systems leverage assessment information that is gathered from various sources (e.g., summative and formative). This…

  20. Rethinking Technology-Enhanced Physics Teacher Education: From Theory to Practice

    ERIC Educational Resources Information Center

    Milner-Bolotin, Marina

    2016-01-01

    This article discusses how modern technology, such as electronic response systems, PeerWise system, data collection and analysis tools, computer simulations, and modeling software can be used in physics methods courses to promote teacher-candidates' professional competencies and their positive attitudes about mathematics and science education. We…

  1. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almgren, Ann; DeMar, Phil; Vetter, Jeffrey

    The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing and Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  2. Brønsted acidity of protic ionic liquids: a modern ab initio valence bond theory perspective.

    PubMed

    Patil, Amol Baliram; Mahadeo Bhanage, Bhalchandra

    2016-09-21

    Room-temperature ionic liquids (ILs), especially protic ionic liquids (PILs), are used in many areas of the chemical sciences. Ionicity, the extent of proton transfer, is a key parameter which determines many physicochemical properties and in turn the suitability of PILs for various applications. The spectrum of computational chemistry techniques applied to investigate ionic liquids includes classical molecular dynamics, Monte Carlo simulations, ab initio molecular dynamics, density functional theory (DFT), CCSD(T), etc. At the other end of the spectrum is another computational approach: modern ab initio valence bond theory (VBT). VBT differs from molecular orbital theory based methods in the expression of the molecular wave function. The molecular wave function in the valence bond ansatz is expressed as a linear combination of valence bond structures. These structures include covalent and ionic structures explicitly. Calculations on representative primary and tertiary ammonium protic ionic liquids indicate that modern ab initio valence bond theory can be employed to assess the acidity and ionicity of protic ionic liquids a priori.
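
    The valence bond ansatz mentioned above can be written compactly. As a textbook illustration (not an equation reproduced from this paper), the wave function and its covalent/ionic decomposition for H2 read

      \Psi_{\mathrm{VB}} = \sum_K C_K \Phi_K, \qquad
      \Psi_{\mathrm{H}_2} = C_{\mathrm{cov}}\,\Phi(\mathrm{H\!-\!H}) + C_{\mathrm{ion}}\left[\Phi(\mathrm{H^+ H^-}) + \Phi(\mathrm{H^- H^+})\right],

    and structure weights such as the Chirgwin-Coulson weight W_K = C_K \sum_L S_{KL} C_L quantify how ionic the description is, which is what makes an a priori assessment of ionicity possible.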

  3. A synthetic design environment for ship design

    NASA Technical Reports Server (NTRS)

    Chipman, Richard R.

    1995-01-01

    Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the process of design. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state-of-the-art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people with an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.

  4. Astronomy and Computing: A new journal for the astronomical computing community

    NASA Astrophysics Data System (ADS)

    Accomazzi, Alberto; Budavári, Tamás; Fluke, Christopher; Gray, Norman; Mann, Robert G.; O'Mullane, William; Wicenec, Andreas; Wise, Michael

    2013-02-01

    We introduce Astronomy and Computing, a new journal for the growing population of people working in the domain where astronomy overlaps with computer science and information technology. The journal aims to provide a new communication channel within that community, which is not well served by current journals, and to help secure recognition of its true importance within modern astronomy. In this inaugural editorial, we describe the rationale for creating the journal, outline its scope and ambitions, and seek input from the community in defining in detail how the journal should work towards its high-level goals.

  5. Levitating Trains and Kamikaze Genes: Technological Literacy for the Future

    NASA Astrophysics Data System (ADS)

    Brennan, Richard P.

    1994-08-01

    A lively survey of the horizons of modern technology. Provides easy-to-read summaries of the state of the art in space science, biotechnology, computer science, exotic energy sources and materials engineering as well as life-enhancing medical advancements and environmental, transportation and defense/weapons technologies. Each chapter explains how a current or future technology works and provides an understanding of the underlying scientific concepts. Includes an extensive self-test to review your knowledge.

  6. Conflict Management in Collaborative Engineering Design: Basic Research in Fundamental Theory, Modeling Framework, and Computer Support for Collaborative Engineering Activities

    DTIC Science & Technology

    2002-01-01

    behaviors are influenced by social interactions, and to how modern IT systems should be designed to support these group technical activities. The...engineering disciplines to behavior, decision, psychology, organization, and the social sciences. “Conflict management activity in collaborative...Researchers instead began to search for an entirely new paradigm, starting from a theory in social science, to construct a conceptual framework to describe

  7. Application of SLURM, BOINC, and GlusterFS as Software System for Sustainable Modeling and Data Analytics

    NASA Astrophysics Data System (ADS)

    Kashansky, Vladislav V.; Kaftannikov, Igor L.

    2018-02-01

    Modern numerical modeling experiments and data analytics problems in various fields of science and technology reveal a wide variety of serious requirements for distributed computing systems. Scientific computing projects sometimes exceed the limits of the available resource pool, requiring extra scalability and sustainability. In this paper we share our own experience and findings on combining the power of SLURM, BOINC, and GlusterFS as a software system for scientific computing. In particular, we suggest a complete architecture and highlight important aspects of systems integration.

  8. Digital pathology in nephrology clinical trials, research, and pathology practice.

    PubMed

    Barisoni, Laura; Hodgin, Jeffrey B

    2017-11-01

    In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.

  9. Community Coordinated Modeling Center: A Powerful Resource in Space Science and Space Weather Education

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.

    2015-12-01

    Community Coordinated Modeling Center (CCMC) is a NASA affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classroom settings, student projects, and scientific labs and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities for students majoring in computer science and computer engineering to intern with software engineers at the CCMC while also learning about space weather from NASA scientists.

  10. Discover the Cosmos - Bringing Cutting Edge Science to Schools across Europe

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    2015-03-01

    The fast-growing number of science data repositories is opening enormous possibilities to scientists all over the world. The emergence of citizen science projects is engaging large numbers of citizens around the globe in scientific discovery. Astronomical research is now possible for anyone with a computer and some form of data access. This opens a very interesting and strategic possibility: engaging large audiences in the making and understanding of science. From another perspective, it is natural to imagine that soon data mining will be an active part of the academic path of university or even secondary school students. The possibility is very exciting but the road not very promising: even in the most developed nations, where all schools are equipped with modern ICT facilities, the use of such possibilities is still rare. The Galileo Teacher Training Program (GTTP), a legacy of IYA2009, is participating in some of the most emblematic projects funded by the European Commission targeting modern tools, resources, and methodologies for science teaching. One of these projects is Discover the Cosmos, which aims to empower educators with the necessary skills to embark on this innovative path: teaching science while doing science.

  11. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  12. Mechanical Modeling and Computer Simulation of Protein Folding

    ERIC Educational Resources Information Center

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  13. The Effectiveness of Screencasts and Cognitive Tools as Scaffolding for Novice Object-Oriented Programmers

    ERIC Educational Resources Information Center

    Lee, Mark J. W.; Pradhan, Sunam; Dalgarno, Barney

    2008-01-01

    Modern information technology and computer science curricula employ a variety of graphical tools and development environments to facilitate student learning of introductory programming concepts and techniques. While the provision of interactive features and the use of visualization can enhance students' understanding and assist them in grasping…

  14. Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course

    ERIC Educational Resources Information Center

    Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil

    2016-01-01

    A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…

  15. Engaging College Science Students and Changing Academic Achievement with Technology: A Quasi-Experimental Preliminary Investigation

    ERIC Educational Resources Information Center

    Carle, Adam C.; Jaffee, David; Miller, Deborah

    2009-01-01

    Can modern, computer-based technology engage college students and improve their academic achievement in college? Although numerous examples detail technology's classroom uses, few studies empirically examine whether technologically oriented pedagogical changes factually lead to positive outcomes among college students. In this pilot study, we used…

  16. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  17. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.

  18. Bioinformatics.

    PubMed

    Moore, Jason H

    2007-11-01

    Bioinformatics is an interdisciplinary field that blends computer science and biostatistics with biological and biomedical sciences such as biochemistry, cell biology, developmental biology, genetics, genomics, and physiology. An important goal of bioinformatics is to facilitate the management, analysis, and interpretation of data from biological experiments and observational studies. The goal of this review is to introduce some of the important concepts in bioinformatics that must be considered when planning and executing a modern biological research study. We review database resources as well as data mining software tools.

  19. A brief history of the most remarkable numbers e, i and γ in mathematical sciences with applications

    NASA Astrophysics Data System (ADS)

    Debnath, Lokenath

    2015-08-01

    This paper deals with a brief history of the most remarkable Euler numbers e, i and γ in the mathematical sciences. Included are many properties of the constants e, i and γ and their applications in algebra, geometry, physics, chemistry, ecology, business and industry. Special attention is given to growth and decay phenomena in many real-world problems, including the stability and instability of their solutions. Some specific and modern applications of logarithms, complex numbers and complex exponential functions to electrical circuits and mechanical systems are presented with examples. Also included is the use of complex numbers and complex functions in the description and analysis of chaos and fractals with the aid of modern computer technology. In addition, the phasor method is described with examples of applications in engineering science. The major focus of this paper is to provide, through a historical approach to mathematics teaching and learning, the fundamental knowledge and skills required by students and teachers at all levels, so that they can understand the concepts of mathematics and of mathematics education in science and technology.
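
    As a concrete instance of the phasor method mentioned above (a standard example, not one reproduced from the paper), a series RLC circuit driven by v(t) = V_0 e^{i\omega t} has impedance and current

      Z = R + i\left(\omega L - \frac{1}{\omega C}\right), \qquad I(t) = \frac{V_0}{Z}\, e^{i\omega t}, \qquad |I| = \frac{V_0}{\sqrt{R^2 + (\omega L - 1/\omega C)^2}},

    and the physical current is the real part of I(t); the complex exponential reduces the circuit's differential equation to algebra in Z.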

  20. Designing and Implementing a Computational Methods Course for Upper-level Undergraduates and Postgraduates in Atmospheric and Oceanic Sciences

    NASA Astrophysics Data System (ADS)

    Nelson, E.; L'Ecuyer, T. S.; Douglas, A.; Hansen, Z.

    2017-12-01

    In the modern computing age, scientists must utilize a wide variety of skills to carry out scientific research. Programming, including a focus on collaborative development, has become more prevalent in both academic and professional career paths. Faculty in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin—Madison recognized this need and recently approved a new course offering for undergraduates and postgraduates in computational methods that was first held in Spring 2017. Three programming languages were covered in the inaugural course semester and development themes such as modularization, data wrangling, and conceptual code models were woven into all of the sections. In this presentation, we will share successes and challenges in developing a research project-focused computational course that leverages hands-on computer laboratory learning and open-sourced course content. Improvements and changes in future iterations of the course based on the first offering will also be discussed.

  1. Why Machine-Information Metaphors are Bad for Science and Science Education

    NASA Astrophysics Data System (ADS)

    Pigliucci, Massimo; Boudry, Maarten

    2011-05-01

    Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of "blueprints" for the construction of organisms. Likewise, cells are often characterized as "factories" and organisms themselves become analogous to machines. Accordingly, when the human genome project was initially announced, the promise was that we would soon know how a human being is made, just as we know how to make airplanes and buildings. Importantly, modern proponents of Intelligent Design, the latest version of creationism, have exploited biologists' use of the language of information and blueprints to make their spurious case, based on pseudoscientific concepts such as "irreducible complexity" and on flawed analogies between living cells and mechanical factories. However, the living organism = machine analogy was criticized already by David Hume in his Dialogues Concerning Natural Religion. In line with Hume's criticism, over the past several years a more nuanced and accurate understanding of what genes are and how they operate has emerged, ironically in part from the work of computational scientists who take biology, and in particular developmental biology, more seriously than some biologists seem to do. In this article we connect Hume's original criticism of the living organism = machine analogy with the modern ID movement, and illustrate how the use of misleading and outdated metaphors in science can play into the hands of pseudoscientists. Thus, we argue that dropping the blueprint and similar metaphors will improve both the science of biology and its understanding by the general public.

  2. Cellular automaton supercomputing

    NASA Technical Reports Server (NTRS)

    Wolfram, Stephen

    1987-01-01

    Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
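
    An elementary cellular automaton is perhaps the simplest instance of the directly computer-suited models discussed here. The sketch below is an illustrative toy (the rule number, lattice width, and step count are arbitrary choices) that evolves Wolfram's Rule 30 from a single seed cell:

      import numpy as np

      def step(cells: np.ndarray, rule: int = 30) -> np.ndarray:
          """One synchronous update of an elementary CA with periodic boundaries."""
          left, right = np.roll(cells, 1), np.roll(cells, -1)
          idx = (left << 2) | (cells << 1) | right   # 3-cell neighborhood as a 3-bit index
          table = np.array([(rule >> k) & 1 for k in range(8)], dtype=cells.dtype)
          return table[idx]                          # look up the rule's output bit

      width, steps = 64, 32
      row = np.zeros(width, dtype=np.uint8)
      row[width // 2] = 1                            # single seed cell
      for _ in range(steps):
          print("".join("#" if c else "." for c in row))
          row = step(row)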

  3. Equation-free and variable free modeling for complex/multiscale systems. Coarse-grained computation in science and engineering using fine-grained models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevrekidis, Ioannis G.

    The work explored the linking of modern machine learning techniques (manifold learning, and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments: in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
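
    For orientation, the core diffusion-map computation this work builds on is compact enough to sketch. The following is a generic, minimal version (the kernel bandwidth and test data are illustrative assumptions, not the project's actual code):

      import numpy as np

      def diffusion_map(X: np.ndarray, eps: float, n_coords: int = 2):
          """Basic diffusion map: Gaussian kernel -> row-normalized Markov matrix
          -> leading nontrivial eigenvectors as low-dimensional coordinates."""
          d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
          W = np.exp(-d2 / eps)                                # affinity matrix
          M = W / W.sum(axis=1, keepdims=True)                 # Markov transition matrix
          vals, vecs = np.linalg.eig(M)
          order = np.argsort(-vals.real)                       # sort by eigenvalue
          vals, vecs = vals.real[order], vecs.real[:, order]
          # Skip the trivial constant eigenvector (eigenvalue 1).
          return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1]

      # Example: points on a noisy circle embed onto their intrinsic angle.
      theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
      X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * np.random.randn(200, 2)
      coords = diffusion_map(X, eps=0.1)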

  4. Algorithms in Modern Mathematics and Computer Science.

    DTIC Science & Technology

    1980-01-01

    importance, since we will go on doing what we are doing no matter what it is called; after all, other disciplines like Mathematics and Chemistry are no longer related very strongly to the etymology of their names. However, if I had a chance to vote for the name of my own discipline, I would choose to call

  5. Modern Physics Simulations

    NASA Astrophysics Data System (ADS)

    Brandt, Douglas; Hiller, John R.; Moloney, Michael J.

    1995-10-01

    The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Wave and Optics.

  6. Telescience workstation

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael

    1989-01-01

    As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment, designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing, was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.

  7. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE PAGES

    Wang, Bei; Ethier, Stephane; Tang, William; ...

    2017-06-29

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
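
    For readers outside the field, the particle-in-cell cycle underlying GTC-P reduces, in its simplest 1D electrostatic form, to deposit, solve, and gather steps. The toy sketch below (with illustrative normalizations and sizes; a gross simplification of a 5D gyrokinetic code such as GTC-P) shows the cloud-in-cell interpolation whose memory-access patterns motivate the decompositions described in the abstract:

      import numpy as np

      # Toy 1D electrostatic PIC step: cloud-in-cell (CIC) charge deposition,
      # periodic Poisson solve via FFT, and field gather back to the particles.
      ng, n_part, L = 64, 10000, 1.0
      dx = L / ng
      x = np.random.rand(n_part) * L           # particle positions
      q = -1.0 / n_part                        # normalized charge per particle

      # Deposit: each particle splits its charge between its two nearest grid points.
      g = np.floor(x / dx).astype(int)
      frac = x / dx - g
      rho = np.zeros(ng)
      np.add.at(rho, g % ng, q * (1 - frac) / dx)
      np.add.at(rho, (g + 1) % ng, q * frac / dx)

      # Solve d^2 phi / dx^2 = -rho with periodic boundaries (normalized units).
      k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
      rho_hat = np.fft.fft(rho)
      phi_hat = np.zeros_like(rho_hat)
      phi_hat[1:] = rho_hat[1:] / k[1:] ** 2   # k=0 mode fixed by charge neutrality
      E = np.fft.ifft(-1j * k * phi_hat).real  # E = -dphi/dx, via Fourier derivative

      # Gather: interpolate E back to particle positions with the same CIC weights.
      E_p = E[g % ng] * (1 - frac) + E[(g + 1) % ng] * frac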

  9. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Tyupikova, T. V.; Samoilov, V. N.

    2003-04-01

    Modern information technologies spur the natural sciences to further development, but this development must be accompanied by an evolution of infrastructure that creates favorable conditions for scientific work and for the financial base needed to substantiate and legally protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization, and control of shared databases, taking as an example the electronic document-handling system that operates in several departments of the Joint Institute for Nuclear Research.

  10. A Debate over the Teaching of a Legacy Programming Language in an Information Technology (IT) Program

    ERIC Educational Resources Information Center

    Ali, Azad; Smith, David

    2014-01-01

    This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Among the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…

  11. Heliophysics Legacy Data Restoration

    NASA Astrophysics Data System (ADS)

    Candey, R. M.; Bell, E. V., II; Bilitza, D.; Chimiak, R.; Cooper, J. F.; Garcia, L. N.; Grayzeck, E. J.; Harris, B. T.; Hills, H. K.; Johnson, R. C.; Kovalick, T. J.; Lal, N.; Leckner, H. A.; Liu, M. H.; McCaslin, P. W.; McGuire, R. E.; Papitashvili, N. E.; Rhodes, S. A.; Roberts, D. A.; Yurow, R. E.

    2016-12-01

    The Space Physics Data Facility (SPDF), in collaboration with the NASA Space Science Data Coordinated Archive (NSSDCA), is converting datasets from older NASA missions to online storage. Valuable science is still buried within these datasets, particularly when modern algorithms are applied on computers with vastly more storage and processing power than were available when the data were originally measured, and when the data are analyzed in conjunction with other data and models. The data were also not readily accessible as archived on 7- and 9-track tapes, microfilm, microfiche and other media. Although many datasets have now been moved online in formats that are readily analyzed, others will still require some deciphering to puzzle out the data values and their scientific meaning. There is an ongoing effort to convert the datasets to the modern Common Data Format (CDF) and to add metadata for use in browse and analysis tools such as CDAWeb.
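
    As a concrete sketch of the kind of conversion described, one commonly used Python route to CDF is spacepy.pycdf; the mission, variable, and attribute values below are hypothetical stand-ins, not SPDF's actual pipeline.

      import datetime
      import numpy as np
      from spacepy import pycdf  # pip install spacepy; needs the NASA CDF C library

      # Write a decoded legacy time series into a new CDF with minimal
      # ISTP-style metadata so browse/analysis tools such as CDAWeb can use it.
      epochs = [datetime.datetime(1975, 1, 1) + datetime.timedelta(hours=h)
                for h in range(24)]
      flux = np.random.rand(24)  # stand-in for values decoded from legacy tapes

      with pycdf.CDF('legacy_mission_flux.cdf', '') as cdf:  # '' creates a new CDF
          cdf['Epoch'] = epochs
          cdf['proton_flux'] = flux
          cdf['proton_flux'].attrs['UNITS'] = 'counts/s'
          cdf['proton_flux'].attrs['DEPEND_0'] = 'Epoch'
          cdf.attrs['Project'] = 'Legacy data restoration (illustrative)'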

  12. The simplicity principle in perception and cognition

    PubMed Central

    Feldman, Jacob

    2016-01-01

    The simplicity principle, traditionally referred to as Occam’s razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations— or, more precisely, that it balances a bias towards simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. PMID:27470193
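
    One standard way to make "complexity minimization facilitates inference" concrete is a two-part minimum-description-length score: prefer the hypothesis that minimizes a parameter cost plus a misfit cost. The toy sketch below (my illustration, with a BIC-style cost, not a model from the paper) recovers the simplest polynomial adequate for noisy quadratic data.

      import numpy as np

      # Two-part MDL toy example: model cost + data-misfit cost, in the spirit
      # of the simplicity principle (illustrative, not the paper's models).
      rng = np.random.default_rng(1)
      x = np.linspace(-1, 1, 40)
      y = 1.5 * x**2 - x + rng.normal(0, 0.1, x.size)  # quadratic + noise

      def description_length(deg):
          coeffs = np.polyfit(x, y, deg)
          rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
          misfit = 0.5 * x.size * np.log(rss / x.size)   # Gaussian -log likelihood
          complexity = 0.5 * (deg + 1) * np.log(x.size)  # BIC-style parameter cost
          return misfit + complexity

      best = min(range(8), key=description_length)
      print('simplest adequate model: degree', best)     # typically 2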

  13. Reflections on the history of indoor air science, focusing on the last 50 years.

    PubMed

    Sundell, J

    2017-07-01

    The scientific articles and Indoor Air conference publications of the indoor air sciences (IAS) during the last 50 years are summarized. In total, 7524 presentations from 79 countries have been made at Indoor Air conferences held between 1978 (49 presentations) and 2014 (1049 presentations). In the Web of Science, 26,992 articles on indoor air research (with the word "indoor" as a search term) have been found (as of 1 Jan 2016), of which 70% were published during the last 10 years. The modern scientific history started in the 1970s with a question: did indoor air pose a threat to health as did outdoor air? Soon it was recognized that indoor air is more important, from a health point of view, than outdoor air. Topics of concern were first radon, environmental tobacco smoke, and lung cancer, followed by volatile organic compounds, formaldehyde and sick building syndrome, house dust mites, asthma and allergies, Legionnaires' disease, and other airborne infections. Later emerged dampness/mold-associated allergies and today's concern with "modern exposures-modern diseases." Ventilation, thermal comfort, indoor air chemistry, semi-volatile organic compounds, building simulation by computational fluid dynamics, and fine particulate matter are common topics today. From their beginnings in Denmark and Sweden, then in the USA, the indoor air sciences now show increasing activity in East and Southeast Asia. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Access and visualization using clusters and other parallel computers

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, Bruce; Block, Gary; Collier, Jim; Curkendall, Dave; Good, John; Husman, Laura; Jacob, Joe; Laity, Anastasia; hide

    2003-01-01

    JPL's Parallel Applications Technologies Group has been exploring the issues of data access and visualization of very large data sets over the past 10 or so years. This work has used a number of types of parallel computers, and today includes the use of commodity clusters. This talk will highlight some of the applications and tools we have developed, including how they use parallel computing resources, and specifically how we are using modern clusters. Our applications focus on NASA's needs; thus our data sets are usually related to Earth and space science, including data delivered from instruments in space and data produced by telescopes on the ground.

  15. Computer-based, Jeopardy™-like game in general chemistry for engineering majors

    NASA Astrophysics Data System (ADS)

    Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.

    2013-03-01

    We report on the design of a Jeopardy™-like computer game for enhancement of learning of general chemistry for engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is addressing the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, with a computer-based game quiz, constituting the treatment group. The computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. Overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this ``gamification'' of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) levels of general science and chemistry preparation. We outline a plan to extend this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.
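
    For reference, pretest-posttest effectiveness in studies of this kind is often summarized by the normalized gain (Hake's average gain); the abstract does not state which metric the authors used, so this is given only as the conventional choice:

      \[
      \langle g \rangle \;=\; \frac{\langle \mathrm{post} \rangle - \langle \mathrm{pre} \rangle}{100\% - \langle \mathrm{pre} \rangle}
      \]

    A class moving from 40% to 70% correct thus has a normalized gain of 0.5, independent of its starting point.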

  16. Pious Science: The Gulen Community and the Making of a Conservative Modernity in Turkey

    ERIC Educational Resources Information Center

    Arslan, Berna

    2009-01-01

    This dissertation explores the ways in which the Islamic Fethullah Gulen community engages with science as a response to globalization and modernity. Framed with the theoretical discussions on multiple modernities, it investigates how the community contests for hegemony in the field of science against the project of secular modernity, and…

  17. The seven sins in academic behavior in the natural sciences.

    PubMed

    van Gunsteren, Wilfred F

    2013-01-02

    "Seven deadly sins" in modern academic research and publishing can be condensed into a list ranging from poorly described experimental or computational setups to falsification of data. This Essay describes these sins and their ramifications, and serves as a code of best practice for researchers in their quest for scientific truth. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Report of the Army Science Board Summer Study on Installations 2025

    DTIC Science & Technology

    2009-12-01

    stresses, behavioral health problems, and injuries associated with war. Transform: IMCOM is modernizing installation management processes, policies... as well. For example, "Prediction is very difficult, especially about the future" (Niels Bohr). Others stress that the future will be a lot like the... Future trends noted include the "homogenization" of society, endangered species, continuous and ubiquitous computing, islanding, telecommuting, and wireless proliferation across appliances.

  19. Reasoning About Nature: Graduate students and teachers integrating historic and modern science in high school math and science classes

    NASA Astrophysics Data System (ADS)

    Davis, J. B.; Rigsby, C. A.; Muston, C.; Robinson, Z.; Morehead, A.; Stellwag, E. J.; Shinpaugh, J.; Thompson, A.; Teller, J.

    2010-12-01

    Graduate students and faculty at East Carolina University are working with area high schools to address the common science and mathematics deficiencies of many high school students. Project RaN (Reasoning about Nature), an interdisciplinary science/math/education research project, addresses these deficiencies by focusing on the history of science and the relationship between that history and modern scientific thought and practice. The geological sciences portion of project RaN has three specific goals: (1) to elucidate the relationships among the history of scientific discovery, the geological sciences, and modern scientific thought; (2) to develop, and utilize in the classroom, instructional modules that are relevant to the modern geological sciences curriculum and that relate fundamental scientific discoveries and principles to multiple disciplines and to modern societal issues; and (3) to use these activity-based modules to heighten students’ interest in science disciplines and to generate enthusiasm for doing science in both students and instructors. The educational modules that result from this linkage of modern and historical scientific thought are activity-based, directly related to the National Science Standards for the high school sciences curriculum, and adaptable to fit each state’s standard course of study for the sciences and math. They integrate historic sciences and mathematics with modern science, contain relevant background information on both the concept(s) and scientist(s) involved, present questions that compel students to think more deeply (both qualitatively and quantitatively) about the subject matter, and include threads that branch off to related topics. Modules on topics ranging from the density to cladistics to Kepler’s laws of planetary motion have been developed and tested. Pre- and post-module data suggest that both students and teachers benefit from these interdisciplinary historically based classroom experiences.

  20. Colour and Optical Properties of Materials: An Exploration of the Relationship Between Light, the Optical Properties of Materials and Colour

    NASA Astrophysics Data System (ADS)

    Tilley, Richard J. D.

    2003-05-01

    Colour is an important and integral part of everyday life, and an understanding and knowledge of the scientific principles behind colour, with its many applications and uses, is becoming increasingly important to a wide range of academic disciplines, from the physical, medical and biological sciences through to the arts. Colour and the Optical Properties of Materials carefully introduces the science behind the subject, along with many modern and cutting-edge applications chosen to appeal to today's students. For science students, it provides a broad introduction to the subject and the many applications of colour. For more applied students, such as engineering and arts students, it provides the essential scientific background to colour and its many applications. Features: * Introduces the science behind the subject whilst closely connecting it to modern applications, such as colour displays, optical amplifiers and colour centre lasers * Richly illustrated with full-colour plates * Includes many worked examples, along with problems and exercises at the end of each chapter and selected answers at the back of the book * A Web site, including additional problems and full solutions to all the problems, which may be accessed at: www.cardiff.ac.uk/uwcc/engin/staff/rdjt/colour Written for students taking an introductory course in colour in a wide range of disciplines such as physics, chemistry, engineering, materials science, computer science, design, photography, architecture and textiles.

  1. Mössbauer spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hermann, Raphael P

    2017-01-01

    This most comprehensive and unrivaled compendium in the field provides an up-to-date account of the chemistry of solids, nanoparticles and hybrid materials. Following a valuable introductory chapter reviewing important synthesis techniques, the handbook presents a series of contributions by about 150 international leading experts -- the "Who's Who" of solid state science. Clearly structured, in six volumes it collates the knowledge available on solid state chemistry, starting from the synthesis and modern methods of structure determination. Understanding and measuring the physical properties of bulk solids and the theoretical basis of modern computational treatments of solids are given ample space, as are such modern trends as nanoparticles, surface properties and heterogeneous catalysis. Emphasis is placed throughout not only on the design and structure of solids but also on practical applications of these novel materials in real chemical situations.

  2. Influence of computational domain size on the pattern formation of the phase field crystals

    NASA Astrophysics Data System (ADS)

    Starodumov, Ilya; Galenko, Peter; Alexandrov, Dmitri; Kropotin, Nikolai

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method represents one of the important directions of modern computational materials science. The method makes it possible to study the formation of stable and metastable crystal structures. In this paper, we study the effect of computational domain size on the crystal pattern formation obtained as a result of computer simulation by the PFC method. We show that if the size of the computational domain is changed, the result of the modeling may be a structure in a metastable phase instead of the pure stable state. We present a possible theoretical justification for the observed effect and explain how the PFC method might be modified to account for this phenomenon.
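
    For context, the PFC method evolves a conserved density field φ under a free energy of the standard Swift-Hohenberg form (the notation is the common one from the PFC literature; the authors' exact variant may differ):

      \[
      \mathcal{F}[\phi] \;=\; \int \left[ \frac{\phi}{2}\left( r + (1+\nabla^2)^2 \right)\phi \;+\; \frac{\phi^4}{4} \right] d\mathbf{x},
      \qquad
      \frac{\partial \phi}{\partial t} \;=\; \nabla^2 \frac{\delta \mathcal{F}}{\delta \phi}.
      \]

    Because the operator (1+∇²)² selects a preferred pattern wavelength, a periodic computational domain that is incommensurate with that wavelength strains the emerging crystal, which suggests one mechanism by which domain size can trap a simulation in a metastable structure.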

  3. [Imaging and the new fabric of the human body].

    PubMed

    Moulin, Anne-Marie; Baulieu, Jean-Louis

    2010-11-01

    A short historical survey recalls the main techniques of medical imaging, based on modern physico-chemistry and computer science. Imagery has provided novel visions of the inside of the body, which are not self-evident but require a training of the gaze. Yet these new images have permeated the contemporary mind and inspired esthetic ventures. The popularity of these images may be related to their ambiguous status, between real and virtual. The images, reminiscent of Vesalius' De humani corporis fabrica, crosslink art, science and society in a specific way: what role will they play in the "empowerment" of the patient of tomorrow?

  4. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), putting task and job submission effortlessly within reach of users already familiar with that tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  5. Optimizing high performance computing workflow for protein functional annotation.

    PubMed

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
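
    As a rough sketch of the fan-out pattern such a workflow relies on: the BLAST+ psiblast flags below are real, but the database name, paths, and the small process-pool harness are illustrative assumptions, not the published MPI-scale pipeline.

      import subprocess
      from concurrent.futures import ProcessPoolExecutor
      from pathlib import Path

      # Fan PSI-BLAST out over pre-split FASTA chunks; the published workflow
      # used highly scalable parallel applications on XSEDE supercomputers.
      def run_psiblast(chunk: Path) -> Path:
          out = chunk.with_suffix('.tsv')
          subprocess.run(
              ['psiblast', '-query', str(chunk), '-db', 'cog_profiles',
               '-num_iterations', '3', '-evalue', '1e-5',
               '-outfmt', '6', '-out', str(out)],
              check=True,
          )
          return out

      if __name__ == '__main__':
          chunks = sorted(Path('query_chunks').glob('*.fasta'))
          with ProcessPoolExecutor(max_workers=8) as pool:
              for result in pool.map(run_psiblast, chunks):
                  print('finished', result)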

  6. Optimizing high performance computing workflow for protein functional annotation

    PubMed Central

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-01-01

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296

  7. Network-based statistical comparison of citation topology of bibliographic databases

    PubMed Central

    Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko

    2014-01-01

    Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or as a scientific evaluation guideline for governments and research agencies. PMID:25263231
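
    A minimal sketch of the kind of local and global graph statistics being compared, using networkx on a toy edge list (loading and parsing the actual database dumps, and the paper's full statistic set, are omitted):

      import networkx as nx

      def citation_stats(edges):
          """A few local/global statistics of a citation network."""
          G = nx.DiGraph(edges)  # edge (a, b) means "a cites b"
          largest_scc = max(nx.strongly_connected_components(G), key=len)
          return {
              'nodes': G.number_of_nodes(),
              'mean_out_degree': sum(d for _, d in G.out_degree()) / G.number_of_nodes(),
              'clustering': nx.average_clustering(G.to_undirected()),
              'scc_core_fraction': len(largest_scc) / G.number_of_nodes(),  # bow-tie core
          }

      toy = [(1, 2), (2, 3), (3, 1), (4, 1), (5, 2)]
      print(citation_stats(toy))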

  8. S'COOL Provides Research Opportunities and Current Data for Today's Technological Classroom

    NASA Technical Reports Server (NTRS)

    Green, Carolyn J.; Chambers, Lin H.; Racel, Anne M.

    1999-01-01

    NASA's Students' Cloud Observations On-Line (S'COOL) project, a hands-on educational project, was an innovative idea conceived by the scientists in the Radiation Sciences Branch at NASA Langley Research Center, Hampton, Virginia, in 1996. It came about after a local teacher expressed the idea that she wanted her students to be involved in real-life science. S'COOL supports NASA's Clouds and the Earth's Radiant Energy System (CERES) instrument, which was launched on the Tropical Rainfall Measuring Mission (TRMM) in November 1997 as part of NASA's Earth Science Enterprise. With the S'COOL project, students observe clouds and related weather conditions, compute data, and note vital information while obtaining ground-truth observations for the CERES instrument. The observations can then be used to help validate the CERES measurements, particularly the detection of clear sky from space. In addition to meeting math, science and geography standards, students are engaged in using the computer to obtain, report and analyze current data, thus bringing modern technology into the realm of the classroom, a paradigm that demands our attention.

  9. HPC Annual Report 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennig, Yasmin

    Sandia National Laboratories has a long history of significant contributions to the high performance computing community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier—propelling us into the international spotlight. Our advanced simulation and modeling capabilities have been integral in high consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) Prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operational cost effectiveness (power and innovative cooling technologies), user environment, and application code performance.

  10. Applying "Climate" system to teaching basic climatology and raising public awareness of climate change issues

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Okladnikov, Igor; Titov, Alexander; Gordov, Evgeny

    2016-04-01

    While there is a strong demand for innovation in digital learning, available training programs in the environmental sciences have had no time to adapt to rapid changes in the domain content. A joint group of scientists and university teachers develops and implements an educational environment for new learning experiences in the basics of climate science and its applications. This so-called virtual learning laboratory "Climate" contains educational materials and interactive training courses developed to provide undergraduate and graduate students with a profound understanding of changes in regional climate and environment. The main feature of this laboratory is that students perform their computational tasks on climate modeling and on the evaluation and assessment of climate change using the typical tools of the "Climate" information-computational system, the same tools used by real-life practitioners performing this kind of research. Students can thus carry out computational laboratory work with the information-computational tools of the system and improve their skills in using them while mastering the subject. We did not create an artificial learning environment for the exercises; on the contrary, the main purpose of combining the educational block with the computational information system was to familiarize students with the technologies actually used for monitoring and analyzing data on the state of the climate. The exercises are based on technologies and procedures typical for the Earth system sciences, and the courses are designed to let students conduct their own investigations of ongoing and future climate changes in a manner essentially identical to the techniques used by national and international climate research organizations. All exercises are supported by lectures devoted to the basic aspects of modern climatology, including analysis of current climate change and its possible impacts, ensuring effective links between theory and practice. Along with its use in graduate and postgraduate education, "Climate" serves as the framework for a basic course on climate change for the general public. This course describes the basic concepts and problems of modern climate change and its possible consequences for non-specialists, and will also include links to relevant information resources on topical issues of the Earth sciences and a number of case studies, carried out for a selected region, to consolidate the knowledge gained.

  11. A High School Level Course On Robot Design And Construction

    NASA Astrophysics Data System (ADS)

    Sadler, Paul M.; Crandall, Jack L.

    1984-02-01

    The Robotics Design and Construction Class at Sehome High School was developed to offer gifted and/or highly motivated students an in-depth introduction to a modern engineering topic. The course includes instruction in basic electronics, digital and radio electronics, construction skills, robotics literacy, construction of the HERO 1 Heathkit Robot, computer/robot programming, and voice synthesis. A key element which leads to the success of the course is the involvement of various community assets including manpower and financial assistance. The instructors included a physics/electronics teacher, a computer science teacher, two retired engineers, and an electronics technician.

  12. Cellular intelligence: Microphenomenology and the realities of being.

    PubMed

    Ford, Brian J

    2017-12-01

    Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.

  13. Computational immunology--from bench to virtual reality.

    PubMed

    Chan, Cliburn; Kepler, Thomas B

    2007-02-01

    Drinking from a fire-hose is an old cliché for the experience of learning basic and clinical sciences in medical school, and the pipe has been growing fatter at an alarming rate. Of course, it does not stop when one graduates; if anything, both the researcher and clinician are flooded with even more information. Slightly embarrassingly, while modern science is very good at generating new information, our ability to weave multiple strands of data into a useful and coherent story lags quite far behind. Bioinformatics, systems biology and computational medicine have arisen in recent years to address just this challenge. This essay is an introduction to the problem of data synthesis and integration in biology and medicine, and how the relatively new art of biological simulation can provide a new kind of map for understanding physiology and pathology. The nascent field of computational immunology will be used for illustration, but similar trends are occurring broadly across all of biology and medicine.

  14. Using NASA Space Imaging Technology to Teach Earth and Sun Topics

    NASA Astrophysics Data System (ADS)

    Verner, E.; Bruhweiler, F. C.; Long, T.

    2011-12-01

    We teach an experimental college-level course, directed toward elementary education majors, emphasizing "hands-on" activities that can be easily applied in the elementary classroom. This course, Physics 240: "The Sun-Earth Connection," includes various ways to study selected topics in physics, earth science, and basic astronomy. Our lesson plans and EPO materials make extensive use of NASA imagery and cover topics on magnetism; the solar photospheric, chromospheric, and coronal spectra; and earth science and climate. In addition, we are developing and will cover topics on ecosystem structure, biomass, and water on Earth. We strive to free the non-science undergraduate from the "fear of science" and replace it with the excitement of science, such that these future teachers will carry this excitement to their future students. Hands-on experiments, computer simulations, analysis of real NASA data, and vigorous seminar discussions are blended in an inquiry-driven curriculum to instill a confident understanding of basic physical science and modern, effective methods for teaching it. The course also demonstrates how scientific thinking and hands-on activities can be implemented in the classroom. Most of the topics were selected using the National Science Standards and National Mathematics Standards that are addressed in grades K-8. The course focuses on helping education majors: 1) build knowledge of scientific concepts and processes; 2) understand the measurable attributes of objects and the units and methods of measurement; 3) conduct data analysis (collecting, organizing, and presenting scientific data, and predicting results); 4) use hands-on approaches to teach science; and 5) become familiar with Internet science teaching resources. Here we share our experiences and the challenges we faced while teaching this course.

  15. Virtual Observatories, Data Mining, and Astroinformatics

    NASA Astrophysics Data System (ADS)

    Borne, Kirk

    The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best-of-breed methodologies from multiple disciplines. In the era of large sky surveys and numerous large telescopes, the potential for astronomical discovery is equally large, and so the data-oriented research methods, algorithms, and techniques that are presented here will enable the greatest discovery potential from the ever-growing data and information resources in astronomy.

  16. Living well in light of science.

    PubMed

    McMahon, Darrin M

    2016-11-01

    This article discusses some findings of the modern science of happiness in the context of historical understandings of happiness. Comparing teachings of the ancient wisdom traditions to those of modern positive psychology and social science, I argue that there is a surprising correspondence between the two. The happy life, ancients and moderns agree, involves training and the development and mastery of particular character traits. © 2016 New York Academy of Sciences.

  17. The Six Core Theories of Modern Physics

    NASA Astrophysics Data System (ADS)

    Stevens, Charles F.

    1996-09-01

    Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.

  18. Planetary Exploration in the Classroom

    NASA Astrophysics Data System (ADS)

    Slivan, S. M.; Binzel, R. P.

    1997-07-01

    We have developed educational materials to seed a series of undergraduate level exercises on "Planetary Exploration in the Classroom." The goals of the series are to teach modern methods of planetary exploration and discovery to students having both science and non-science backgrounds. Using personal computers in a "hands-on" approach with images recorded by planetary spacecraft, students working through the exercises learn that modern scientific images are digital objects that can be examined and manipulated in quantitative detail. The initial exercises we've developed utilize NIH Image in conjunction with images from the Voyager spacecraft CDs. Current exercises are titled "Using 'NIH IMAGE' to View Voyager Images", "Resolving Surface Features on Io", "Discovery of Volcanoes on Io", and "Topography of Canyons on Ariel." We expect these exercises will be released during Fall 1997 and will be available via 'anonymous ftp'; detailed information about obtaining the exercises will be on the Web at "http://web.mit.edu/12s23/www/pec.html." This curriculum development was sponsored by NSF Grant DUE-9455329.

  19. [Computational chemistry in structure-based drug design].

    PubMed

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly, and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small-molecule ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have come to play more important roles in structure-based drug design. Here we review newly developed computational chemistry methods in structure-based drug design as well as their elegant applications, including binding-site druggability assessment, large-scale virtual screening of chemical databases, and lead compound optimization. Importantly, we address the current bottlenecks and propose practical solutions.
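
    The "more accurate calculation of ligand binding free energy" the review calls for is often grounded in free energy perturbation; in Zwanzig's exponential-averaging form (a standard identity, not specific to this review), the free-energy difference between states A and B is

      \[
      \Delta G_{A \to B} \;=\; -k_B T \,\ln \left\langle e^{-\left(U_B - U_A\right)/k_B T} \right\rangle_A ,
      \]

    where the average runs over configurations sampled from state A, and U_A and U_B are the potential energies of the two states.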

  20. Computer programing for geosciences: Teach your students how to make tools

    NASA Astrophysics Data System (ADS)

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack an intuitive understanding of programming concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  1. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    PubMed

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  2. High-Productivity Computing in Computational Physics Education

    NASA Astrophysics Data System (ADS)

    Tel-Zur, Guy

    2011-03-01

    We describe the development of a new course in Computational Physics at Ben-Gurion University, an elective course for 3rd-year undergraduates and M.Sc. students taught during one semester. Computational Physics is by now well accepted as the third pillar of science. This paper's claim is that modern Computational Physics education should also address High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in mathematical methods and case studies in physics, the course devotes a significant amount of time to ``mini-courses'' on topics such as high-throughput computing with Condor, parallel programming with MPI and OpenMP, how to build a Beowulf cluster, visualization, and grid and cloud computing. The course intends to teach neither new physics nor new mathematics; it is focused on an integrated approach to solving problems, starting from the physics problem, through the corresponding mathematical solution and the numerical scheme, to writing an efficient computer code and finally analysis and visualization.
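
    In the spirit of the course's MPI mini-course, here is a minimal mpi4py example (my illustration, not the course's actual material) that parallelizes a trapezoidal-rule integral, touching all three of Correctness, Accuracy, and Performance:

      from mpi4py import MPI
      import numpy as np

      # Each rank integrates f over its own subinterval with the trapezoidal
      # rule; a reduction combines the pieces. Run: mpiexec -n 4 python trap.py
      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      f = lambda t: np.sin(t) ** 2
      a, b, n = 0.0, np.pi, 10**6                 # global interval and points
      h = (b - a) / size                          # this rank's subinterval width
      x = np.linspace(a + rank * h, a + (rank + 1) * h, n // size)
      fx = f(x)
      local = np.sum(0.5 * (fx[1:] + fx[:-1]) * np.diff(x))

      total = comm.reduce(local, op=MPI.SUM, root=0)
      if rank == 0:
          print('integral ~', total, '(exact: pi/2)')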

  3. The Growth of Physical Science

    NASA Astrophysics Data System (ADS)

    Jeans, James

    2009-07-01

    1. The remote beginnings; 2. Ionia and early Greece; 3. Science and Alexandria; 4. Science in the dark ages; 5. The birth of modern science; 6. The century of genius; 7. The two centuries after Newton; 8. The era of modern physics.

  4. Modern Dilemmas - Science (World History Series).

    ERIC Educational Resources Information Center

    Montgomery County Public Schools, Rockville, MD.

    The publication, referred to as a unit on "Modern Dilemmas," was completed in 1969 and is part of a Modern World History pilot project integrating areas of art, literature, philosophy, and science into the social studies curriculum. The unit seeks to explore all of the facets of science as part of man's search for meaning, but because of time…

  5. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    PubMed Central

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. It means that we, the developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  6. 2001: Things to come.

    PubMed

    Apuzzo, M L; Liu, C Y

    2001-10-01

    THIS ARTICLE DISCUSSES elements in the definition of modernity and emerging futurism in neurological surgery. In particular, it describes evolution, discovery, and paradigm shifts in the field and forces responsible for their realization. It analyzes the cyclical reinvention of the discipline experienced during the past generation and attempts to identify apertures to the near and more remote future. Subsequently, it focuses on forces and discovery in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism that is evident in the field. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.

  7. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design and Experiential Learning Theory

    NASA Astrophysics Data System (ADS)

    Bennett, Rick; Lamb, Diedre

    2017-04-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the able bodied, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp with a focus on modern geophysical observation systems, computational thinking, and data science. In this presentation, we will report on the theoretical bases for developing the course, our experiences in teaching the course to date, and our plan for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  8. Experience of validation and tuning of turbulence models as applied to the problem of boundary layer separation on a finite-width wedge

    NASA Astrophysics Data System (ADS)

    Babulin, A. A.; Bosnyakov, S. M.; Vlasenko, V. V.; Engulatova, M. F.; Matyash, S. V.; Mikhailov, S. V.

    2016-06-01

    Modern differential turbulence models are validated by computing a separation zone generated in the supersonic flow past a compression wedge lying on a plate of finite width. The results of three- and two-dimensional computations based on the (q-ω), SST, and Spalart-Allmaras turbulence models are compared with experimental data obtained for 8°, 25°, and 45° wedges by A.A. Zheltovodov at the Institute of Theoretical and Applied Mechanics of the Siberian Branch of the Russian Academy of Sciences. An original law-of-the-wall boundary condition and modifications of the SST model intended for improving the quality of the computed separation zone are described.
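
    For context, the "law-of-the-wall boundary condition" mentioned refers to imposing the standard logarithmic near-wall velocity profile rather than resolving the viscous sublayer (the constants and the blending actually used in the cited work may differ):

      \[
      u^+ \;=\; \frac{1}{\kappa}\,\ln y^+ + B,
      \qquad
      u^+ \equiv \frac{u}{u_\tau},
      \quad
      y^+ \equiv \frac{y\,u_\tau}{\nu},
      \quad
      \kappa \approx 0.41,\; B \approx 5.0 .
      \]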

  9. Building a digital library for the health sciences: information space complementing information place.

    PubMed Central

    Lucier, R E

    1995-01-01

    In 1990, the University of California, San Francisco, dedicated a new library to serve the faculty, staff, and students and to meet their academic information needs for several decades to come. Major environmental changes present new and additional information management challenges, which can effectively be handled only through the widespread use of computing and computing technologies. Over the next five years, a three-pronged strategy will be followed. We are refining the current physical, paper-based library through the continuous application of technology for modernization and functional improvement. At the same time, we have begun the planning, design, and implementation of a "free-standing" Digital Library of the Health Sciences, focusing on the innovative application of technology. To ensure complementarity and product integrity where the two libraries interface, we will look to technology to transform these separate entities into an eventual, integral whole. PMID:7581192

  10. Human Science for Human Freedom? Piaget's Developmental Research and Foucault's Ethical Truth Games

    ERIC Educational Resources Information Center

    Zhao, Guoping

    2012-01-01

    The construction of the modern subject and the pursuit of human freedom and autonomy, as well as the practice of human science, have been pivotal in the development of modern education. But for Foucault, the subject is only the effect of discourses and power-knowledge arrangements, and modern human science is part of the very arrangement that has…

  11. 20th International Conference for Students and Young Scientists: Modern Techniques and Technologies (MTT'2014)

    NASA Astrophysics Data System (ADS)

    2014-10-01

    The active involvement of young researchers in scientific processes and the acquisition of scientific experience by gifted youth are currently of great value for the development of science. One of the research activities of National Research Tomsk Polytechnic University aimed at preparing and forming the next generation of scientists is the International Conference of Students and Young Scientists ''Modern Techniques and Technologies'', held in 2014 for the twentieth time. Years of organizing the conference have produced great experience in running scientific events, together with all the necessary resources: a team of organizers drawn from the employees of Tomsk Polytechnic University, premises provided with modern office and demonstration equipment, leading scientists among the professors of TPU, and the status of the university as a leading research university in Russia. This allows the conference to attract leading scientists from around the world for collaboration. Over the previous years the conference has proved itself a major scientific event at the international level, attracting more than 600 students and young scientists from Russia, the CIS and other countries. The conference features oral plenary and section reports and is organized around lectures in which leading Russian and foreign scientists deliver plenary presentations to young audiences. An important indicator of this scientific event is the breadth of the scientific fields covered: energy, heat and power, instrument making, engineering, systems and devices for medical purposes, electromechanics, materials science, computer science and control in technical systems, nanotechnologies and nanomaterials, physical methods in science and technology, control and quality management, and the design and technology of artistic materials processing. The main issues considered by the young researchers at the conference concerned the analysis of contemporary problems using new techniques and the application of new technologies.

  12. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board (PWB) design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design built on a systematic structured methodology.

  13. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.
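
    A small sketch of the feature-extraction step in this context: locate bright calibration spots by thresholding, connected-component labeling, and centroiding (scipy-based and synthetic; the GPI pipeline's actual algorithms are more involved).

      import numpy as np
      from scipy import ndimage

      # Find bright spots in an image: threshold, label connected components,
      # take intensity-weighted centroids (illustrative, not the GPI pipeline).
      rng = np.random.default_rng(2)
      img = rng.normal(0, 1, (128, 128))
      yy, xx = np.mgrid[:128, :128]
      for cy, cx in [(32, 32), (32, 96), (96, 64)]:       # synthetic spots
          img += 50 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 8.0)

      mask = img > img.mean() + 5 * img.std()             # bright-pixel mask
      labels, nspots = ndimage.label(mask)                # connected components
      centroids = ndimage.center_of_mass(img, labels, range(1, nspots + 1))
      print(nspots, 'spots at', [(round(y, 1), round(x, 1)) for y, x in centroids])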

  14. Bioinformatics in High School Biology Curricula: A Study of State Science Standards

    ERIC Educational Resources Information Center

    Wefer, Stephen H.; Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics…

  15. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators of varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, two naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher-order time integration based on the backward difference formulas. The TCAT brine flow model was implemented in Proteus, and a variety of numerical methods were compared to hand-coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high-quality code for solving existing and evolving computational science models.
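
    For reference, the first of the three models, Richards' equation for variably saturated flow, can be written in its standard mixed form (notation conventional, not copied from the paper):

      \[
      \frac{\partial \theta(\psi)}{\partial t} \;=\; \nabla \cdot \bigl[ K(\psi)\, \nabla (\psi + z) \bigr],
      \]

    where ψ is the pressure head, θ(ψ) the volumetric water content, K(ψ) the hydraulic conductivity, and z the elevation; the strong nonlinearity of θ and K is what makes the choice of discontinuous Galerkin spatial discretization and higher-order time integration worth investigating.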

  16. Empowering the impaired through the appropriate use of Information Technology and Internet.

    PubMed

    Sanyal, Ishita

    2006-01-01

    Developments in the fields of science and technology have revolutionized human life at the material level. In actuality, however, this progress is only superficial: underneath, modern men and women live in conditions of great mental and emotional stress, even in developed and affluent countries. People all over the world, irrespective of culture and economic background, suffer from mental illness, and although a great deal of research has been carried out worldwide, to date the problem has not been resolved. In today's world stress increases every day. The individualistic approach to life and the nuclear family system have increased the burden even further. Without an adequate support system of friends and relatives, people fall prey to mental illness. The insecurities and feelings of inferiority of these persons lead to a breakdown of communication between the sufferer and family members and friends. Sufferers prefer to confine themselves within the four walls of their homes and remain withdrawn from the world, staying in their world of fantasy, far away from the world of reality. Disability caused by some mental illnesses often remains invisible to society, leading to a lack of support systems and facilities. These persons need not only medication and counseling but a thorough rehabilitation programme to bring them back into the mainstream of life, and the task is not an easy one. Research indicates that these persons need work and income to improve their quality of life. In a scenario where society is averse towards them, where stigma towards mental illness prevails, and where help from friends and community is not available, training them in computer skills and forming groups through computers was thought to be an ideal option: a solution to the problems of modern life through modern technology. * It was seen that these insecure disabled persons feel free to experiment with a machine more easily than with society and people. * Computers provide the education and information needed for their further development. * Computers provide facilities to interact with others and form self-help groups. * Computers also enable them to earn a livelihood. Thus this modern gadget, sometimes believed to make a person a loner, has actually acted as a bridge between persons suffering from mental illness and society in general. Disabled persons also gain confidence and courage as they gain control over the machine, and gaining control over the machine helps them gain control over their lives. The products of science and technology have thus revolutionized human life not only at the material level but also at the personal level, helping the disabled to gain control over their lives.

  17. The emergence of mind and brain: an evolutionary, computational, and philosophical approach.

    PubMed

    Mainzer, Klaus

    2008-01-01

    Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.

  18. Modern Science and Conservative Islam: An Uneasy Relationship

    ERIC Educational Resources Information Center

    Edis, Taner

    2009-01-01

    Familiar Western debates about religion, science, and science education have parallels in the Islamic world. There are difficulties reconciling conservative, traditional versions of Islam with modern science, particularly theories such as evolution. As a result, many conservative Muslim thinkers are drawn toward creationism, hopes of Islamizing…

  19. And yet, we were modern. The paradoxes of Iberian science after the Grand Narratives.

    PubMed

    Pimentel, Juan; Pardo-Tomás, José

    2017-06-01

    In this article, we try to explain the origin of a disagreement, the sort that often arises when the subject is the history of early modern Spanish science. In the decades between 1970 and 1990, while some historians were trying to include Spain in the grand narrative of the rise of modern science, the very historical category of the Scientific Revolution was beginning to be dismantled. It could be said that Spaniards were boarding the flagship of modern science right before it sank. To understand this décalage it would be helpful to recall the role of the history of science during the years after the Franco dictatorship and Spain's transition to democracy. It was a discipline useful for putting behind us the Black Legend and Spanish exceptionalism.

  20. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
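
    The arithmetic of the claim is simple enough to state directly in code. The snippet below is just a transcription of the formula in the abstract; the function name and example values are illustrative and not part of the patent.

        def future_facility_conditions(maintenance_cost: float,
                                       modernization_factor: float,
                                       backlog_factor: float) -> float:
            """Future conditions = maintenance cost + modernization factor + backlog factor."""
            return maintenance_cost + modernization_factor + backlog_factor

        # Example with made-up dollar figures for one time period:
        print(future_facility_conditions(1.2e6, 3.5e5, 8.0e4))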

  1. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from that which normally exists in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that the application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still lack a mathematically accurate and unambiguous description, and available field data are often very noisy; hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling, such as convergence and stability, become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  2. Effects of a Science Education Module on Attitudes towards Modern Biotechnology of Secondary School Students

    NASA Astrophysics Data System (ADS)

    Klop, Tanja; Severiens, Sabine E.; Knippels, Marie-Christine P. J.; van Mil, Marc H. W.; Ten Dam, Geert T. M.

    2010-06-01

    This article evaluates the impact of a four-lesson science module on the attitudes of secondary school students. The module (on cancer and modern biotechnology) utilises several design principles related to a social constructivist perspective on learning. The expectation was that the module would help students become more articulate in this particular field. In a quasi-experimental design (experimental and control groups, with pre- and post-tests), secondary school students' attitudes (N = 365) towards modern biotechnology were measured by a questionnaire. Data were analysed using Chi-square tests. Significant differences were obtained between the control and experimental conditions. Results showed that the science module had a significant effect on attitudes, although predominantly towards a more supportive rather than a more critical stance. We conclude that offering a science module of this kind can indeed encourage students to become more aware of modern biotechnology, although promoting a more critical attitude towards it should receive more attention.

  3. Visions of the Future - the Changing Role of Actors in Data-Intensive Science

    NASA Astrophysics Data System (ADS)

    Schäfer, L.; Klump, J. F.

    2013-12-01

    Around the world scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys, or drawn from the World Wide Web. All areas of the scientific enterprise are affected by this rapid growth in data, from the logging of digs in Archaeology, to telescope observations of distant galaxies in Astrophysics, to data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently, traditional forms of publication -- articles for conferences and journals -- still carry greater weight. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. To what extent, then, will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments, both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it, and to consider steps that can achieve a positive development for the future.

  4. On What Basis Hope? Modern Progress and Postmodern Possibilities.

    ERIC Educational Resources Information Center

    Danforth, Scot

    1997-01-01

    Examines modern and postmodern concepts of hope as applied to services for persons having mental retardation. Contrasts modernist theories of special education, based on interventionist social science, with postmodernist views, which critique modern social science as perpetuating stigmatized "mentally retarded" identities defined by…

  5. Computing with Beowulf

    NASA Technical Reports Server (NTRS)

    Cohen, Jarrett

    1999-01-01

    Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems still is a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems after the 11th century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center, a laboratory for the Earth and space sciences, where computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the Universities Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.

  6. Soccer science and the Bayes community: exploring the cognitive implications of modern scientific communication.

    PubMed

    Shrager, Jeff; Billman, Dorrit; Convertino, Gregorio; Massar, J P; Pirolli, Peter

    2010-01-01

    Science is a form of distributed analysis involving both individual work that produces new knowledge and collaborative work to exchange information with the larger community. There are many particular ways in which individual and community can interact in science, and it is difficult to assess how efficient these are, and what the best way might be to support them. This paper reports on a series of experiments in this area and a prototype implementation using a research platform called CACHE. CACHE both supports experimentation with different structures of interaction between individual and community cognition and serves as a prototype for computational support for those structures. We particularly focus on CACHE-BC, the Bayes community version of CACHE, within which the community can break up analytical tasks into "mind-sized" units and use provenance tracking to keep track of the relationship between these units. Copyright © 2009 Cognitive Science Society, Inc.

  7. Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd

    2005-01-01

    Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.

  8. A need for a code of ethics in science communication?

    NASA Astrophysics Data System (ADS)

    Benestad, R. E.

    2009-09-01

    Modern Western civilization and its high standard of living are to a large extent the 'fruits' of scientific endeavor over generations. Some examples include the longer life expectancy due to progress in the medical sciences, and changes in infrastructure associated with the utilization of electromagnetism. Modern meteorology is not possible without state-of-the-art digital computers, satellites, remote sensing, and communications. Science is also of relevance for policy making, e.g. the present hot topic of climate change. Climate scientists have recently become much exposed to media attention and mass communication, a task for which many are not trained. Furthermore, science, communication, and politics have different objectives, and do not necessarily mix. Scientists have an obligation to provide unbiased information, and a code of ethics is needed to give guidance on acceptable and unacceptable conduct. Some examples of questionable conduct in Norway include using the title 'Ph.D.' to imply scientific authority when the person had never obtained such an academic degree, or writing biased and one-sided articles in a Norwegian encyclopedia that do not reflect the scientific consensus. It is proposed here that a set of guidelines (for scientists and journalists) and a code of conduct could provide recommendations on how to act in the media -- similar to a code of conduct for carrying out research -- to which everyone could agree, even when disagreeing on specific scientific questions.

  9. The Place of Science in the Modern World: A Speech by Robert Millikan

    NASA Astrophysics Data System (ADS)

    Williams, Kathryn R.

    2001-07-01

    A speech by Robert Millikan, reprinted in the May 1930 issue, pertains to issues still prevalent in the 21st century. In "The Place of Science in the Modern World", the Nobel laureate defends science against charges of its detrimental effects on society, its materialistic intentions, and the destructive powers realized during the first World War. He also expresses concern that "this particular generation of Americans" may lack the moral qualities needed to make responsible use of the increased powers afforded by modern science.

  10. Cajal and consciousness. Introduction.

    PubMed

    Marijuán, P C

    2001-04-01

    One hundred years after Santiago Ramón Cajal established the bases of modern neuroscience in his masterpiece Textura del sistema nervioso del hombre y de los vertebrados, the question is stated again: What is the status of consciousness today? The responses in this book, by contemporary leading figures of neuroscience, evolution, molecular biology, computer science, and quantum physics, collectively compose a fascinating conceptual landscape. Both the evolutionary emergence of consciousness and its development towards the highest level may be analyzed by a wealth of new theories and hypotheses, including Cajal's prescient ones. Some noticeable gaps remain, however. Celebrating the centennial of Textura is a timely occasion to reassess how close--and how far--our system of the sciences is to explaining consciousness.

  11. Is homeopathy a science?--Continuity and clash of concepts of science within holistic medicine.

    PubMed

    Schmidt, Josef M

    2009-06-01

    The question of whether homeopathy is a science is currently discussed almost exclusively against the background of the modern concept of natural science. This approach, however, fails to notice that homeopathy-in terms of history of science-rests on different roots that can essentially be traced back to two most influential traditions of science: on the one hand, principles and notions of Aristotelism which determined 2,000 years of Western history of science and, on the other hand, the modern concept of natural science that has been dominating the history of medicine for less than 200 years. While Aristotle's "science of the living" still included ontologic and teleologic dimensions for the sake of comprehending nature in a uniform way, the interest of modern natural science was reduced to functional and causal explanations of all phenomena for the purpose of commanding nature. In order to prevent further ecological catastrophes as well as to regain lost dimensions of our lives, the one-sidedness and theory-loadedness of our modern natural-scientific view of life should henceforth be counterbalanced by lifeworld-practical Aristotelic categories. In this way, the ground would be ready to conceive the scientific character of homeopathy-in a broader, Aristotelian sense.

  12. BiteScis: Connecting K-12 teachers with science graduate students to produce lesson plans on modern science research

    NASA Astrophysics Data System (ADS)

    Battersby, Cara

    2016-01-01

    Many students graduate high school having never learned about the process and people behind modern science research. The BiteScis program addresses this gap by providing easily implemented lesson plans that incorporate the whos, whats, and hows of today's scientific discoveries. We bring together practicing scientists (motivated graduate students from the selective communicating science conference, ComSciCon) with K-12 science teachers to produce, review, and disseminate K-12 lesson plans based on modern science research. These lesson plans vary in topic from environmental science to neurobiology to astrophysics, and involve a range of activities from laboratory exercises to art projects, debates, or group discussion. An integral component of the program is a series of short, "bite-size" articles on modern science research written for K-12 students. The "bite-size" articles and lesson plans will be made freely available online in an easily searchable web interface that includes association with a variety of curriculum standards. This ongoing program is in its first year with about 15 lesson plans produced to date.

  13. Trimodernism and Social Sciences: A Note

    ERIC Educational Resources Information Center

    Snell, Joel C.

    2012-01-01

    The issues of premodern, modern, and postmodern can often confuse social scientists because so much of social-science methodology is drawn from modernism. Briefly, the author differentiates the three philosophies and indicates how a coalition of the three may apply to the social sciences.

  14. Valuing Science: A Turkish-American Comparison

    ERIC Educational Resources Information Center

    Titrek, Osman; Cobern, William W.

    2011-01-01

    The process of modernization began in Turkey under the reform government of Mustafa Kemal Ataturk (1881-1938). Turkey officially became a secular nation seeking to develop a modern economy with modern science and technology and political democracy. Turkey also has long been, and remains, a deeply religious society. Specifically, the practice of…

  15. The simplicity principle in perception and cognition.

    PubMed

    Feldman, Jacob

    2016-09-01

    The simplicity principle, traditionally referred to as Occam's razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations -- or, more precisely, that it balances a bias toward simplicity against a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. WIREs Cogn Sci 2016, 7:330-340. doi: 10.1002/wcs.1406. For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
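
    One common formalization of this simplicity/fit balance is an information criterion that adds a complexity penalty to a goodness-of-fit term. The sketch below uses the Bayesian Information Criterion to pick a polynomial degree for synthetic data; it is an illustrative instance of complexity minimization, not a method from the article itself.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(-1, 1, 40)
        y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.standard_normal(40)  # true degree is 2

        def bic(degree):
            coeffs = np.polyfit(x, y, degree)
            resid = y - np.polyval(coeffs, x)
            n, k = len(x), degree + 1
            # BIC = n * log(RSS/n) + k * log(n): a fit term plus a simplicity penalty
            return n * np.log(np.mean(resid**2)) + k * np.log(n)

        scores = {d: bic(d) for d in range(6)}
        print(min(scores, key=scores.get))  # typically recovers degree 2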

  16. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.

    PubMed

    McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S

    2015-10-20

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
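
    A brief usage sketch of the core calls mentioned in the abstract (loading a trajectory, minimal RMSD, DSSP secondary-structure assignment); the file names are placeholders for a trajectory/topology pair you supply.

        import mdtraj as md

        traj = md.load('trajectory.dcd', top='topology.pdb')  # many formats supported
        rmsd = md.rmsd(traj, traj, frame=0)                   # minimal RMSD to frame 0
        dssp = md.compute_dssp(traj)                          # per-residue secondary structure

        print(traj)                 # summary of frames, atoms, and residues
        print(rmsd[:5], dssp.shape)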

  17. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories

    PubMed Central

    McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.

    2015-01-01

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642

  18. Introducing Computed Tomography Standards for Age Estimation of Modern Australian Subadults Using Postnatal Ossification Timings of Select Cranial and Cervical Sites.

    PubMed

    Lottering, Nicolene; MacGregor, Donna M; Alston, Clair L; Watson, Debbie; Gregory, Laura S

    2016-01-01

    Contemporary, population-specific ossification timings of the cranium are lacking in current literature due to challenges in obtaining large repositories of documented subadult material, forcing Australian practitioners to rely on North American, arguably antiquated reference standards for age estimation. This study assessed the temporal pattern of ossification of the cranium and provides recalibrated probabilistic information for age estimation of modern Australian children. Fusion status of the occipital and frontal bones, atlas, and axis was scored using a modified two- to four-tier system from cranial/cervical DICOM datasets of 585 children aged birth to 10 years. Transition analysis was applied to elucidate maximum-likelihood estimates between consecutive fusion stages, in conjunction with Bayesian statistics to calculate credible intervals for age estimation. Results demonstrate significant sex differences in skeletal maturation (p < 0.05) and earlier timings in comparison with major literary sources, underscoring the requisite of updated standards for age estimation of modern individuals. © 2015 American Academy of Forensic Sciences.
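
    For intuition, transition analysis can be reduced to fitting a sigmoidal model of fusion probability against age and reading off the maximum-likelihood transition age. The toy sketch below does exactly that on synthetic scores; it simplifies the study's two- to four-tier scoring to binary fused/unfused and is not the authors' code or data.

        import numpy as np
        from scipy.optimize import minimize

        ages = np.array([0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 5, 6, 7, 8])
        fused = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1])  # invented scores

        def neg_log_lik(params):
            mu, s = params
            # Logistic transition model; exp(s) keeps the scale positive
            p = 1.0 / (1.0 + np.exp(-(ages - mu) / np.exp(s)))
            return -np.sum(fused * np.log(p) + (1 - fused) * np.log(1 - p))

        fit = minimize(neg_log_lik, x0=[3.0, 0.0])
        print('estimated transition age:', fit.x[0])  # age at 50% fusion probability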

  19. A Web-based interface to calculate phonotactic probability for words and nonwords in Modern Standard Arabic.

    PubMed

    Aljasser, Faisal; Vitevitch, Michael S

    2018-02-01

    A number of databases (Storkel, Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce, Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/.
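
    The positional-frequency computation behind such calculators (following Vitevitch & Luce) can be illustrated in a few lines. The miniature "corpus" of transliterated forms below is invented for the example; the real calculator is built on an MSA lexical database.

        from collections import Counter, defaultdict

        corpus = ['kataba', 'kutiba', 'maktab', 'kitaab', 'baab']  # toy corpus

        pos_counts = defaultdict(Counter)       # segment counts by word position
        for word in corpus:
            for i, seg in enumerate(word):
                pos_counts[i][seg] += 1

        def positional_probability(word):
            """Per-position segment probabilities relative to the corpus."""
            probs = []
            for i, seg in enumerate(word):
                total = sum(pos_counts[i].values())
                probs.append(pos_counts[i][seg] / total if total else 0.0)
            return probs

        print(positional_probability('kitab'))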

  20. Human anatomy nomenclature rules for the computer age.

    PubMed

    Neumann, Paul E; Baud, Robert; Sprumont, Pierre

    2017-04-01

    Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, as short and simple as possible, and machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.

  1. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  2. Regime, phase and paradigm shifts: making community ecology the basic science for fisheries

    PubMed Central

    Mangel, Marc; Levin, Phillip S.

    2005-01-01

    Modern fishery science, which began in 1957 with Beverton and Holt, is ca. 50 years old. At its inception, fishery science was limited by a nineteenth century mechanistic worldview and by computational technology; thus, the relatively simple equations of population ecology became the fundamental ecological science underlying fisheries. The time has come for this to change and for community ecology to become the fundamental ecological science underlying fisheries. This point will be illustrated with two examples. First, when viewed from a community perspective, excess production must be considered in the context of biomass left for predators. We argue that this is a better measure of the effects of fisheries than spawning biomass per recruit. Second, we shall analyse a simple, but still multi-species, model for fishery management that considers the alternatives of harvest regulations, inshore marine protected areas and offshore marine protected areas. Population or community perspectives lead to very different predictions about the efficacy of reserves. PMID:15713590

  3. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    NASA Astrophysics Data System (ADS)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances to each other. Analysis and experiments show that our approaches only need to compute the distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
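
    The key idea, using the known pairwise distances plus the triangle inequality to discard candidates without computing their expensive distance to the query, can be sketched as follows. This is a generic randomized pivot scheme in the spirit of the paper, not the authors' exact algorithm.

        import random

        def nearest(query, db, dist, pairwise):
            """Find argmin_j dist(query, db[j]) while calling `dist` rarely.
            `pairwise[i][j]` is the precomputed distance between objects i and j."""
            candidates = set(range(len(db)))
            best_i, best_d = None, float('inf')
            while candidates:
                i = random.choice(tuple(candidates))  # random pivot among survivors
                d = dist(query, db[i])                # one expensive on-line distance
                candidates.discard(i)
                if d < best_d:
                    best_i, best_d = i, d
                # Triangle inequality: d(q, j) >= |d(q, i) - d(i, j)|, so any j
                # whose lower bound already reaches best_d is pruned unseen.
                candidates = {j for j in candidates
                              if abs(d - pairwise[i][j]) < best_d}
            return best_i, best_d

        # Example on the real line, where the metric is absolute difference:
        db = [1.0, 4.0, 9.0, 15.0]
        pw = [[abs(a - b) for b in db] for a in db]
        print(nearest(7.0, db, lambda q, x: abs(q - x), pw))  # -> (2, 2.0)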

  4. Gait biomechanics in the era of data science.

    PubMed

    Ferber, Reed; Osis, Sean T; Hicks, Jennifer L; Delp, Scott L

    2016-12-08

    Data science has transformed fields such as computer vision and economics. The ability of modern data science methods to extract insights from large, complex, heterogeneous, and noisy datasets is beginning to provide a powerful complement to the traditional approaches of experimental motion capture and biomechanical modeling. The purpose of this article is to provide a perspective on how data science methods can be incorporated into our field to advance our understanding of gait biomechanics and improve treatment planning procedures. We provide examples of how data science approaches have been applied to biomechanical data. We then discuss the challenges that remain for effectively using data science approaches in clinical gait analysis and gait biomechanics research, including the need for new tools, better infrastructure and incentives for sharing data, and education across the disciplines of biomechanics and data science. By addressing these challenges, we can revolutionize treatment planning and biomechanics research by capitalizing on the wealth of knowledge gained by gait researchers over the past decades and the vast, but often siloed, data that are collected in clinical and research laboratories around the world. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Sciences from below: feminisms, postcolonialities, and modernities.

    PubMed

    Weaver, Harlan

    2010-01-01

    Sandra Harding's newest book, Sciences from Below: Feminisms, Postcolonialities, and Modernities, continues her work in feminist standpoint theory and science and technology studies, asking how we might judge "good" science. Attentive to race, class, gender, and imperialism, Harding critically examines Northern and Southern sciences and technologies by adopting the perspective of those who see from below. This vision from the peripheries lets Harding question stories of modern scientific progress, revealing a multiplicity of "ethnosciences" and critiquing modernity itself. However, while Harding aims to produce knowledge for the North's others by emphasizing woman's experience, she fails to question the category "woman," ignoring contemporary transgender and queer scholarship. Further, it is Harding's care for the North's subjugated others that motivates her writing, revealing that the struggle to achieve the standpoint "from below" so critical to her project is fueled by what her ally Maria Puig de la Bellacasa would term not thinking from, but thinking with, or, more precisely, "thinking with care."

  6. Origins of the historiography of modern Greek science.

    PubMed

    Patiniotis, Manolis

    2008-01-01

    The purpose of the paper is to examine how Greek historians account for the presence of modern scientific ideas in the intellectual environment of eighteenth-century Greek-speaking society. It will also discuss the function of the history of modern Greek science in the context of Greek national historiography. As will be shown, the history of modern Greek science spent most of its life under the shadow of the history of ideas. Despite its seemingly secondary role, however, it occupied a distinctive place within national historiography because it formed the ground upon which different perceptions of the country's European identity converged. In this respect, one of the main goals of this paper is to outline the particular ideological presumptions which shaped the historiography of modern Greek science under different historical circumstances. At the end an attempt will be made to articulate a viewpoint more in line with the recent methodological developments in the history of science.

  7. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

    Data are not only the lifeblood of the geosciences but have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies -- along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system -- are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change, and NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information from them in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "reproducibility or replicability in science," which has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, data providers and the scientific communities will need to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers also need to give scientists an ecosystem that includes data, tools, workflows, and other services needed to perform analytics, integration, interpretation, and synthesis, all in the same environment or platform. Instead of moving data to processing systems near users, as is the tradition, the cloud permits one to bring processing, computing, analysis, and visualization to the data -- so-called data-proximate workbench capabilities, also known as server-side processing. In this talk, I will present the ongoing work at Unidata to facilitate a new paradigm for doing science by offering a suite of tools, resources, and platforms that leverage cloud technologies to address both big data and Open Science/reproducibility challenges. That work includes the development and deployment of new protocols for data access and server-side operations, Docker container images of key applications, JupyterHub Python notebook tools, and cloud-based analysis and visualization capability via the CloudIDV tool to enable reproducible workflows and effectively use the accessed data.

  8. People Interview: Black-tie science gets modern

    NASA Astrophysics Data System (ADS)

    2009-03-01

    INTERVIEW Black-tie science gets modern Baroness Susan Greenfield CBE is director of the Royal Institution and professor of pharmacology at Oxford where she heads a multidisciplinary group studying neurodegenerative disorders. David Smith speaks to her about specialities, keeping busy and how science is changing.

  9. [Elucidating! But how? Insights into the impositions of modern science communication].

    PubMed

    Lehmkuh, Markus

    2015-01-01

    The talk promotes the view that science communication should abandon the claim that scientific information can convince others. This is identified as one of the impositions modern science communication is exposed to. Instead of convin cing others, science communication should focus on identifying societally relevant scientific knowledge and on communicating it accurately and coherently.

  10. On Modern Cosmology and Its Place in Science Education

    ERIC Educational Resources Information Center

    Kragh, Helge

    2011-01-01

    Cosmology in its current meaning of the science of the universe is a topic that attracts as much popular as scientific interest. This paper argues that modern cosmology and its philosophical aspects should have a prominent place in science education. In the context of science teaching a partly historical approach is recommended, in particular an…

  11. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.
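
    As a schematic of the tracking-features-classification pipeline such reviews describe, the sketch below derives per-frame kinematic features from a synthetic tracked path and trains an off-the-shelf classifier; the data, labels, and feature choices are invented stand-ins, not any method from the review.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        xy = np.cumsum(rng.standard_normal((500, 2)), axis=0)  # fake tracked positions

        step = np.diff(xy, axis=0)
        speed = np.linalg.norm(step, axis=1)                   # per-frame speed
        heading = np.arctan2(step[:, 1], step[:, 0])
        turn = np.abs(np.diff(heading))                        # per-frame heading change
        features = np.column_stack([speed[1:], turn])

        # Toy "moving vs. pausing" labels derived from speed, for illustration only
        labels = (speed[1:] > np.median(speed)).astype(int)

        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(features[:400], labels[:400])
        print(clf.score(features[400:], labels[400:]))         # held-out accuracy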

  12. Bethune-Cookman University STEM Research Lab. DOE Renovation Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Herbert W.

    DOE funding was used to renovate 4,500 square feet of aging laboratories and classrooms that support science, engineering, and mathematics disciplines (specifically environmental science and computer engineering). The expansion of the labs was needed to support robotics and environmental science research, and to better accommodate a wide variety of teaching situations. The renovated space includes a robotics laboratory, two multi-use labs, safe spaces for the storage of instrumentation, modern ventilation equipment, and other "smart" learning venues. The renovated areas feature technologies that are environmentally friendly with reduced energy costs. A campus showcase, the laboratories are a reflection of the University's commitment to the environment and to research as a tool for teaching. As anticipated, the labs facilitate the exploration of emerging technologies that are compatible with local and regional economic plans.

  13. The Development of the Foundations of Modern Pedagogy: Paradigmal and Methodological Aspects of Research

    ERIC Educational Resources Information Center

    Dmitrenko, ?amara ?.; Lavryk, Tatjana V.; Yaresko, Ekaterina V.

    2015-01-01

    Changes in various fields of knowledge have influenced pedagogical science. The article explains the structure of the foundations of modern pedagogy through paradigmal and methodological aspects. The foundations of modern pedagogy include a complex of paradigms, the object and subject of the science, general and specific principles, and methods and technologies.…

  14. Also a Centennial Year for Ernest Orlando Lawrence

    Science.gov Websites

    Lawrence pioneered research with multidisciplinary teams of scientists and engineers -- the team-based approach to modern science. He "should be remembered as the inventor of the modern way of doing science," said a Lawrence team member.

  15. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial (NP) complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rule implementation is non-deterministic. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for, to quote Richard Feynman, 'there's plenty of room at the bottom'. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
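
    The Thue rewriting at the heart of the design is easy to simulate serially: apply every rule at every matching position and explore all resulting strings. The toy rules below are invented for illustration; the point of the NUTM is that DNA replication performs this exponential exploration physically in parallel, which a serial simulation cannot match.

        from collections import deque

        rules = [('ab', 'ba'), ('ba', 'ab'), ('aa', 'b')]  # invented Thue rules

        def successors(s):
            """Yield every string reachable from s by one rule application."""
            for lhs, rhs in rules:
                start = 0
                while (i := s.find(lhs, start)) != -1:
                    yield s[:i] + rhs + s[i + len(lhs):]
                    start = i + 1

        def reachable(start, max_steps=4):
            """All strings derivable from `start` in at most `max_steps` rewrites."""
            seen, frontier = {start}, deque([(start, 0)])
            while frontier:
                s, depth = frontier.popleft()
                if depth == max_steps:
                    continue
                for t in successors(s):
                    if t not in seen:
                        seen.add(t)
                        frontier.append((t, depth + 1))
            return seen

        print(sorted(reachable('aab')))  # every computation path's outcome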

  16. Performance of the fusion code GYRO on four generations of Cray computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahey, Mark R

    2014-01-01

    GYRO is a code used for the direct numerical simulation of plasma microturbulence. It has been ported to a variety of modern MPP platforms including several modern commodity clusters, IBM SPs, and Cray XC, XT, and XE series machines. We briefly describe the mathematical structure of the equations, the data layout, and the redistribution scheme. Also, while the performance and scaling of GYRO on many of these systems has been shown before, here we show the comparative performance and scaling on four generations of Cray supercomputers including the newest addition, the Cray XC30. The more recently added hybrid OpenMP/MPI implementation also shows a great deal of promise on custom HPC systems that utilize fast CPUs and proprietary interconnects. Four machines of varying sizes were used in the experiment, all located at the National Institute for Computational Sciences at the University of Tennessee at Knoxville and Oak Ridge National Laboratory. The advantages, limitations, and performance of using each system are discussed.

  17. Highly parallel implementation of non-adiabatic Ehrenfest molecular dynamics

    NASA Astrophysics Data System (ADS)

    Kanai, Yosuke; Schleife, Andre; Draeger, Erik; Anisimov, Victor; Correa, Alfredo

    2014-03-01

    While the adiabatic Born-Oppenheimer approximation tremendously lowers computational effort, many questions in modern physics, chemistry, and materials science require an explicit description of coupled non-adiabatic electron-ion dynamics. Electronic stopping, i.e. the energy transfer of a fast projectile atom to the electronic system of the target material, is a notorious example. We recently implemented real-time time-dependent density functional theory based on the plane-wave pseudopotential formalism in the Qbox/qb@ll codes. We demonstrate that explicit integration using a fourth-order Runge-Kutta scheme is very suitable for modern highly parallelized supercomputers. Applying the new implementation to systems with hundreds of atoms and thousands of electrons, we achieved excellent performance and scalability on a large number of nodes both on the BlueGene-based "Sequoia" system at LLNL as well as the Cray architecture of "Blue Waters" at NCSA. As an example, we discuss our work on computing the electronic stopping power of aluminum and gold for hydrogen projectiles, showing excellent agreement with experiment. These first-principles calculations allow us to gain important insight into the fundamental physics of electronic stopping.
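
    The explicit fourth-order Runge-Kutta propagation referred to above can be illustrated on a toy quantum system: integrating i dpsi/dt = H psi for a small random Hermitian H. This stands in for the plane-wave TDDFT equations only schematically; the Hamiltonian, step size, and step count are invented.

        import numpy as np

        n = 8
        rng = np.random.default_rng(2)
        Hm = rng.standard_normal((n, n))
        H = (Hm + Hm.T) / 2                      # Hermitian model Hamiltonian

        def rhs(psi):
            return -1j * (H @ psi)               # dpsi/dt = -i H psi (atomic units)

        def rk4_step(psi, dt):
            k1 = rhs(psi)
            k2 = rhs(psi + 0.5 * dt * k1)
            k3 = rhs(psi + 0.5 * dt * k2)
            k4 = rhs(psi + dt * k3)
            return psi + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

        psi = np.zeros(n, dtype=complex)
        psi[0] = 1.0                             # start in the first basis state
        for _ in range(1000):
            psi = rk4_step(psi, 1e-3)
        print(abs(np.vdot(psi, psi)))            # norm stays ~1 for small dt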

  18. Your Higgs number - how fundamental physics is connected to technology and societal revolutions

    NASA Astrophysics Data System (ADS)

    Lidström, Suzy; Allen, Roland E.

    2015-03-01

    Fundamental physics, as exemplified by the recently discovered Higgs boson, often appears to be completely disconnected from practical applications and ordinary human life. But this is not really the case, because science, technology, and human affairs are profoundly integrated in ways that are not immediately obvious. We illustrate this by defining a "Higgs number" through overlapping activities. Following three different paths, which end respectively in applications of the World Wide Web, digital photography, and modern electronic devices, we find that most people have a Higgs number of no greater than 3. Specific examples chosen for illustration, with their assigned Higgs numbers, are: LHC experimentalists employing the Worldwide Computing Grid (0) - Timothy Berners-Lee (1) - Marissa Mayer, of Google and Yahoo, and Sheryl Sandberg, of Facebook (2) - users of all web-based enterprises (3). CMS and ATLAS experimentalists (0) - particle detector developers (1) - inventors of CCDs and active-pixel sensors (2) - users of digital cameras and camcorders (3). Philip Anderson (0) - John Bardeen (1) - Jack Kilby (2) - users of personal computers, mobile phones, and all other modern electronic devices (3).

  19. Hands-on Approach to Prepare Specialists in Climate Changes Modeling and Analysis Using an Information-Computational Web-GIS Portal "Climate"

    NASA Astrophysics Data System (ADS)

    Shulgina, T. M.; Gordova, Y. E.; Martynova, Y. V.

    2014-12-01

    The problem of making education relevant to workplace tasks is a key challenge for higher education in the professional field of environmental sciences. To answer this challenge, several new courses for students of the "Climatology" and "Meteorology" specialties were developed and implemented at Tomsk State University, combining theoretical knowledge from up-to-date environmental sciences with computational tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which gave us the opportunity to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational web-GIS platform "Climate" (http://climate.scert.ru/). The platform has a set of tools and databases allowing a researcher to perform climate change analysis for a selected territory. The tools are also used for student trainings, which contain practical tasks on climate modeling and climate change assessment and analysis. Laboratory exercises cover three topics: "Analysis of regional climate changes"; "Analysis of climate extreme indices on the regional scale"; and "Analysis of future climate". They are designed to consolidate students' knowledge of the discipline, to instill the skills to work independently with large amounts of geophysical data using the modern processing and analysis tools of the web-GIS platform "Climate", and to train them to present the results of laboratory work as reports with a statement of the problem, the results of calculations, and logically justified conclusions. Students are thus engaged in the use of modern tools of geophysical data analysis, which strengthens their professional learning. The approach helps to fill the gap between education and the workplace because it offers experience, increases student involvement, and advances the use of modern information and communication tools. Financial support for this research from the RFBR (13-05-12034, 14-05-00502), SB RAS project VIII.80.2.1 and grant of the President of RF (№ 181) is acknowledged.

  20. The International Symposium on Grids and Clouds

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 marks the tenth anniversary of ISGC, which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region into a coherent community. With the continuous support and dedication of the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that produce a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.

  1. Overview of computational structural methods for modern military aircraft

    NASA Technical Reports Server (NTRS)

    Kudva, J. N.

    1992-01-01

    Computational structural methods are essential for designing modern military aircraft. This briefing deals with computational structural methods (CSM) currently used. First a brief summary of modern day aircraft structural design procedures is presented. Following this, several ongoing CSM related projects at Northrop are discussed. Finally, shortcomings in this area, future requirements, and summary remarks are given.

  2. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and the increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves digital capture of data directly from instruments, processing by computers, storage of the results on computers, and publication of only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is opening the processing of high volumes of data to an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high-performance data sets. These technology developments require that scientists become more skilled in data management and/or attain a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals as well as in ICT (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. Data scientists are a critical element of many large-scale earth and space science informatics projects, particularly those tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their career advances. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open-source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  3. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems with this method, it is necessary to use high-performance computing clusters, data storage systems and other, often expensive, computer systems. Access to such resources is often limited and unstable, and is accompanied by various administrative problems. In addition, the variety of software and settings across computing clusters often prevents researchers from using unified program code: the code must be adapted to each configuration of the computing complex. The practical experience of the authors has shown that a special control system for computations, with the possibility of remote use, can greatly simplify simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
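
    The paper describes the system only at the level of its architecture, but the core idea of remotely driving cluster computations can be sketched in a few lines. The sketch below is illustrative only: the host name, SSH key path, job script and the use of a SLURM sbatch command are all assumptions, and paramiko is just one possible transport, not the authors' implementation.

        # Minimal sketch of a remote job-submission helper (hypothetical host and commands).
        import paramiko

        def submit_remote_job(host, user, key_file, job_script):
            """Connect to a (hypothetical) HPC front-end over SSH and submit a job."""
            client = paramiko.SSHClient()
            client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            client.connect(hostname=host, username=user, key_filename=key_file)
            try:
                # 'sbatch' assumes a SLURM-managed cluster; other schedulers differ.
                stdin, stdout, stderr = client.exec_command(f"sbatch {job_script}")
                return stdout.read().decode().strip()  # e.g. "Submitted batch job 12345"
            finally:
                client.close()

        # job_id = submit_remote_job("cluster.example.org", "user", "~/.ssh/id_rsa", "pfc_run.sh")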

  4. Colonizing nature: scientific knowledge, colonial power and the incorporation of India into the modern world-system.

    PubMed

    Baber, Z

    2001-03-01

    In this paper, the roles of scientific knowledge, institutions and colonialism in mutually co-producing each other are analysed. Under the overarching rubric of colonial structures and imperatives, amateur scientists sought to deploy scientific expertise to expand the empire while at the same time taking advantage of opportunities to develop their careers as 'scientists'. The role of a complex interplay of structure and agency in the development of modern science, not just in India but also in Britain, is analysed. Also examined are the role of science and technology in the incorporation of South Asia into the modern world-system, and the consequences of the emergent structures for understanding the trajectory of modern science in post-colonial India. Overall, colonial rule did not simply diffuse modern science from the core to the periphery. Rather, the colonial encounter led to the development of new forms of scientific knowledge and institutions in both the periphery and the core.

  5. The Study on the Core Concepts of Contemporary Sociology of Education and Its Theoretical Construction

    ERIC Educational Resources Information Center

    Qian, Min-hui

    2006-01-01

    Within the sphere of contemporary social sciences, the terms "modernity," "post-modernity" and "globalization" have penetrated, as the core concepts, into various fields of social sciences in a logical way. In constituting the concept of "modernity," sociology of education develops the educational theory, as sociological theory does, into a "grand…

  6. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.

  7. With Great Measurements Come Great Results

    NASA Astrophysics Data System (ADS)

    Williams, Carl

    Measurements are the foundation of science and modern life. Technologies we take for granted every day depend on them: cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can make reliably and what it cannot. At the National Institute of Standards and Technology (NIST) we specialize in making world-class measurements that an incredibly wide range of industries use to continually improve their products: computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. Think of the key technologies developed over the last 100 years: better measurements, standards, or analysis techniques played a role in making them possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices: it's all happening in our labs now. This talk will focus on how metrology creates the future.

  8. Current Developments in Machine Learning Techniques in Biological Data Mining.

    PubMed

    Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail

    2017-01-01

    This supplement is intended to focus on the use of machine learning techniques to generate meaningful information from biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts. Advances in the field of biology have generated massive opportunities for the application of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics problems. They are broadly used to investigate the underlying mechanisms leading to specific diseases, as well as in the biomarker discovery process. With growth in this area of science comes the need for access to up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers across the various applications of machine learning to mining biological data.

  9. Climate Analytics as a Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.

    2014-01-01

    Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
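
    MERRA/AS runs its MapReduce analytics inside NASA's infrastructure; purely as an illustration of the pattern it uses, the sketch below computes the mean of a climate variable over chunked data with explicit map and reduce steps. The chunk values are made-up placeholders, not MERRA data.

        # Illustrative map/reduce over chunked climate data (hypothetical values).
        from functools import reduce

        # Hypothetical chunks of a climate variable (e.g. near-surface temperature, K).
        chunks = [[280.1, 281.4, 279.9], [282.0, 280.5], [278.8, 281.1, 280.0]]

        # Map: each chunk independently emits a partial (sum, count) pair.
        partials = map(lambda c: (sum(c), len(c)), chunks)

        # Reduce: combine the partial pairs into a global (sum, count), then finish.
        total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), partials)
        print(f"mean = {total / count:.2f} K")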

  10. dREL: a relational expression language for dictionary methods.

    PubMed

    Spadaccini, Nick; Castleden, Ian R; du Boulay, Doug; Hall, Sydney R

    2012-08-27

    The provision of precise metadata is an important but largely underrated challenge for modern science [Nature 2009, 461, 145]. We describe here a dictionary methods language, dREL, designed to enable complex data relationships to be expressed as formulaic scripts in data dictionaries written in DDLm [Spadaccini and Hall, J. Chem. Inf. Model. 2012, doi:10.1021/ci300075z]. dREL describes data relationships in a simple but powerful canonical form that is easy to read and understand and can be executed computationally to evaluate or validate data. The execution of dREL expressions is not a substitute for traditional scientific computation; rather, it provides precise data-dependency information for domain-specific definitions and a means of cross-validating data. Some scientific fields apply conventional programming languages to methods scripts, but these tend to inhibit both dictionary development and accessibility. dREL removes the programming barrier and encourages the production of the metadata needed for seamless data archiving and exchange in science.

  11. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and command-line tools are offered to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also aims to offer federated instances that can be customized to the sites and research performed. Data are stored as JSON, extending upon previous approaches that used XML, in structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.
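
    The abstract describes the JSON storage and RESTful APIs only in general terms; the snippet below merely sketches that style of interaction. The endpoint URL and the JSON fields are hypothetical, not the platform's actual schema.

        # Hypothetical RESTful fetch of a computational-chemistry record stored as JSON.
        import requests

        BASE = "https://chemdata.example.org/api/v1"  # hypothetical endpoint

        resp = requests.get(f"{BASE}/calculations/42")
        resp.raise_for_status()
        calc = resp.json()  # the JSON body decodes directly into Python dicts/lists

        # Hypothetical fields, loosely mirroring a molecule plus a calculation result:
        # {"molecule": {"atoms": ["O", "H", "H"]}, "method": "DFT/B3LYP",
        #  "energy_hartree": -76.4089}
        print(calc["method"], calc["energy_hartree"])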

  12. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and command-line tools are offered to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also aims to offer federated instances that can be customized to the sites and research performed. Data are stored as JSON, extending upon previous approaches that used XML, in structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.

  13. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and command-line tools are offered to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also aims to offer federated instances that can be customized to the sites and research performed. Data are stored as JSON, extending upon previous approaches that used XML, in structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.

  14. FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science

    NASA Astrophysics Data System (ADS)

    Chikyo, Toyohiro

    2011-10-01

    About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed earlier. The most famous, reported in 1970 by Hanak, prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in modern combinatorial materials research, which relies strongly on computer data analysis and robotics. The field is still at the developing stage and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research, describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.

  15. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE PAGES

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...

    2017-10-25

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  16. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  17. Petascale supercomputing to accelerate the design of high-temperature alloys

    NASA Astrophysics Data System (ADS)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen

    2017-12-01

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ‧-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
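
    The final step described above, predicting segregation energies without new physics-based simulations, can be sketched with an off-the-shelf regressor. The descriptors and target values below are synthetic placeholders standing in for the paper's DFT-derived dataset, and the model choice is likewise illustrative.

        # Sketch: learn segregation energies from simple solute descriptors (synthetic data).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Hypothetical descriptors per solute: atomic radius, electronegativity, valence.
        X = rng.uniform([1.0, 0.8, 1.0], [2.2, 2.6, 6.0], size=(34, 3))
        # Toy target: a made-up linear rule plus noise, standing in for DFT energies (eV).
        y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.02, 34)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("R^2 on held-out solutes:", round(model.score(X_te, y_te), 2))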

  18. MODERN SCIENCE. INSTRUCTIONAL GUIDE FOR SENIOR HIGH SCHOOL.

    ERIC Educational Resources Information Center

    RICE, GLORIA; AND OTHERS

    ELEVEN UNITS OF STUDY INCLUDE--SCIENCE IN OUR LIVES TODAY, APPLIED CHEMISTRY, MODERN MATERIALS, MAN AND MECHANICS, HEAT AND FUELS, NUCLEAR ENERGY, SOUND, LIGHT, ELECTRICITY, ELECTRONICS, AND SPACE. ALL ARE DIRECTED AT THE STUDENT WHO WOULD USE THE INFORMATION GAINED IN EVERYDAY LIFE, RATHER THAN AT THE POTENTIAL SCIENCE STUDENT. UNIT 1 EXPLAINS…

  19. Valuing Science: A Turkish-American comparison

    NASA Astrophysics Data System (ADS)

    Titrek, Osman; Cobern, William W.

    2011-02-01

    The process of modernization began in Turkey under the reform government of Mustafa Kemal Ataturk (1881-1938). Turkey officially became a secular nation seeking to develop a modern economy, with modern science and technology, and political democracy. Turkey also has long been, and remains, a deeply religious society. Specifically, the practice of Islam is widespread, which raises an important question: will the path of modernization in Turkey look more like the American pattern or the European one, given that Europeans are much more philosophically secular than Americans? One way to examine this question is to look at how people value science vis-à-vis other important aspects of society and culture. Hence, our study is a comparative look at Turkish and American opinions about science. American society, which is certainly very modern, is of particular interest in Turkey given the significant religiosity of the American people, which makes the American and Turkish societies similar on at least this one significant point. Although we do not have comparable European data at this time, our Turkish-American comparison can be suggestive of whether or not Turkey is likely to follow the American pattern of a highly modernized yet deeply religious society.

  20. First principles statistical mechanics of alloys and magnetism

    NASA Astrophysics Data System (ADS)

    Eisenbach, Markus; Khan, Suffian N.; Li, Ying Wai

    Modern high-performance computing resources are enabling the exploration of the statistical physics of phase spaces of increasing size and with higher fidelity of the system Hamiltonian. For selected systems, this now allows the combination of density-functional-based first-principles calculations with classical Monte Carlo methods for parameter-free, predictive thermodynamics of materials. We combine our locally self-consistent real-space multiple scattering method for solving the Kohn-Sham equation with Wang-Landau Monte Carlo calculations (WL-LSMS). In the past we have applied this method to the calculation of Curie temperatures in magnetic materials. Here we will present direct calculations of the chemical order-disorder transitions in alloys. We present our calculated transition temperature for the chemical ordering in CuZn and the temperature dependence of the short-range order parameter and specific heat. Finally we will present the extension of the WL-LSMS method to magnetic alloys, allowing investigation of the interplay of magnetism, structure and chemical order in ferrous alloys. This research was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division, and it used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory.
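
    As a self-contained illustration of the Wang-Landau side of WL-LSMS, the sketch below estimates the density of states of a small 2D Ising model; the real method replaces this toy Hamiltonian with first-principles LSMS energies. The lattice size, update counts and flatness criterion are arbitrary choices for the sketch.

        # Minimal Wang-Landau sketch for a small 2D Ising model (illustration only).
        import numpy as np

        L = 4                                   # toy lattice; WL-LSMS uses DFT energies
        spins = np.random.choice([-1, 1], size=(L, L))

        def energy(s):
            # Nearest-neighbour Ising energy with periodic boundaries (full recompute
            # keeps the sketch short; production codes update the energy locally).
            return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

        E_levels = np.arange(-2 * L * L, 2 * L * L + 1, 4)  # attainable energy grid
        index = {E: i for i, E in enumerate(E_levels)}
        ln_g = np.zeros(len(E_levels))          # running log of the density of states
        hist = np.zeros(len(E_levels))
        ln_f = 1.0                              # modification factor, ln f
        E = energy(spins)

        while ln_f > 1e-4:
            for _ in range(10000):
                i, j = np.random.randint(L, size=2)
                spins[i, j] *= -1               # propose a single spin flip
                E_new = energy(spins)
                # Accept with probability min(1, g(E_old) / g(E_new)).
                if np.log(np.random.rand()) < ln_g[index[E]] - ln_g[index[E_new]]:
                    E = E_new
                else:
                    spins[i, j] *= -1           # revert the flip
                ln_g[index[E]] += ln_f
                hist[index[E]] += 1
            visited = hist[hist > 0]
            if visited.min() > 0.8 * visited.mean():  # crude flatness criterion
                hist[:] = 0
                ln_f /= 2.0                     # refine the modification factor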

  1. The challenges of developing computational physics: the case of South Africa

    NASA Astrophysics Data System (ADS)

    Salagaram, T.; Chetty, N.

    2013-08-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this within a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative consequences for research, commerce and industry.

  2. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    The objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big-data processing and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident-data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized I/O), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis data will be used for NAIADS demonstration and performance tests in compute-cloud and cluster environments.
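
    NAIADS builds on the CLARA framework integrated with ZeroMQ; the snippet below is a generic illustration of ZeroMQ-style streaming of data events between services, not NAIADS code. The port number and message fields are hypothetical, and the producer and worker are meant to run as separate processes.

        # Generic ZeroMQ push/pull illustration of streaming data "events" between
        # services (not NAIADS itself; port and payload fields are hypothetical).
        import zmq

        def producer(n_events=3, port=5557):
            sock = zmq.Context.instance().socket(zmq.PUSH)
            sock.bind(f"tcp://*:{port}")
            for i in range(n_events):
                # Each "event" bundles coincident multi-sensor records.
                sock.send_json({"event_id": i, "sciamachy": "...", "modis": "..."})

        def worker(n_events=3, port=5557):
            sock = zmq.Context.instance().socket(zmq.PULL)
            sock.connect(f"tcp://localhost:{port}")
            for _ in range(n_events):
                event = sock.recv_json()
                print("processing event", event["event_id"])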

  3. Science Thought and Practices: A Professional Development Workshop on Teaching Scientific Reasoning, Mathematical Modeling and Data Analysis

    NASA Astrophysics Data System (ADS)

    Robbins, Dennis; Ford, K. E. Saavik

    2018-01-01

    The NSF-supported “AstroCom NYC” program, a collaboration of the City University of New York and the American Museum of Natural History (AMNH), has developed and offers hands-on workshops to undergraduate faculty on teaching science thought and practices. These professional development workshops emphasize a curriculum and pedagogical strategies that use computers and other digital devices in a laboratory environment to teach students fundamental topics, including proportional reasoning, control-of-variables thinking, experimental design, hypothesis testing, reasoning with data, and drawing conclusions from graphical displays. The topics addressed here are rarely taught in depth during the formal undergraduate years and are frequently learned only after several apprenticeship research experiences. The goal of these workshops is to provide working and future faculty with an interactive experience in science learning and teaching using modern technological tools.

  4. Applying colour science in colour design

    NASA Astrophysics Data System (ADS)

    Luo, Ming Ronnier

    2006-06-01

    Although colour science has been widely used in a variety of industries over the years, it has not been fully explored in the field of product design. This paper will initially introduce the three main application fields of colour science: colour specification, colour-difference evaluation and colour appearance modelling. By integrating these advanced colour technologies with modern colour imaging devices such as displays, cameras, scanners and printers, computer systems have recently been developed to assist designers in designing colour palettes through colour selection by means of a number of widely used colour order systems, in creating harmonised colour schemes via a categorical colour system, in generating emotion colours using various colour-emotion scales, and in facilitating colour naming via a colour-name library. All systems are also capable of providing accurate colour representation on displays and output to different imaging devices such as printers.
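
    As a concrete instance of the colour-difference evaluation mentioned above, the classic CIE76 formula takes the Euclidean distance between two colours in CIELAB space (later formulas such as CIEDE2000 refine it). A minimal sketch:

        # CIE76 colour difference: Euclidean distance between two CIELAB colours.
        import math

        def delta_e_cie76(lab1, lab2):
            L1, a1, b1 = lab1
            L2, a2, b2 = lab2
            return math.sqrt((L1 - L2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)

        # A difference of roughly 2.3 is often quoted as a just-noticeable difference.
        print(round(delta_e_cie76((50.0, 2.6, -79.7), (50.0, 0.0, -82.7)), 2))  # ~3.97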

  5. Density functional theory in the solid state

    PubMed Central

    Hasnip, Philip J.; Refson, Keith; Probert, Matt I. J.; Yates, Jonathan R.; Clark, Stewart J.; Pickard, Chris J.

    2014-01-01

    Density functional theory (DFT) has been used in many fields of the physical sciences, but none so successfully as in the solid state. From its origins in condensed matter physics, it has expanded into materials science, high-pressure physics and mineralogy, solid-state chemistry and more, powering entire computational subdisciplines. Modern DFT simulation codes can calculate a vast range of structural, chemical, optical, spectroscopic, elastic, vibrational and thermodynamic phenomena. The ability to predict structure–property relationships has revolutionized experimental fields, such as vibrational and solid-state NMR spectroscopy, where it is the primary method to analyse and interpret experimental spectra. In semiconductor physics, great progress has been made in the electronic structure of bulk and defect states despite the severe challenges presented by the description of excited states. Studies are no longer restricted to known crystallographic structures. DFT is increasingly used as an exploratory tool for materials discovery and computational experiments, culminating in ex nihilo crystal structure prediction, which addresses the long-standing difficult problem of how to predict crystal structure polymorphs from nothing but a specified chemical composition. We present an overview of the capabilities of solid-state DFT simulations in all of these topics, illustrated with recent examples using the CASTEP computer program. PMID:24516184

  6. Diagnosis of the Computer-Controlled Milling Machine, Definition of the Working Errors and Input Corrections on the Basis of Mathematical Model

    NASA Astrophysics Data System (ADS)

    Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.

    2016-10-01

    Machinery and equipment improve as science and technology advance, and requirements for quality and longevity rise with them. In particular, the requirements for surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Production of oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the technical and electrical parts of the machine, as well as of the processing procedure. The technical part of a machine wears during operation, and mathematical errors accumulate in the electrical part. These shortcomings in any part of the metalworking equipment affect the manufacturing process as a whole and, as a result, lead to flaws.

  7. [Possibilities of use of digital imaging in forensic medicine].

    PubMed

    Gaval'a, P; Ivicsics, I; Mlynár, J; Novomeský, F

    2005-07-01

    Based on daily practice with digital photography and documentation, the authors point out the benefits of implementing computer technologies in the practice of forensic medicine. Modern imaging methods, especially digital photography, offer a wide spectrum of uses in forensic medicine: digital documentation and archiving of autopsy findings, immediate consultation of findings with other experts via the Internet, and many others. Another possibility is the creation of a digital photographic atlas of forensic medicine as a useful aid in pre- and postgraduate study. Thus the application of state-of-the-art computer technologies to forensic medicine discloses previously unknown possibilities for the further development of this discipline of the human medical sciences.

  8. Proceedings of USC (University of Southern California) Workshop on VLSI (Very Large Scale Integration) & Modern Signal Processing, held at Los Angeles, California on 1-3 November 1982

    DTIC Science & Technology

    1983-11-15

    Concurrent Algorithms", A. Cremers , Dortmund University, West Germany, and T. Hibbard, JPL, Pasadena, CA 64 "An Overview of Signal Representations in...n O f\\ n O P- A -> Problem-oriented specification of concurrent algorithms Armin B. Cremers and Thomas N. Hibbard Preliminary version September...1982 s* Armin B. Cremers Computer Science Department University of Dortmund P.O. Box 50 05 00 D-4600 Dortmund 50 Fed. Rep. Germany

  9. Dreams and creative problem-solving.

    PubMed

    Barrett, Deirdre

    2017-10-01

    Dreams have produced art, music, novels, films, mathematical proofs, and designs for architecture, telescopes, and computers. Dreaming is essentially our brain thinking in another neurophysiologic state, and therefore it is likely to solve some problems on which our waking minds have become stuck. This neurophysiologic state is characterized by high activity in brain areas associated with imagery, so problems requiring vivid visualization are especially likely to get help from dreaming. This article reviews great historical dreams and modern laboratory research to suggest how dreams can aid creativity and problem-solving. © 2017 New York Academy of Sciences.

  10. How Political Science Became Modern: Racial Thought and the Transformation of the Discipline, 1880-1930

    ERIC Educational Resources Information Center

    Blatt, Jessica

    2009-01-01

    This dissertation argues that changing ideas about race and engagement with race science were at the heart of a major transformation of political science in the 1920s, a transformation that I characterize as "becoming modern." This transformation was at once conceptual--visible in the basic categories and theoretical apparatus of the…

  11. Effects of a Science Education Module on Attitudes towards Modern Biotechnology of Secondary School Students

    ERIC Educational Resources Information Center

    Klop, Tanja; Severiens, Sabine E.; Knippels, Marie-Christine P. J.; van Mil, Marc H. W.; Ten Dam, Geert T. M.

    2010-01-01

    This article evaluated the impact of a four-lesson science module on the attitudes of secondary school students. This science module (on cancer and modern biotechnology) utilises several design principles, related to a social constructivist perspective on learning. The expectation was that the module would help students become more articulate in…

  12. The Development of Sociocultural Competence with the Help of Computer Technology

    ERIC Educational Resources Information Center

    Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.

    2017-01-01

    The article describes the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…

  13. [Postmodernism and the issue of nursing].

    PubMed

    Kong, Byung-Hye

    2004-06-01

    The purpose of this study was to illustrate the main stream of postmodernism that has influenced theory and research in nursing science, and then to consider the meaning and value of the postmodern perspective for nursing science in the 21st century. Derrida's and Foucault's philosophical thought, which characterized postmodernism, was studied through interpretation of their major works. Based on their philosophy, it was shown how Derrida's ideas could be applied in deconstructing the core paradigm of modern nursing science. In terms of Foucault's post-structuralism, a reinterpretation of nursing science in relation to power/knowledge was completed. Postmodernism created multiple and diverse paradigms of nursing theory as well as nursing research. This was accomplished by deconstructing the modernism of nursing science, which was based on positivism and medical-cure centralism. Specifically, the post-structuralist perspective revealed issues around the relationship of power and knowledge that dominated and produced modern nursing science. Contemporary nursing science accepts pluralism and needs no unitary meta-paradigm to reintegrate its multiple and diverse paradigms. In considering the issue of nursing science in postmodernism, the conclusion can be summarized as follows: postmodern thinking discovers and reveals diverse, latent nursing values that were veiled by the domination of Western modern nursing science, and it motivates the creation of nursing knowledge through conversation in interpersonal relationships, which can contribute practical utility to the caring-healing situation.

  14. Twenty-Five Year Site Plan FY2013 - FY2037

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, William H.

    2012-07-12

    Los Alamos National Laboratory (the Laboratory) is the nation's premier national security science laboratory. Its mission is to develop and apply science and technology to ensure the safety, security, and reliability of the United States (U.S.) nuclear stockpile; reduce the threat of weapons of mass destruction, proliferation, and terrorism; and solve national problems in defense, energy, and the environment. The fiscal year (FY) 2013-2037 Twenty-Five Year Site Plan (TYSP) is a vital component for planning to meet the National Nuclear Security Administration (NNSA) commitment to ensure the U.S. has a safe, secure, and reliable nuclear deterrent. The Laboratory also uses the TYSP as an integrated planning tool to guide development of an efficient and responsive infrastructure that effectively supports the Laboratory's missions and workforce. Emphasizing the Laboratory's core capabilities, this TYSP reflects the Laboratory's role as a prominent contributor to NNSA missions through its programs and campaigns. The Laboratory is aligned with Nuclear Security Enterprise (NSE) modernization activities outlined in the NNSA Strategic Plan (May 2011) which include: (1) ensuring laboratory plutonium space effectively supports pit manufacturing and enterprise-wide special nuclear materials consolidation; (2) constructing the Chemistry and Metallurgy Research Replacement Nuclear Facility (CMRR-NF); (3) establishing shared user facilities to more cost effectively manage high-value, experimental, computational and production capabilities; and (4) modernizing enduring facilities while reducing the excess facility footprint. This TYSP is viewed by the Laboratory as a vital planning tool to develop an efficient and responsive infrastructure. Long range facility and infrastructure development planning are critical to assure sustainment and modernization. Out-year re-investment is essential for sustaining existing facilities, and will be re-evaluated on an annual basis. At the same time, major modernization projects will require new line-item funding. This document is, in essence, a roadmap that defines a path forward for the Laboratory to modernize, streamline, consolidate, and sustain its infrastructure to meet its national security mission.

  15. Discovering indigenous science: Implications for science education

    NASA Astrophysics Data System (ADS)

    Snively, Gloria; Corsiglia, John

    2001-01-01

    Indigenous science relates both to the science knowledge of long-resident, usually oral-culture peoples, and to the science knowledge of all peoples who, as participants in culture, are affected by the worldview and relativist interests of their home communities. This article explores aspects of multicultural science and pedagogy and describes a rich and well-documented branch of indigenous science known to biologists and ecologists as traditional ecological knowledge (TEK). Although TEK has been generally inaccessible, educators can now use a burgeoning science-based TEK literature that documents numerous examples of time-proven, ecologically relevant, and cost-effective indigenous science. Disputes regarding the universality of the standard scientific account are of critical importance for science educators because the definition of science is a de facto gatekeeping device for determining what can be included in a school science curriculum and what cannot. When Western modern science (WMS) is defined as universal it does displace revelation-based knowledge (i.e., creation science); however, it also displaces pragmatic local indigenous knowledge that does not conform with formal aspects of the standard account. Thus, in most science classrooms around the globe, Western modern science has been taught at the expense of indigenous knowledge. However, because WMS has been implicated in many of the world's ecological disasters, and because the traditional wisdom component of TEK is particularly rich in time-tested approaches that foster sustainability and environmental integrity, it is possible that the universalist gatekeeper will come to be seen as increasingly problematic and even counterproductive. This paper describes many examples, from Canada and around the world, of indigenous people's contributions to science, environmental understanding, and sustainability. The authors argue that Western or modern science is just one of many sciences that need to be addressed in the science classroom. We conclude by presenting instructional strategies that can help all science learners negotiate border crossings between Western modern science and indigenous science.

  16. The community FabLab platform: applications and implications in biomedical engineering.

    PubMed

    Stephenson, Makeda K; Dow, Douglas E

    2014-01-01

    Skill development in science, technology, engineering and math (STEM) education presents one of the most formidable challenges of modern society. The Community FabLab platform presents a viable solution. Each FabLab contains a suite of modern computer numerical control (CNC) equipment, electronics and computing hardware, and design, programming, computer aided design (CAD) and computer aided machining (CAM) software. FabLabs are community and educational resources, open to the public. Development of STEM-based workforce skills such as digital fabrication and advanced manufacturing can be enhanced using this platform. Particularly notable is the potential of the FabLab platform in STEM education. The active learning environment engages and supports a diversity of learners, while the iterative learning supported by the FabLab rapid prototyping platform facilitates depth of understanding, creativity, innovation and mastery. The product- and project-based learning that occurs in FabLabs develops in the student a personal sense of accomplishment, self-awareness, and command of the material and technology. This helps build the interest and confidence necessary to excel in STEM and throughout life. Finally, the introduction and use of relevant technologies at every stage of the education process ensures technical familiarity and the broad knowledge base needed for work in STEM-based fields. Biomedical engineering education strives to cultivate broad technical adeptness, creativity, interdisciplinary thought, and an ability to form deep conceptual understanding of complex systems. The FabLab platform is well designed to enhance biomedical engineering education.

  17. How to reconcile the multiculturalist and universalist approaches to science education

    NASA Astrophysics Data System (ADS)

    Hansson, Sven Ove

    2017-06-01

    The "multiculturalist" and "universalist" approaches to science education both fail to recognize the strong continuities between modern science and its forerunners in traditional societies. Various fact-finding practices in indigenous cultures exhibit the hallmarks of scientific investigations, such as collectively achieved rationality, a careful distinction between facts and values, a search for shared, well-founded judgments in empirical matters, and strivings for continuous improvement of these judgments. Prominent examples are hunters' discussions when tracking a prey, systematic agricultural experiments performed by indigenous farmers, and remarkably advanced experiments performed by craftspeople long before the advent of modern science. When the continuities between science and these prescientific practices are taken into account, it becomes obvious that the traditional forms of both multiculturalism and universalism should be replaced by a new approach that dissolves the alleged conflict between adherence to modern science and respect for traditional cultures.

  18. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternative strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research including molecular modeling, cell biology, applied math, evolutionary biology, bioinformatics, etc. The CCB sponsored the first distinguished speaker series at UC Merced, which played an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a new summer undergraduate internship program had been established under CCB to train highly mathematical and computationally oriented Biological Sciences researchers.
    By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  19. Evaluation of a Multicore-Optimized Implementation for Tomographic Reconstruction

    PubMed Central

    Agulleiro, Jose-Ignacio; Fernández, José Jesús

    2012-01-01

    Tomography allows elucidation of the three-dimensional structure of an object from a set of projection images. In life sciences, electron microscope tomography is providing invaluable information about the cell structure at a resolution of a few nanometres. Here, large images are required to combine wide fields of view with high resolution requirements. The computational complexity of the algorithms along with the large image size then turns tomographic reconstruction into a computationally demanding problem. Traditionally, high-performance computing techniques have been applied to cope with such demands on supercomputers, distributed systems and computer clusters. In the last few years, the trend has turned towards graphics processing units (GPUs). Here we present a detailed description and a thorough evaluation of an alternative approach that relies on exploitation of the power available in modern multicore computers. The combination of single-core code optimization, vector processing, multithreading and efficient disk I/O operations succeeds in providing fast tomographic reconstructions on standard computers. The approach turns out to be competitive with the fastest GPU-based solutions thus far. PMID:23139768
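
    The authors' speedups come from single-core optimization, vectorization, multithreading and disk I/O tuning in native code; the sketch below loosely illustrates only the multicore part, farming independent slice reconstructions out to worker processes. The per-slice computation is a stand-in, not the authors' reconstruction kernel.

        # Loose multicore illustration: reconstruct independent slices in parallel.
        import numpy as np
        from multiprocessing import Pool

        def reconstruct_slice(seed):
            # Stand-in per-slice work: collapse fake projections into one row.
            rng = np.random.default_rng(seed)
            sinogram = rng.random((180, 256))    # fake projection data for one slice
            return sinogram.sum(axis=0) / 180.0  # placeholder for real backprojection

        if __name__ == "__main__":
            with Pool() as pool:                 # one worker process per core by default
                slices = pool.map(reconstruct_slice, range(64))
            volume = np.stack(slices)
            print(volume.shape)                  # (64, 256)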

  20. The challenge of cardiac modeling--interaction and integration.

    PubMed

    Sideman, Samuel

    2006-10-01

    The goal of clinical cardiology is to obtain an integrated picture of the interacting parameters of muscle and vessel mechanics, blood circulation and myocardial perfusion, oxygen consumption and energy metabolism, and electrical activation and heart rate, thus relating to the true physiological and pathophysiological characteristics of the heart. Scientific insight into the cardiac physiology and performance is achieved by utilizing life sciences, for example, molecular biology, genetics and related intra- and intercellular phenomena, as well as the exact sciences, for example, mathematics, computer science, and related imaging and visualization techniques. The tools to achieve these goals are based on the intimate interactions between engineering science and medicine and the developments of modern, medically oriented technology. Most significant is the beneficiary effect of the globalization of science, the Internet, and the unprecedented international interaction and scientific cooperation in facing difficult multidisciplined challenges. This meeting aims to explore some important interactions in the cardiac system and relate to the integration of spatial and temporal interacting system parameters, so as to gain better insight into the structure and function of the cardiac system, thus leading to better therapeutic modalities.

  1. Special Section: Complementary and Alternative Medicine (CAM): Acupuncture From Ancient Practice to Modern Science

    MedlinePlus

    A special section on complementary and alternative medicine (CAM) covering acupuncture from ancient practice to modern science. The feature notes the share of U.S. adults who use acupuncture, explains what acupuncture is, and includes a photograph of Dr. Adeline Ge adjusting the placement of acupuncture needles.

  2. An Introduction to Programming for Bioscientists: A Python-Based Primer

    PubMed Central

    Mura, Cameron

    2016-01-01

    Computing has revolutionized the biological sciences over the past several decades, such that virtually all contemporary research in molecular biology, biochemistry, and other biosciences utilizes computer programs. The computational advances have come on many fronts, spurred by fundamental developments in hardware, software, and algorithms. These advances have influenced, and even engendered, a phenomenal array of bioscience fields, including molecular evolution and bioinformatics; genome-, proteome-, transcriptome- and metabolome-wide experimental studies; structural genomics; and atomistic simulations of cellular-scale molecular assemblies as large as ribosomes and intact viruses. In short, much of post-genomic biology is increasingly becoming a form of computational biology. The ability to design and write computer programs is among the most indispensable skills that a modern researcher can cultivate. Python has become a popular programming language in the biosciences, largely because (i) its straightforward semantics and clean syntax make it a readily accessible first language; (ii) it is expressive and well-suited to object-oriented programming, as well as other modern paradigms; and (iii) the many available libraries and third-party toolkits extend the functionality of the core language into virtually every biological domain (sequence and structure analyses, phylogenomics, workflow management systems, etc.). This primer offers a basic introduction to coding, via Python, and it includes concrete examples and exercises to illustrate the language’s usage and capabilities; the main text culminates with a final project in structural bioinformatics. A suite of Supplemental Chapters is also provided. Starting with basic concepts, such as that of a “variable,” the Chapters methodically advance the reader to the point of writing a graphical user interface to compute the Hamming distance between two DNA sequences. PMID:27271528

  3. An Introduction to Programming for Bioscientists: A Python-Based Primer.

    PubMed

    Ekmekci, Berk; McAnany, Charles E; Mura, Cameron

    2016-06-01

    Computing has revolutionized the biological sciences over the past several decades, such that virtually all contemporary research in molecular biology, biochemistry, and other biosciences utilizes computer programs. The computational advances have come on many fronts, spurred by fundamental developments in hardware, software, and algorithms. These advances have influenced, and even engendered, a phenomenal array of bioscience fields, including molecular evolution and bioinformatics; genome-, proteome-, transcriptome- and metabolome-wide experimental studies; structural genomics; and atomistic simulations of cellular-scale molecular assemblies as large as ribosomes and intact viruses. In short, much of post-genomic biology is increasingly becoming a form of computational biology. The ability to design and write computer programs is among the most indispensable skills that a modern researcher can cultivate. Python has become a popular programming language in the biosciences, largely because (i) its straightforward semantics and clean syntax make it a readily accessible first language; (ii) it is expressive and well-suited to object-oriented programming, as well as other modern paradigms; and (iii) the many available libraries and third-party toolkits extend the functionality of the core language into virtually every biological domain (sequence and structure analyses, phylogenomics, workflow management systems, etc.). This primer offers a basic introduction to coding, via Python, and it includes concrete examples and exercises to illustrate the language's usage and capabilities; the main text culminates with a final project in structural bioinformatics. A suite of Supplemental Chapters is also provided. Starting with basic concepts, such as that of a "variable," the Chapters methodically advance the reader to the point of writing a graphical user interface to compute the Hamming distance between two DNA sequences.
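
    Since both versions of the primer culminate in computing the Hamming distance between two DNA sequences, a minimal Python function for that final-project computation is sketched below; the function name and example sequences are illustrative, not taken from the primer.

        def hamming(seq1: str, seq2: str) -> int:
            # Hamming distance: number of positions at which two
            # equal-length sequences differ.
            if len(seq1) != len(seq2):
                raise ValueError("sequences must be the same length")
            return sum(a != b for a, b in zip(seq1, seq2))

        print(hamming("GATTACA", "GACTATA"))  # -> 2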

  4. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets, and very advanced methods are needed to measure and model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
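
    To make two of the valuation methods named above concrete, here is a hedged Python sketch that prices a European call option with the Black-Scholes formula and with a plain Monte Carlo simulation of the terminal stock price under geometric Brownian motion; all parameter values are illustrative.

        import numpy as np
        from scipy.stats import norm

        def black_scholes_call(spot, strike, rate, vol, expiry):
            d1 = (np.log(spot / strike) + (rate + 0.5 * vol**2) * expiry) \
                 / (vol * np.sqrt(expiry))
            d2 = d1 - vol * np.sqrt(expiry)
            return spot * norm.cdf(d1) - strike * np.exp(-rate * expiry) * norm.cdf(d2)

        def monte_carlo_call(spot, strike, rate, vol, expiry, n_paths=100_000, seed=0):
            z = np.random.default_rng(seed).standard_normal(n_paths)
            terminal = spot * np.exp((rate - 0.5 * vol**2) * expiry
                                     + vol * np.sqrt(expiry) * z)
            return np.exp(-rate * expiry) * np.maximum(terminal - strike, 0.0).mean()

        # The two estimates should agree to within Monte Carlo error:
        print(black_scholes_call(100, 105, 0.05, 0.2, 1.0))  # ~8.02
        print(monte_carlo_call(100, 105, 0.05, 0.2, 1.0))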

  5. New directions in the history of modern science in China: global science and comparative history.

    PubMed

    Elman, Benjamin A

    2007-09-01

    These essays collectively present new perspectives on the history of modern science in China since 1900. Fa-ti Fan describes how science under the Republic of China after 1911 exhibited a complex local and international character that straddled both imperialism and colonialism. Danian Hu focuses on the fate of relativity in the physics community in China after 1917. Zuoyue Wang hopes that a less nationalist political atmosphere in China will stimulate more transnational studies of modern science, which will in turn reveal the underlying commonalities in different national contexts. Sigrid Schmalzer compares the socialist and the capitalist contexts for science in China and reopens the sensitive question of the "mass line" during the Cultural Revolution. Grace Shen describes the tensions early Chinese scientists felt when choosing between foreign models for modern geology and their own professional identities in China. Taken together, these accounts present us with a comparative history of modern science in China that is both globally and locally informed.

  6. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design

    NASA Astrophysics Data System (ADS)

    Bennett, R. A.; Lamb, D. A.

    2017-12-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the non-disabled, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp for all students, with a focus on modern geophysical observation systems, computational thinking, data science, and professional development. In this presentation, we will review common pedagogical approaches in geosciences and current efforts to make the field more inclusive. We will review curricular access and inclusivity relative to a wide range of learners and provide examples of accessible course design based on our experiences in teaching a study abroad course in central Italy, and our plans for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  7. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact events following the end of the cold war have driven an increase in the growth rate of computer performance at the high-end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate ratification of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article not only discusses the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).

  8. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  9. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
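
    The kernel-level ('bottom-up') gains reported above come from restructuring hot loops for modern vector hardware. As a loose analogy, written in Python/NumPy rather than the MFIX Fortran, the sketch below shows the same stencil update first as a naive loop and then as a vectorized kernel that exercises the hardware far more efficiently.

        import numpy as np

        def relax_loop(u):
            # Naive element-by-element Jacobi-style update of interior points.
            out = u.copy()
            for i in range(1, u.shape[0] - 1):
                for j in range(1, u.shape[1] - 1):
                    out[i, j] = 0.25 * (u[i-1, j] + u[i+1, j] + u[i, j-1] + u[i, j+1])
            return out

        def relax_vectorized(u):
            # Same update expressed as whole-array slices: one fused sweep.
            out = u.copy()
            out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                      + u[1:-1, :-2] + u[1:-1, 2:])
            return out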

  10. History of mathematics and history of science reunited?

    PubMed

    Gray, Jeremy

    2011-09-01

    For some years now, the history of modern mathematics and the history of modern science have developed independently. A step toward a reunification that would benefit both disciplines could come about through a revived appreciation of mathematical practice. Detailed studies of what mathematicians actually do, whether local or broadly based, have often led in recent work to examinations of the social, cultural, and national contexts, and more can be done. Another recent approach toward a historical understanding of the abstractness of modern mathematics has been to see it as a species of modernism, and this thesis will be tested by the raft of works on the history of modern applied mathematics currently under way.

  11. Contemporary machine learning: techniques for practitioners in the physical sciences

    NASA Astrophysics Data System (ADS)

    Spears, Brian

    2017-10-01

    Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
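
    As a minimal worked example of the supervised-learning workflow and the generalization check described above, the following Python/scikit-learn sketch fits a regression model to synthetic data and scores it on held-out test data; the data-generating function and hyperparameters are illustrative assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(1000, 3))   # controllable inputs
        y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] \
            + 0.1 * rng.standard_normal(1000)    # measurable output plus noise

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)

        # Generalization check: score on data the model has never seen.
        print("train R^2:", r2_score(y_train, model.predict(X_train)))
        print("test  R^2:", r2_score(y_test, model.predict(X_test)))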

  12. The New Alliance between Science and Education: Otto Neurath's Modernity beyond Descartes' "Adamitic" Science

    ERIC Educational Resources Information Center

    Oliverio, Stefano

    2014-01-01

    Starting from a suggestion of Stephen Toulmin and through an interpretation of the criticism to which Neurath, one of the founders of the Vienna Circle, submits Descartes' views on science, the paper attempts to outline a pattern of modernity opposed to the Cartesian one, that has been obtaining over the last four centuries. In particular, it…

  13. Are modern health worries, personality and attitudes to science associated with the use of complementary and alternative medicine?

    PubMed

    Furnham, Adrian

    2007-05-01

    To investigate whether personality traits, modern health worries (MHWs) and attitudes to science predict attitudes to, and beliefs about, complementary and alternative medicine (CAM). This study set out to test whether belief in, and use of, CAM was significantly associated with high levels of MHWs, a high level of neuroticism and sceptical attitudes towards science. Two hundred and forty-three British adults completed a four-part questionnaire that measured MHWs, the Big Five personality traits, beliefs about science and medicine, and attitudes to CAM. There were many gender differences in MHWs (females expressed more), though results were similar to previous studies. Contrary to prediction, personality traits were not related to MHWs, CAM usage or beliefs about CAM. Regular and occasional users of CAM did have higher MHWs than non-users or infrequent users. Those with high total MHW scores also tended to believe in the importance of psychological factors in health and illness, as well as the potential harmful effects of modern medicine. Young males who had positive attitudes to science were least likely to be CAM users. Further, positive attitudes to science were associated with increased scepticism about CAM. Concerns about health and beliefs about modern medicine and CAM are logically interrelated. Those who have high MHWs tend to be more sceptical about modern medicine and more convinced of the possible role of psychological factors in personal health and illness.

  14. Retraining the Modern Civil Engineer.

    ERIC Educational Resources Information Center

    Priscoli, Jerome Delli

    1983-01-01

    Discusses why modern engineering requires social science and the nature of planning. After these conceptional discussions, 12 practical tools which social science brings to engineering are reviewed. A tested approach to training engineers in these tools is then described. Tools include institutional analysis, policy profiling, and other impact…

  15. Computational Chemistry Using Modern Electronic Structure Methods

    ERIC Educational Resources Information Center

    Bell, Stephen; Dines, Trevor J.; Chowdhry, Babur Z.; Withnall, Robert

    2007-01-01

    Various modern electronic structure methods are nowadays used to teach computational chemistry to undergraduate students. Such quantum calculations can now be performed easily, even for large molecules.

  16. Archives and the Boundaries of Early Modern Science.

    PubMed

    Popper, Nicholas

    2016-03-01

    This contribution argues that the study of early modern archives suggests a new agenda for historians of early modern science. While in recent years historians of science have begun to direct increased attention toward the collections amassed by figures and institutions traditionally portrayed as proto-scientific, archives proliferated across early modern Europe, emerging as powerful tools for creating knowledge in politics, history, and law as well as natural philosophy, botany, and more. The essay investigates the methods of production, collection, organization, and manipulation used by English statesmen and Crown officers such as Keeper of the State Papers Thomas Wilson and Secretary of State Joseph Williamson to govern their disorderly collections. Their methods, it is shown, were shared with contemporaries seeking to generate and manage other troves of evidence and in fact reflect a complex ecosystem of imitation and exchange across fields of inquiry. These commonalities suggest that historians of science should look beyond the ancestors of modern scientific disciplines to examine how practices of producing knowledge emerged and migrated throughout cultures of learning in Europe and beyond. Creating such a map of knowledge production and exchange, the essay concludes, would provide a renewed and expansive ambition for the field.

  17. Zilsel's Thesis, Maritime Culture, and Iberian Science in Early Modern Europe.

    PubMed

    Leitão, Henrique; Sánchez, Antonio

    2017-01-01

    Zilsel's thesis on the artisanal origins of modern science remains one of the most original proposals about the emergence of scientific modernity. We propose to inspect the scientific developments in Iberia in the early modern period using Zilsel's ideas as a guideline. Our purpose is to show that his ideas illuminate the situation in Iberia but also that the Iberian case is a remarkable illustration of Zilsel's thesis. Furthermore, we argue that Zilsel's thesis is essentially a sociological explanation that cannot be applied to isolated cases; its use implies global events that involve extended societies over large periods of time.

  18. Scientists and artists: "Hey! You got art in my science! You got science on my art

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elfman, Mary E; Hayes, Birchard P; Michel, Kelly D

    The pairing of science and art has proven to be a powerful combination since the Renaissance. The combination of these two seemingly disparate disciplines ensured that even complex scientific theories could be explored and effectively communicated to both the subject matter expert and the layman. In modern times, science and art have frequently been considered disjoint, with objectives, philosophies, and perspectives often in direct opposition to each other. However, given the technological advances in computer science and high fidelity 3-D graphics development tools, this marriage of art and science is once again logically complementary. Art, in the form of computer graphics and animation created on supercomputers, has already proven to be a powerful tool for improving scientific research and providing insight into nuclear phenomena. This paper discusses the power of pairing artists with scientists and engineers in order to pursue the possibilities of a widely accessible, lightweight, interactive approach. We will use a discussion of photo-realism versus stylization to illuminate the expected beneficial outcome of such collaborations and the societal advantages gained by a non-traditional partnering of these two fields.

  19. From multisensory integration in peripersonal space to bodily self-consciousness: from statistical regularities to statistical inference.

    PubMed

    Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea

    2018-06-06

    Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body, and, as illustrated by numerous illusions, it scaffolds the subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Importantly, certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, leveraging the understanding of multisensory processes generally, promises to advance scientific comprehension regarding one of the most mysterious questions puzzling humankind, that is, how our brain creates the experience of a self in interaction with the environment. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
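
    A minimal sketch of divisive normalization, the systems-level multisensory computation mentioned above, is given below in Python; the exponent and semi-saturation constant are illustrative assumptions, not values from the article.

        import numpy as np

        def divisive_normalization(drive, sigma=1.0, n=2.0):
            # R_i = d_i**n / (sigma**n + sum_j d_j**n): each unit's response is
            # divided by the pooled activity of the whole population.
            d = np.asarray(drive, dtype=float) ** n
            return d / (sigma ** n + d.sum())

        unimodal = divisive_normalization([5.0, 0.0])   # e.g. visual input alone
        bimodal  = divisive_normalization([5.0, 5.0])   # visual plus tactile input
        print(unimodal, bimodal)  # per-unit responses shrink as pooled activity grows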

  20. Mobile high-performance computing (HPC) for synthetic aperture radar signal processing

    NASA Astrophysics Data System (ADS)

    Misko, Joshua; Kim, Youngsoo; Qi, Chenchen; Sirkeci, Birsen

    2018-04-01

    The importance of mobile high-performance computing has emerged in numerous battlespace applications at the tactical edge in hostile environments. Energy-efficient computing power is a key enabler for diverse areas ranging from real-time big data analytics and atmospheric science to network science. However, the design of tactical mobile data centers is dominated by power, thermal, and physical constraints. Presently, it is very difficult to achieve the required processing power by aggregating emerging heterogeneous many-core processing platforms consisting of CPU, Field Programmable Gate Array, and graphics processor cores under power and performance constraints. To address these challenges, we performed a Synthetic Aperture Radar case study for Automatic Target Recognition (ATR) using Deep Neural Networks (DNNs). These DNN models are typically trained using GPUs with gigabytes of external memory and rely heavily on 32-bit floating-point operations. As a result, DNNs do not run efficiently on hardware appropriate for low-power or mobile applications. To address this limitation, we proposed a framework for compressing DNN models for ATR suited to deployment on resource-constrained hardware. The proposed compression framework utilizes promising DNN compression techniques, including pruning and weight quantization, while also focusing on processor features common to modern low-power devices. Following this methodology as a guideline produced a DNN for ATR tuned to maximize classification throughput, minimize power consumption, and minimize memory footprint on a low-power device.
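
    The two compression techniques named in the abstract, pruning and weight quantization, can be sketched on a single weight matrix as follows; this is a hedged NumPy illustration, and the sparsity level and bit width are assumptions, not the authors' settings.

        import numpy as np

        def prune_by_magnitude(w, sparsity=0.9):
            # Zero out the smallest-magnitude fraction of the weights.
            threshold = np.quantile(np.abs(w), sparsity)
            return np.where(np.abs(w) < threshold, 0.0, w)

        def quantize_uniform(w, bits=8):
            # Map weights onto 2**bits uniform levels and back (simulated
            # quantization, as used to study accuracy/footprint trade-offs).
            scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
            return np.round(w / scale) * scale

        w = np.random.default_rng(0).standard_normal((256, 256))
        w_compressed = quantize_uniform(prune_by_magnitude(w))
        print("nonzero fraction:", np.count_nonzero(w_compressed) / w_compressed.size)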

  1. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity of computers, which initially limited their use, have consistently decreased since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we will examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.

  2. Web-GIS platform for monitoring and forecasting of regional climate and ecological changes

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.

    2012-12-01

    The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, decreasing the reliability of analysis results. However, modern geophysical data processing techniques allow different technological solutions to be combined when organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches for the development of internet-accessible thematic information-computational systems, and arranging the interchange of data and knowledge between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. An experimental software and hardware platform is presented that supports the operation of a web-oriented production and research center for regional climate change investigations, combining a modern web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of undergraduate and post-graduate students. The platform software developed (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for the numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, runs, and visualization of results for the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal from a common graphical web browser, via an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified graphical web interface.

  3. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  4. Development of AN Open-Source Automatic Deformation Monitoring System for Geodetical and Geotechnical Measurements

    NASA Astrophysics Data System (ADS)

    Engel, P.; Schweimler, B.

    2016-04-01

    The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the supervised objects over long periods. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition to them, a research team at the Neubrandenburg University of Applied Sciences (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open-source licence and free of charge. The task of managing an open-source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It also discusses how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.
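
    As a small illustration of the kind of computation such a monitoring package automates, the hedged Python sketch below compares two measurement epochs of target coordinates and flags displacements beyond a tolerance; the names, data, and tolerance are hypothetical and not taken from the NUAS software.

        import numpy as np

        def flag_displacements(epoch0, epoch1, tolerance=0.005):
            # epochs: (n_targets, 3) arrays of XYZ coordinates in metres.
            shift = np.linalg.norm(np.asarray(epoch1) - np.asarray(epoch0), axis=1)
            return shift, shift > tolerance

        epoch0 = [[10.000, 5.000, 2.000], [20.000, 5.000, 2.000]]
        epoch1 = [[10.001, 5.000, 2.000], [20.012, 4.996, 2.000]]
        print(flag_displacements(epoch0, epoch1))  # second target exceeds 5 mm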

  5. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  6. (Re)cognizing postmodernity: helps for historians--of science especially.

    PubMed

    Forman, Paul

    2010-06-01

    Postmodernity, a historical era demarcated from modernity by a broad reversal in cultural presuppositions, is distinguished from postmodernism, an intellectual posture adopted by self-identified postmodernists early in postmodernity. Two principal features of postmodernity are addressed: first, the downgrading of science and the upgrading of technology in cultural rank--on which postmodernity and postmodernism are in accord; second, the displacement of the methodical, disinterested scientist, modernity's beau ideal, not by a fragmented subject as postmodernism claims, but by the single-minded entrepreneur, resourcefully pursuing his self-interest in disregard of all rules. The reversal in rank and role as between science and technology, setting in circa 1980, is a marker of the transition from modernity to postmodernity. That reversal is to be cognized primarily as rejection of rule-following, of proceeding methodically--'methodism' being the cultural perspective that uniquely distinguished modernity--but also as rejection of disinterestedness, the quality of mind especially highly esteemed in modernity. Postmodernity is constituted by this transvaluation of values, whose well-spring is the egocentric, transgressive (hence 'risk taking'), postmodern personality and its anti-social presumptions regarding personhood. Within the history of science itself there has been since circa 1980 a corresponding turn of scholarly attention away from science to technology, and a growing distaste for social perspectives, reflected, i.a., in the rejection of causalist 'influence' explanations in favor of voluntarist 'resource' explanations.

  7. Managing the technological edge: the UNESCO International Computation Centre and the limits to the transfer of computer technology, 1946-61.

    PubMed

    Nofre, David

    2014-07-01

    The spread of the modern computer is assumed to have been a smooth process of technology transfer. This view relies on an assessment of the open circulation of knowledge ensured by the US and British governments in the early post-war years. This article presents new historical evidence that questions this view. At the centre of the article lies the ill-fated establishment of the UNESCO International Computation Centre. The project was initially conceived in 1946 to provide advanced computation capabilities to scientists of all nations. It soon became a prize sought by Western European countries such as The Netherlands and Italy, which were seeking to speed up their own national research programs. Nonetheless, as the article explains, the US government's limitations on the research function of the future centre resulted in the withdrawal of European support for the project. These limitations illustrate the extent to which US foreign science policy could operate as (stealth) industrial policy to secure a competitive technological advantage and the prospects of US manufacturers in a future European market.

  8. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    NASA Astrophysics Data System (ADS)

    Chen, Goong; Wang, Yi-Ching; Perronnet, Alain; Gu, Cong; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O.

    2017-03-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not likely to be correct. We discuss the effects of terrain on pulverization using the information from the recovered flight-data-recorder and show our forensics and assessments of what may have happened during the final moments of the crash. Finally, we point out that our study has potential for being made into real-time flight crash simulators to help the study of crashworthiness and survivability for future aviation safety. Some forward-looking statements are also made.

  9. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  10. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191
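
    The computational modeling applications listed above (simulation, inference, model-fitting, and analytical properties) can be mirrored outside the Distributome webapps with a few lines of Python using SciPy; the gamma family and its parameters are chosen arbitrarily for illustration.

        from scipy import stats

        dist = stats.gamma(a=2.0, scale=3.0)          # pick a family and parameters
        sample = dist.rvs(size=5000, random_state=0)  # simulation
        print(dist.mean(), dist.pdf(4.0))             # analytical properties
        print(stats.gamma.fit(sample, floc=0))        # model-fitting / inference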

  11. Crossing over...Markov meets Mendel.

    PubMed

    Mneimneh, Saad

    2012-01-01

    Chromosomal crossover is a biological mechanism to combine parental traits. It is perhaps the first mechanism ever taught in any introductory biology class. The formulation of crossover, and resulting recombination, came about 100 years after Mendel's famous experiments. To a great extent, this formulation is consistent with the basic genetic findings of Mendel. More importantly, it provides a mathematical insight for his two laws (and corrects them). From a mathematical perspective, and while it retains similarities, genetic recombination guarantees diversity so that we do not rapidly converge to the same being. It is this diversity that made the study of biology possible. In particular, the problem of genetic mapping and linkage, one of the first efforts towards a computational approach to biology, relies heavily on the mathematical foundation of crossover and recombination. Nevertheless, as students we often overlook the mathematics of these phenomena. Emphasizing the mathematical aspect of Mendel's laws through crossover and recombination will prepare the students to make an early realization that biology, in addition to being experimental, IS a computational science. This can serve as a first step towards a broader curricular transformation in teaching biological sciences. I will show that a simple and modern treatment of Mendel's laws using a Markov chain will make this step possible, and it will only require basic college-level probability and calculus. My personal teaching experience confirms that students WANT to know Markov chains because they hear about them from bioinformaticists all the time. This entire exposition is based on three homework problems that I designed for a course in computational biology. A typical reader is, therefore, an instructional staff member or a student in a computational field (e.g., computer science, mathematics, statistics, computational biology, bioinformatics). However, other students may easily follow by omitting the mathematically more elaborate parts. I kept those as separate sections in the exposition.
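
    In the spirit of the exposition's Markov-chain treatment, though not its actual homework problems, the Python sketch below models the parental origin of successive loci on a gamete as a two-state chain in which a crossover between adjacent loci switches the state with probability r.

        import numpy as np

        def simulate_gamete(n_loci, r, seed=None):
            # Return 0/1 labels: which parental chromosome each locus came from.
            rng = np.random.default_rng(seed)
            origin = [int(rng.integers(2))]        # start on either chromosome
            for _ in range(n_loci - 1):
                crossover = rng.random() < r       # Markov step: switch with prob. r
                origin.append(origin[-1] ^ int(crossover))
            return origin

        # Loosely linked loci (r near 0.5) approach independent assortment;
        # tightly linked loci (small r) tend to be inherited together.
        print(simulate_gamete(10, r=0.1, seed=1))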

  12. Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition

    NASA Astrophysics Data System (ADS)

    Fitch, W. Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.

  13. Toward a computational framework for cognitive biology: unifying approaches from cognitive neuroscience and comparative cognition.

    PubMed

    Fitch, W Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology. Copyright © 2014. Published by Elsevier B.V.
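
    To illustrate the computational claim that supra-regular, tree-like structure requires some equivalent of a push-down stack, here is a hedged Python sketch that recognizes the classic supra-regular language a^n b^n with an explicit stack; it is a textbook illustration, not the author's model.

        def accepts_anbn(s: str) -> bool:
            # A finite-state scan cannot count unboundedly; the stack supplies
            # the memory needed to match each 'b' against an earlier 'a'.
            stack, i = [], 0
            while i < len(s) and s[i] == 'a':   # push one symbol per 'a'
                stack.append('a')
                i += 1
            while i < len(s) and s[i] == 'b':   # pop one symbol per 'b'
                if not stack:
                    return False
                stack.pop()
                i += 1
            return i == len(s) and not stack and len(s) > 0

        print(accepts_anbn("aaabbb"), accepts_anbn("aabbb"))  # True False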

  14. Crossing Over…Markov Meets Mendel

    PubMed Central

    Mneimneh, Saad

    2012-01-01

    Chromosomal crossover is a biological mechanism to combine parental traits. It is perhaps the first mechanism ever taught in any introductory biology class. The formulation of crossover, and resulting recombination, came about 100 years after Mendel's famous experiments. To a great extent, this formulation is consistent with the basic genetic findings of Mendel. More importantly, it provides a mathematical insight for his two laws (and corrects them). From a mathematical perspective, and while it retains similarities, genetic recombination guarantees diversity so that we do not rapidly converge to the same being. It is this diversity that made the study of biology possible. In particular, the problem of genetic mapping and linkage—one of the first efforts towards a computational approach to biology—relies heavily on the mathematical foundation of crossover and recombination. Nevertheless, as students we often overlook the mathematics of these phenomena. Emphasizing the mathematical aspect of Mendel's laws through crossover and recombination will prepare the students to make an early realization that biology, in addition to being experimental, IS a computational science. This can serve as a first step towards a broader curricular transformation in teaching biological sciences. I will show that a simple and modern treatment of Mendel's laws using a Markov chain will make this step possible, and it will only require basic college-level probability and calculus. My personal teaching experience confirms that students WANT to know Markov chains because they hear about them from bioinformaticists all the time. This entire exposition is based on three homework problems that I designed for a course in computational biology. A typical reader is, therefore, an instructional staff member or a student in a computational field (e.g., computer science, mathematics, statistics, computational biology, bioinformatics). However, other students may easily follow by omitting the mathematically more elaborate parts. I kept those as separate sections in the exposition. PMID:22629235
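
    As a concrete taste of the Markov-chain treatment advocated above, consider the standard two-locus recursion (a textbook result, not one of the author's three homework problems): under random mating with recombination fraction r, linkage disequilibrium D = x_AB * x_ab - x_Ab * x_aB decays geometrically, D_t = (1-r)^t D_0. A short Python sketch verifies the decay.

      # Two-locus haplotype frequencies under random mating with
      # recombination fraction r; D decays as D_t = (1-r)^t * D_0.
      def step(x, r):
          x_AB, x_Ab, x_aB, x_ab = x
          D = x_AB * x_ab - x_Ab * x_aB       # linkage disequilibrium
          return (x_AB - r * D, x_Ab + r * D, x_aB + r * D, x_ab - r * D)

      x, r = (0.5, 0.0, 0.0, 0.5), 0.2        # complete association, D0 = 0.25
      for t in range(6):
          D = x[0] * x[3] - x[1] * x[2]
          print(f"generation {t}: D = {D:.4f} (predicted {0.25 * (1 - r) ** t:.4f})")
          x = step(x, r)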

  15. Creating technical heritage object replicas in a virtual environment

    NASA Astrophysics Data System (ADS)

    Egorova, Olga; Shcherbinin, Dmitry

    2016-03-01

    The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects supports learning while promoting the preservation of the replicas for future generations. Modern approaches based on the wide usage of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.

  16. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts by researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for investigations of regional climate change. The platform combines a modern web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support the relevant analysis. It also supports joint software development by distributed research groups, as well as the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, preprocessing of modeling results, and visualization are also provided. All functions of the platform are accessible to a user through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction), and visualizing results. The platform provides users with capabilities for analyzing heterogeneous geophysical data, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic, and satellite monitoring datasets through a unified graphical web interface. Partial support from RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131, and 140, and the APN CBA2012-16NSY project is acknowledged.

  17. Cscibox: A Software System for Age-Model Construction and Evaluation

    NASA Astrophysics Data System (ADS)

    Bradley, E.; Anderson, K. A.; Marchitto, T. M., Jr.; de Vesine, L. R.; White, J. W. C.; Anderson, D. M.

    2014-12-01

    CSciBox is an integrated software system for the construction and evaluation of age models of paleo-environmental archives, both directly dated and cross dated. The time has come to encourage cross-pollination between earth science and computer science in dating paleorecords. This project addresses that need. The CSciBox code, which is being developed by a team of computer scientists and geoscientists, is open source and freely available on github. The system employs modern database technology to store paleoclimate proxy data and analysis results in an easily accessible and searchable form. This makes it possible to do analysis on the whole core at once, in an interactive fashion, or to tailor the analysis to a subset of the core without loading the entire data file. CSciBox provides a number of 'components' that perform the common steps in age-model construction and evaluation: calibrations, reservoir-age correction, interpolations, statistics, and so on. The user employs these components via a graphical user interface (GUI) to go from raw data to finished age model in a single tool: e.g., an IntCal09 calibration of 14C data from a marine sediment core, followed by a piecewise-linear interpolation. CSciBox's GUI supports plotting of any measurement in the core against any other measurement, or against any of the variables in the calculation of the age model, with or without explicit error representations. Using the GUI, CSciBox's user can import a new calibration curve or other background data set and define a new module that employs that information. Users can also incorporate other software (e.g., Calib, BACON) as 'plug-ins.' In the case of truly large data or significant computational effort, CSciBox is parallelizable across modern multicore processors, or clusters, or even the cloud. The next generation of the CSciBox code, currently in the testing stages, includes an automated reasoning engine that supports a more-thorough exploration of plausible age models and cross-dating scenarios.
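
    As a small illustration of the interpolation step mentioned above (with invented control points, not CSciBox code), piecewise-linear age-model construction amounts to interpolating sample depths between dated horizons:

      # Assign ages to sampled depths by piecewise-linear interpolation
      # between dated control points (depths/ages invented for illustration).
      import numpy as np

      control_depth = np.array([0.0, 50.0, 120.0, 200.0])     # cm
      control_age = np.array([0.0, 2100.0, 5600.0, 11200.0])  # cal yr BP

      sample_depth = np.linspace(0.0, 200.0, 9)
      sample_age = np.interp(sample_depth, control_depth, control_age)
      for d, a in zip(sample_depth, sample_age):
          print(f"depth {d:6.1f} cm -> age {a:8.1f} yr BP")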

  18. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach and up to a 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.
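
    The flavor of the 'bottom-up' approach, one hot kernel modernized at a time while the surrounding legacy code is left untouched, can be sketched in miniature. This is an analogy in Python/NumPy, not MFIX code, and the timing here has nothing to do with the 8.5x figure above:

      # 'Bottom-up' kernel modernization in miniature: replace an
      # element-at-a-time legacy loop with a vectorized kernel that
      # produces identical results, then verify behavior is preserved.
      import time
      import numpy as np

      def kernel_legacy(a, b, out):
          for i in range(len(a)):
              out[i] = 0.5 * (a[i] + b[i]) * a[i]

      def kernel_modern(a, b, out):
          np.multiply(0.5 * (a + b), a, out=out)

      n = 500_000
      a, b = np.random.rand(n), np.random.rand(n)
      out1, out2 = np.empty(n), np.empty(n)
      t0 = time.perf_counter(); kernel_legacy(a, b, out1)
      t1 = time.perf_counter(); kernel_modern(a, b, out2)
      t2 = time.perf_counter()
      assert np.allclose(out1, out2)          # same answers, only faster
      print(f"legacy {t1 - t0:.3f} s, modern {t2 - t1:.3f} s")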

  19. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach and up to a 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.

  20. Speculative Truth - Henry Cavendish, Natural Philosophy, and the Rise of Modern Theoretical Science

    NASA Astrophysics Data System (ADS)

    McCormmach, Russell

    2004-03-01

    With a never-before published paper by Lord Henry Cavendish, as well as a biography on him, this book offers a fascinating discourse on the rise of scientific attitudes and ways of knowing. A pioneering British physicist in the late 18th and early 19th centuries, Cavendish was widely considered to be the first full-time scientist in the modern sense. Through the lens of this unique thinker and writer, this book is about the birth of modern science.

  1. NDE in aerospace-requirements for science, sensors and sense.

    PubMed

    Heyman, J S

    1989-01-01

    The complexity of modern NDE (nondestructive evaluation) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.

  2. The evolution and future of minimalism in neurological surgery.

    PubMed

    Liu, Charles Y; Wang, Michael Y; Apuzzo, Michael L J

    2004-11-01

    The evolution of the field of neurological surgery has been marked by a progressive minimalism. This has been evident in the development of an entire arsenal of modern neurosurgical enterprises, including microneurosurgery, neuroendoscopy, stereotactic neurosurgery, endovascular techniques, radiosurgical systems, intraoperative and navigational devices, and in the last decade, cellular and molecular adjuvants. In addition to reviewing the major developments and paradigm shifts in the cyclic reinvention of the field as it currently stands, this paper attempts to identify forces and developments that are likely to fuel the irresistible escalation of minimalism into the future. These forces include discoveries in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.

  3. NDE in aerospace - Requirements for science, sensors and sense

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1989-01-01

    The complexity of modern nondestructive evaluation (NDE) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.

  4. Spitzer - Hot & Colorful Student Activities

    NASA Astrophysics Data System (ADS)

    McDonald, D.; Rebull, L. M.; DeWolf, C.; Guastella, P.; Johnson, C. H.; Schaefers, J.; Spuck, T.; McDonald, J. G., III; DeWolf, T.; Brock, S.; Boerma, J.; Bemis, G.; Paulsen, K.; Yueh, N.; Peter, A.; Wassmer, W.; Haber, R.; Scaramucci, A.; Butchart, J.; Holcomb, A.; Karns, B.; Kennedy, S.; Siegel, R.; Weiser, S.

    2009-01-01

    In this poster, we present the results of several activities developed for the general science student to explore infrared light. The first activity involved measuring infrared radiation using an updated version of Newton's experiment of splitting white light and finding IR radiation. The second used Leslie's cube to allow students to observe different radiators, while the third used a modern infrared thermometer to measure and identify IR sources in an enclosed box. The last activity involved students making false-color images from narrow-band filter images from data sets from Spitzer Space Telescope, STScI Digitized Sky Survey and other sources. Using computer programs like Adobe Photoshop and free software such as ds9, Spot and Leopard, poster-like images were created by the students. This research is funded by the Spitzer Science Center (SSC) and the National Optical Astronomy Observatory (NOAO). Please see our companion poster, Johnson et al., on the science aspect of this program, and another poster on the educational aspects, Guastella et al.

  5. Metaheuristic Optimization and its Applications in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, Xin-She

    2010-05-01

    A common but challenging task in modelling geophysical and geological processes is to handle massive data and to minimize certain objectives. This can essentially be considered as an optimization problem, and thus many new efficient metaheuristic optimization algorithms can be used. In this paper, we will introduce some modern metaheuristic optimization algorithms such as genetic algorithms, harmony search, firefly algorithm, particle swarm optimization and simulated annealing. We will also discuss how these algorithms can be applied to various applications in earth sciences, including nonlinear least-squares, support vector machine, Kriging, inverse finite element analysis, and data-mining. We will present a few examples to show how different problems can be reformulated as optimization. Finally, we will make some recommendations for choosing various algorithms to suit various problems. References 1) D. H. Wolpert and W. G. Macready, No free lunch theorems for optimization, IEEE Trans. Evolutionary Computation, Vol. 1, 67-82 (1997). 2) X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, (2008). 3) X. S. Yang, Mathematical Modelling for Earth Sciences, Dunedin Academic Press, (2008).
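
    Of the algorithms listed, simulated annealing is the simplest to state: always accept a downhill move, and accept an uphill move with probability exp(-df/T) under a falling temperature T. A minimal sketch follows (the objective and cooling schedule are invented for illustration):

      # Simulated annealing on a 1-D multimodal test objective.
      import math
      import random

      def objective(x):
          return x * x + 10.0 * math.sin(3.0 * x)

      def anneal(x0, T=10.0, cooling=0.95, steps=2000):
          x, fx = x0, objective(x0)
          best, fbest = x, fx
          for _ in range(steps):
              cand = x + random.gauss(0.0, 1.0)        # random neighbor
              fc = objective(cand)
              # downhill moves always accepted; uphill with prob exp(-df/T)
              if fc < fx or random.random() < math.exp((fx - fc) / T):
                  x, fx = cand, fc
                  if fx < fbest:
                      best, fbest = x, fx
              T *= cooling                             # geometric cooling
          return best, fbest

      random.seed(1)
      print(anneal(8.0))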

  6. Investigating the Purpose of Trigonometry in the Modern Sciences

    ERIC Educational Resources Information Center

    Hertel, Joshua T.

    2013-01-01

    This dissertation reports the results of a qualitative research project that aimed to develop a research-based perspective on the purpose of trigonometry in the modern sciences. The investigation was guided by three objectives. First, the study sought to identify the purpose of trigonometry as described by educators and high school textbooks.…

  7. The Fateful Rift: The San Andreas Fault in the Modern Mind.

    ERIC Educational Resources Information Center

    Percy, Walker

    1990-01-01

    Claims that modern science is radically incoherent and that this incoherence lies within the practice of science. Details the work of the scientist and philosopher Charles Sanders Pierce, expounding on the difference between Rene Descartes' dualistic philosophy and Pierce's triadic view. Concludes with a brief description of the human existence.…

  8. 77 FR 75885 - Control of Communicable Diseases: Foreign; Scope and Definitions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ... primary authority supporting this rulemaking is section 361 of the Public Health Service Act (42 U.S.C... the scope and definitions to part 71 to reflect modern science and current practices. HHS/CDC has... products'' in subpart F. This revision more adequately reflects modern science and current practice which...

  9. Krakatoa Erupts!: Using a Historic Cataclysm to Teach Modern Science

    ERIC Educational Resources Information Center

    Clary, Renee; Wandersee, James

    2011-01-01

    Through integration of geology, biology, chemistry, and the history of science, the historic Krakatoa eruption offers a unique portal for student inquiry in the classroom. Students are inherently fascinated by natural disasters, and modern comparisons to the Krakatoa cataclysm are as close as the day's news. This article uses the historic Krakatoa…

  10. Mathematical biology modules based on modern molecular biology and modern discrete mathematics.

    PubMed

    Robeva, Raina; Davies, Robin; Hodge, Terrell; Enyedi, Alexander

    2010-01-01

    We describe an ongoing collaborative curriculum materials development project between Sweet Briar College and Western Michigan University, with support from the National Science Foundation. We present a collection of modules under development that can be used in existing mathematics and biology courses, and we address a critical national need to introduce students to mathematical methods beyond the interface of biology with calculus. Based on ongoing research, and designed to use the project-based-learning approach, the modules highlight applications of modern discrete mathematics and algebraic statistics to pressing problems in molecular biology. For the majority of projects, calculus is not a required prerequisite and, due to the modest amount of mathematical background needed for some of the modules, the materials can be used for an early introduction to mathematical modeling. At the same time, most modules are connected with topics in linear and abstract algebra, algebraic geometry, and probability, and they can be used as meaningful applied introductions into the relevant advanced-level mathematics courses. Open-source software is used to facilitate the relevant computations. As a detailed example, we outline a module that focuses on Boolean models of the lac operon network.

  11. Mathematical Biology Modules Based on Modern Molecular Biology and Modern Discrete Mathematics

    PubMed Central

    Davies, Robin; Hodge, Terrell; Enyedi, Alexander

    2010-01-01

    We describe an ongoing collaborative curriculum materials development project between Sweet Briar College and Western Michigan University, with support from the National Science Foundation. We present a collection of modules under development that can be used in existing mathematics and biology courses, and we address a critical national need to introduce students to mathematical methods beyond the interface of biology with calculus. Based on ongoing research, and designed to use the project-based-learning approach, the modules highlight applications of modern discrete mathematics and algebraic statistics to pressing problems in molecular biology. For the majority of projects, calculus is not a required prerequisite and, due to the modest amount of mathematical background needed for some of the modules, the materials can be used for an early introduction to mathematical modeling. At the same time, most modules are connected with topics in linear and abstract algebra, algebraic geometry, and probability, and they can be used as meaningful applied introductions into the relevant advanced-level mathematics courses. Open-source software is used to facilitate the relevant computations. As a detailed example, we outline a module that focuses on Boolean models of the lac operon network. PMID:20810955

  12. Dire necessity and transformation: entry-points for modern science in Islamic bioethical assessment of porcine products in vaccines.

    PubMed

    Padela, Aasim I; Furber, Steven W; Kholwadia, Mohammad A; Moosa, Ebrahim

    2014-02-01

    The field of medicine provides an important window through which to examine the encounters between religion and science, and between modernity and tradition. While both religion and science consider health to be a 'good' that is to be preserved, and promoted, religious and science-based teachings may differ in their conception of what constitutes good health, and how that health is to be achieved. This paper analyzes the way the Islamic ethico-legal tradition assesses the permissibility of using vaccines that contain porcine-derived components by referencing opinions of several Islamic authorities. In the Islamic ethico-legal tradition controversy surrounds the use of proteins from an animal (pig) that is considered to be impure by Islamic law. As we discuss the Islamic ethico-legal constructs used to argue for or against the use of porcine-based vaccines we will call attention to areas where modern medical data may make the arguments more precise. By highlighting areas where science can buttress and clarify the ethico-legal arguments we hope to spur an enhanced applied Islamic bioethics discourse where religious scholars and medical experts use modern science in a way that remains faithful to the epistemology of Islamic ethics to clarify what Islam requires of Muslim patients and healthcare workers. © 2013 John Wiley & Sons Ltd.

  13. The need for data standards in zoomorphology.

    PubMed

    Vogt, Lars; Nickel, Michael; Jenner, Ronald A; Deans, Andrew R

    2013-07-01

    eScience is a new approach to research that focuses on data mining and exploration rather than data generation or simulation. This new approach is arguably a driving force for scientific progress and requires data to be openly available, easily accessible via the Internet, and compatible with each other. eScience relies on modern standards for the reporting and documentation of data and metadata. Here, we suggest necessary components (i.e., content, concept, nomenclature, format) of such standards in the context of zoomorphology. We document the need for using data repositories to prevent data loss and how publication practice is currently changing, with the emergence of dynamic publications and the publication of digital datasets. Subsequently, we demonstrate that in zoomorphology the scientific record is still limited to published literature and that zoomorphological data are usually not accessible through data repositories. The underlying problem is that zoomorphology lacks the standards for data and metadata. As a consequence, zoomorphology cannot participate in eScience. We argue that the standardization of morphological data requires i) a standardized framework for terminologies for anatomy and ii) a formalized method of description that allows computer-parsable morphological data to be communicable, compatible, and comparable. The role of controlled vocabularies (e.g., ontologies) for developing respective terminologies and methods of description is discussed, especially in the context of data annotation and semantic enhancement of publications. Finally, we introduce the International Consortium for Zoomorphology Standards, a working group that is open to everyone and whose aim is to stimulate and synthesize dialog about standards. It is the Consortium's ultimate goal to assist the zoomorphology community in developing modern data and metadata standards, including anatomy ontologies, thereby facilitating the participation of zoomorphology in eScience. Copyright © 2013 Wiley Periodicals, Inc.
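
    The kind of computer-parsable description the authors call for can be pictured as ontology-annotated statements, e.g., subject-predicate-object triples. The toy sketch below uses invented term IDs, not an actual anatomy ontology, and shows why such data become comparable across studies:

      # A morphological statement as machine-readable triples keyed to
      # controlled-vocabulary IDs (all IDs invented for illustration).
      triples = [
          ("specimen:042", "instance_of", "TAXON:Tethya_wilhelma"),
          ("part:spicule_17", "part_of", "specimen:042"),
          ("part:spicule_17", "has_type", "ANAT:monaxon_spicule"),
          ("part:spicule_17", "has_length_um", 312.5),
      ]
      # Because predicates and classes are controlled, queries are trivial:
      spicules = [s for (s, p, o) in triples
                  if p == "has_type" and "spicule" in str(o)]
      print(spicules)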

  14. Integrated environmental modeling: a vision and roadmap for the future

    USGS Publications Warehouse

    Laniak, Gerard F.; Olchin, Gabriel; Goodall, Jonathan; Voinov, Alexey; Hill, Mary; Glynn, Pierre; Whelan, Gene; Geller, Gary; Quinn, Nigel; Blind, Michiel; Peckham, Scott; Reaney, Sim; Gaber, Noha; Kennedy, Philip R.; Hughes, Andrew

    2013-01-01

    Integrated environmental modeling (IEM) is inspired by modern environmental problems, decisions, and policies and enabled by transdisciplinary science and computer capabilities that allow the environment to be considered in a holistic way. The problems are characterized by the extent of the environmental system involved, the dynamic and interdependent nature of stressors and their impacts, the diversity of stakeholders, and the integration of social, economic, and environmental considerations. IEM provides a science-based structure to develop and organize relevant knowledge and information and apply it to explain, explore, and predict the behavior of environmental systems in response to human and natural sources of stress. During the past several years a number of workshops were held that brought IEM practitioners together to share experiences and discuss future needs and directions. In this paper we organize and present the results of these discussions. IEM is presented as a landscape containing four interdependent elements: applications, science, technology, and community. The elements are described from the perspective of their role in the landscape, current practices, and challenges that must be addressed. Workshop participants envision a global-scale IEM community that leverages modern technologies to streamline the movement of science-based knowledge from its sources in research, through its organization into databases and models, to its integration and application for problem-solving purposes. Achieving this vision will require that the global community of IEM stakeholders transcend social and organizational boundaries and pursue greater levels of collaboration. Among the highest priorities for community action are the development of standards for publishing IEM data and models in forms suitable for automated discovery, access, and integration; education of the next generation of environmental stakeholders, with a focus on transdisciplinary research, development, and decision making; and providing a web-based platform for community interactions (e.g., continuous virtual workshops).

  15. Gender-related beliefs of Turkish female science teachers and their effect on interactions with female and male students

    NASA Astrophysics Data System (ADS)

    Uysal, Sibel

    The purpose of this study is to examine the relationship between Turkish female science teachers' gender-related beliefs and those teachers' corresponding interactions with their male and female students. The data was collected from five different sources: surveys, interviews, observations, chi-square data from the observation phase, and interviews with selected teachers. The data was analyzed using the Ericson interpretive method of socio-cultural theories, which provided a framework for understanding the development of teacher beliefs and their interactions with their students. In this study, the survey revealed three types of teachers, ranging from traditional to moderate to modern. Moderate teachers exhibited characteristics that were on a continuum between the traditional and modern teachers. Traditional teachers believed that males and females should have certain defined roles. Females should be responsible for taking care of the needs of their children and their husbands. By comparison, modern teachers did not assign specific roles to either males or females. With regard to the role of women in science, traditional teachers believed that female scientists could not be as successful as male scientists. By comparison, modern teachers believed that female scientists could be as successful as male scientists. Modern teachers did indicate that they thought females needed to work harder than males to prove themselves. When it came to the teachers' views and beliefs regarding their female and male students' success in their science classrooms, traditional teachers believed that their male students were brighter than their female students. They also believed that female students excelled only because they worked harder. Modern teachers believed that success is dependent on each student's background and his or her interest in science. Classroom observation indicated that traditional and modern teachers interacted differently with their male and female students. Traditional teachers provided more speaking time to male students and permitted male students to ask more questions than their female students. Modern teachers, on the other hand, paid equal attention to all their students. Both groups' belief systems were apparent and impacted their interactions with their students.

  16. Mass Media Decision in China's Post-Mao Zedong Modernization Program: Some Unanticipated Consequences.

    ERIC Educational Resources Information Center

    Koo, Charles M.

    In 1978, China launched its "Four Modernizations" program, which included modernization in agriculture, industry, national defense, and science and technology. To promote this program and to mobilize the Chinese masses to take a more positive and active attitude toward modernization, the government called upon the forces of the mass…

  17. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Chinese physicists educated in Germany and America: Their scientific contributions and their impact on China's higher education

    NASA Astrophysics Data System (ADS)

    Qu, Jing Cheng

    1998-11-01

    This dissertation records the historical paths of Chinese physicists educated in Germany and America, explores their representative achievements in modern physics that have not been recognized by Chinese scholars, and provides sociological analyses of their contributions to China's higher education. We have found that Chinese students of physics in Germany and America were not passive recipients of Western science, but active contributors. They were also crucial contributors to science education and important scientific projects upon their return to China. Chapter One briefly describes physics knowledge in ancient China and introduces the transplantation of modern science and technology to China. Three distinct historical periods have been identified. In Chapter Two and Chapter Three, 30 Chinese physicists educated in Germany and 89 in America have been investigated. This research analyzes the significant achievements of these physicists. It also examines the political changes, the social background, and other factors impacting on their studies in the two countries. The selected cases in the two chapters are Li Fo-ki, Chinese physics students in Berlin, Werner Heisenberg and his Chinese students, Max Born and his Chinese students, Robert Millikan and Chinese physicists, the first two Chinese physicists from Harvard, and the Science Society of China. Chapter Four explores the geographical distribution, education and careers, return and expatriation, and the social influence exerted by these Chinese physicists. Statistical compilation and quantitative analyses comprise the basic methodology. In terms of two periods and two generations, this dissertation explores the physicists' contributions to the development of modern science in China and to education in China. Significant cases from Beijing University, Qinghua University, and Yanjing University are analyzed. The last chapter, Chapter Five, concludes that some of the achievements of these Chinese physicists were critical steps in modern physics even though China remained domestically rather weak in the development of modern science. Returning to China, most of them became pioneers and active contributors to modern science and to higher education in China. They comprised the majority of the physics community of China and played a leading role in the formation of modern science in China. After 1949, China continued to benefit from the contributions of these physicists. China independently constructed an atomic bomb in 1964 and a hydrogen bomb in 1967. In 1970, China successfully launched a man-made satellite. The Chinese physicists trained in Western countries constituted the main research force behind these projects.

  19. Cerebral localization in the nineteenth century--the birth of a science and its modern consequences.

    PubMed

    Steinberg, David A

    2009-07-01

    Although many individuals contributed to the development of the science of cerebral localization, its conceptual framework is the work of a single man--John Hughlings Jackson (1835-1911), a Victorian physician practicing in London. Hughlings Jackson's formulation of a neurological science consisted of an axiomatic basis, an experimental methodology, and a clinical neurophysiology. His axiom--that the brain is an exclusively sensorimotor machine--separated neurology from psychiatry and established a rigorous and sophisticated structure for the brain and mind. Hughlings Jackson's experimental method utilized the focal lesion as a probe of brain function and created an evolutionary structure of somatotopic representation to explain clinical neurophysiology. His scientific theory of cerebral localization can be described as a weighted ordinal representation. Hughlings Jackson's theory of weighted ordinal representation forms the scientific basis for modern neurology. Though this science is utilized daily by every neurologist and forms the basis of neuroscience, the consequences of Hughlings Jackson's ideas are still not generally appreciated. For example, they imply the intrinsic inconsistency of some modern fields of neuroscience and neurology. Thus, "cognitive imaging" and the "neurology of art"--two topics of modern interest--are fundamentally oxymoronic according to the science of cerebral localization. Neuroscientists, therefore, still have much to learn from John Hughlings Jackson.

  20. Teaching Einsteinian Physics at Schools: Part 1, Models and Analogies for Relativity

    ERIC Educational Resources Information Center

    Kaur, Tejinder; Blair, David; Moschilla, John; Stannard, Warren; Zadnik, Marjan

    2017-01-01

    The Einstein-First project aims to change the paradigm of school science teaching through the introduction of modern Einsteinian concepts of space and time, gravity and quanta at an early age. These concepts are rarely taught to school students despite their central importance to modern science and technology. The key to implementing the…

  1. Probing Scientists' Beliefs: How Open-Minded Are Modern Scientists?

    ERIC Educational Resources Information Center

    Coll, Richard; Taylor, Neil

    2004-01-01

    Just how open-minded are modern scientists? In this paper we examine this question for the science faculty from New Zealand and UK universities. The Exeter questionnaire used by Preece and Baxter (2000) to examine superstitious beliefs of high school students and preservice science teachers was used as a basis for a series of in-depth interviews…

  2. Teaching Earth Signals Analysis Using the Java-DSP Earth Systems Edition: Modern and Past Climate Change

    ERIC Educational Resources Information Center

    Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.

    2014-01-01

    Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…

  3. Courses in Modern Physics for Non-science Majors, Future Science Teachers, and Biology Students

    NASA Astrophysics Data System (ADS)

    Zollman, Dean

    2001-03-01

    For the past 15 years Kansas State University has offered a course in modern physics for students who are not majoring in physics. This course carries a prerequisite of one physics course so that the students have a basic introduction to classical topics. The majors of students range from liberal arts to engineering. Future secondary science teachers whose first area of teaching is not physics can use the course as part of their study of science. The course has evolved from a lecture format to one which is highly interactive and uses a combination of hands-on activities, tutorials and visualizations, particularly the Visual Quantum Mechanics materials. Another course encourages biology students to continue their physics learning beyond the introductory course. Modern Miracle Medical Machines introduces the basic physics underlying diagnostic techniques such as MRI and PET, as well as laser surgical techniques. Additional information is available at http://www.phys.ksu.edu/perg/

  4. Artificial Intelligence in Medical Practice: The Question to the Answer?

    PubMed

    Miller, D Douglas; Brown, Eric W

    2018-02-01

    Computer science advances and ultra-fast computing speeds find artificial intelligence (AI) broadly benefitting modern society: forecasting weather, recognizing faces, detecting fraud, and deciphering genomics. AI's future role in medical practice remains an unanswered question. Machines (computers) learn to detect patterns not decipherable using biostatistics by processing massive datasets (big data) through layered mathematical models (algorithms). Correcting algorithm mistakes (training) adds to AI predictive model confidence. AI is being successfully applied for image analysis in radiology, pathology, and dermatology, with diagnostic speed exceeding, and accuracy paralleling, medical experts. While diagnostic confidence never reaches 100%, combining machines plus physicians reliably enhances system performance. Cognitive programs are impacting medical practice by applying natural language processing to read the rapidly expanding scientific literature and collate years of diverse electronic medical records. In this and other ways, AI may optimize the care trajectory of chronic disease patients, suggest precision therapies for complex illnesses, reduce medical errors, and improve subject enrollment into clinical trials. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Centre for Research Infrastructure of Polish GNSS Data - response and possible contribution to EPOS

    NASA Astrophysics Data System (ADS)

    Araszkiewicz, Andrzej; Rohm, Witold; Bosy, Jaroslaw; Szolucha, Marcin; Kaplon, Jan; Kroszczynski, Krzysztof

    2017-04-01

    In the frame of the first call under Action 4.2: Development of modern research infrastructure of the science sector in the Smart Growth Operational Programme 2014-2020, the "EPOS-PL" project was launched in late 2016. The following institutes are responsible for the implementation of this project: Institute of Geophysics, Polish Academy of Sciences (Project Leader); Academic Computer Centre Cyfronet AGH University of Science and Technology; Central Mining Institute; the Institute of Geodesy and Cartography; Wrocław University of Environmental and Life Sciences; and the Military University of Technology. In addition, resources constituting the entrepreneur's own contribution will come from the Polish Mining Group. The EPOS-PL Research Infrastructure will integrate both existing and newly built National Research Infrastructures (Theme Centres for Research Infrastructures), which, under the premise of the EPOS programme, are financed exclusively from national funds. In addition, an e-science platform will be developed. The Centre for Research Infrastructure of GNSS Data (CIBDG - Task 5) will be built on the experience and facilities of two institutions: the Military University of Technology and the Wrocław University of Environmental and Life Sciences. The project includes the construction of the National GNSS Repository with data QC procedures and the adaptation of two Regional GNSS Analysis Centres for rapid and long-term geodynamical monitoring.

  6. [Franz Joseph Gall and his "talking skulls" established the basis of modern brain sciences].

    PubMed

    Wolfgang, Regal; Michael, Nanut

    2008-01-01

    The anatomist and brain scientist Franz Joseph Gall (1758-1828) developed "phrenology" in the early 19th century. At the time, his new teachings were seen more as a passing fashion than as science and were discredited. Only about a hundred years ago was it realised that phrenology had established the basis of the modern brain sciences. In any case, Gall was the first to link defined regions of the cerebral cortex with distinct cognitive functions.

  7. Inference for the physical sciences

    PubMed Central

    Jones, Nick S.; Maccarone, Thomas J.

    2013-01-01

    There is a disconnect between developments in modern data analysis and some parts of the physical sciences in which they could find ready use. This introduction, and this issue, provides resources to help experimental researchers access modern data analysis tools and exposure for analysts to extant challenges in physical science. We include a table of resources connecting statistical and physical disciplines and point to appropriate books, journals, videos and articles. We conclude by highlighting the relevance of each of the articles in the associated issue. PMID:23277613

  8. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity

    PubMed Central

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to become part of the tool set for studying molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady-state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297

  9. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    PubMed

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to become part of the tool set for studying molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady-state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
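
    The Gillespie algorithm mentioned above draws exact trajectories of the CME by sampling an exponential waiting time from the total propensity and then choosing a reaction in proportion to its share. A minimal sketch for a birth-death system follows (rates invented for illustration; not the authors' code):

      # Gillespie simulation of synthesis (rate k1) and degradation (rate k2*n);
      # the CME steady-state mean copy number is k1/k2.
      import random

      def gillespie(n=0, k1=10.0, k2=0.5, t_end=20.0):
          t, traj = 0.0, [(0.0, n)]
          while t < t_end:
              a1, a2 = k1, k2 * n              # reaction propensities
              t += random.expovariate(a1 + a2) # exponential waiting time
              if random.random() < a1 / (a1 + a2):
                  n += 1                       # synthesis event
              else:
                  n -= 1                       # degradation event
              traj.append((t, n))
          return traj

      random.seed(42)
      print("final copy number:", gillespie()[-1][1], "(mean = k1/k2 = 20)")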

  10. Trading secrets: Jews and the early modern quest for clandestine knowledge.

    PubMed

    Jütte, Daniel

    2012-12-01

    This essay explores the significance and function of secrecy and secret sciences in Jewish-Christian relations and in Jewish culture in the early modern period. It shows how the trade in clandestine knowledge and the practice of secret sciences became a complex, sometimes hazardous space for contact between Jews and Christians. By examining this trade, the essay clarifies the role of secrecy in the early modern marketplace of knowledge. The attribution of secretiveness to Jews was a widespread topos in early modern European thought. However, relatively little is known about the implications of such beliefs in science or in daily life. The essay pays special attention to the fact that trade in secret knowledge frequently offered Jews a path to the center of power, especially at court. Furthermore, it becomes clear that the practice of secret sciences, the trade in clandestine knowledge, and a mercantile agenda were often inextricably interwoven. Special attention is paid to the Italian-Jewish alchemist, engineer, and entrepreneur Abramo Colorni (ca. 1544-1599), whose career illustrates the opportunities provided by the marketplace of secrets at that time. Much scholarly (and less scholarly) attention has been devoted to whether and what Jews "contributed" to what is commonly called the "Scientific Revolution." This essay argues that the question is misdirected and that, instead, we should pay more attention to the distinctive opportunities offered by the early modern economy of secrecy.

  11. Nuclear and Particle Physics Simulations: The Consortium of Upper-Level Physics Software

    NASA Astrophysics Data System (ADS)

    Bigelow, Roberta; Moloney, Michael J.; Philpott, John; Rothberg, Joseph

    1995-06-01

    The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Wave and Optics.

  12. Feeding People's Curiosity: Leveraging the Cloud for Automatic Dissemination of Mars Images

    NASA Technical Reports Server (NTRS)

    Knight, David; Powell, Mark

    2013-01-01

    Smartphones and tablets have made wireless computing ubiquitous, and users expect instant, on-demand access to information. The Mars Science Laboratory (MSL) operations software suite, MSL InterfaCE (MSLICE), employs a different back-end image processing architecture compared to that of the Mars Exploration Rovers (MER) in order to better satisfy modern consumer-driven usage patterns and to offer greater server-side flexibility. Cloud services are a centerpiece of the server-side architecture that allows new image data to be delivered automatically to both scientists using MSLICE and the general public through the MSL website (http://mars.jpl.nasa.gov/msl/).

  13. Electricity in the treatment of nervous system disease.

    PubMed

    Fodstad, H; Hariz, M

    2007-01-01

    Electricity has been used in medicine for almost two millennia, beginning with electrical shocks from the torpedo fish and ending with the implantation of neuromodulators and neuroprostheses. These implantable stimulators aim to improve functional independence and quality of life in various groups of disabled people. New indications for neuromodulation are still evolving and the field is rapidly advancing. Thanks to modern science and computer technology, electrotherapy has reached a degree of sophistication where it can be applied relatively safely and effectively in a variety of nervous system diseases, including pain, movement disorders, epilepsy, Tourette syndrome, psychiatric disease, addiction, coma, urinary incontinence, impotence, infertility, respiratory paralysis, tinnitus and blindness.

  14. A hybrid Gerchberg-Saxton-like algorithm for DOE and CGH calculation

    NASA Astrophysics Data System (ADS)

    Wang, Haichao; Yue, Weirui; Song, Qiang; Liu, Jingdan; Situ, Guohai

    2017-02-01

    The Gerchberg-Saxton (GS) algorithm is widely used in various disciplines of modern science and technology where phase retrieval is required. However, this legendary algorithm most likely stagnates after a few iterations. Many efforts have been made to improve this situation. Here we propose to introduce the strategy of gradient descent and a weighting technique into the GS algorithm, and demonstrate it using two examples: design of a diffractive optical element (DOE) to achieve off-axis illumination in lithographic tools, and design of a computer-generated hologram (CGH) for holographic display. Both numerical simulation and optical experiments are carried out for demonstration.
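
    Plain GS alternates between the two planes, keeping the computed phase while re-imposing the known amplitude in each; the paper's contribution adds gradient descent and weighting on top of this loop. Below is a minimal sketch of the unmodified baseline (synthetic amplitudes, not the authors' code):

      # Baseline Gerchberg-Saxton iteration between source and Fourier planes.
      import numpy as np

      def gerchberg_saxton(src_amp, tgt_amp, iters=100):
          rng = np.random.default_rng(0)
          field = src_amp * np.exp(1j * rng.uniform(0, 2 * np.pi, src_amp.shape))
          for _ in range(iters):
              far = np.fft.fft2(field)
              far = tgt_amp * np.exp(1j * np.angle(far))     # impose target amplitude
              field = np.fft.ifft2(far)
              field = src_amp * np.exp(1j * np.angle(field)) # impose source amplitude
          return np.angle(field)                             # recovered DOE/CGH phase

      n = 64
      src = np.ones((n, n))                                  # uniform illumination
      tgt = np.zeros((n, n)); tgt[16:48, 16:48] = 1.0        # desired far-field window
      tgt *= np.sqrt(src.size * (src ** 2).sum() / (tgt ** 2).sum())  # match energy
      phase = gerchberg_saxton(src, tgt)
      err = np.abs(np.abs(np.fft.fft2(src * np.exp(1j * phase))) - tgt).mean()
      print(f"mean far-field amplitude error: {err:.3f}")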

  15. Accelerator Based Tools of Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Seestrom, Susan

    2017-01-01

    The Manhattan Project had to solve difficult challenges in physics and materials science. During the Cold War a large nuclear stockpile was developed. In both cases, the approach was largely empirical. Today that stockpile must be certified without nuclear testing, a task that becomes more difficult as the stockpile ages. I will discuss the role of modern accelerator-based experiments, such as x-ray radiography, proton radiography, and neutron and nuclear physics experiments, in stockpile stewardship. These new tools provide data of exceptional sensitivity and are answering questions about the stockpile, improving our scientific understanding, and providing validation for the computer simulations that are relied upon to certify today's stockpile.

  16. Foundations of anticipatory logic in biology and physics.

    PubMed

    Bettinger, Jesse S; Eastman, Timothy E

    2017-12-01

    Recent advances in modern physics and biology reveal several scenarios in which top-down effects (Ellis, 2016) and anticipatory systems (Rosen, 1980) indicate processes at work enabling active modeling and inference such that anticipated effects project onto potential causes. We extrapolate a broad landscape of anticipatory systems in the natural sciences extending to computational neuroscience of perception in the capacity of Bayesian inferential models of predictive processing. This line of reasoning also comes with philosophical foundations, which we develop in terms of counterfactual reasoning and possibility space, Whitehead's process thought, and correlations with Eastern wisdom traditions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Biological Principles and Threshold Concepts for Understanding Natural Selection: Implications for Developing Visualizations as a Pedagogic Tool

    ERIC Educational Resources Information Center

    Tibell, Lena A. E.; Harms, Ute

    2017-01-01

    Modern evolutionary theory is both a central theory and an integrative framework of the life sciences. This is reflected in the common references to evolution in modern science education curricula and contexts. In fact, evolution is a core idea that is supposed to support biology learning by facilitating the organization of relevant knowledge. In…

  18. How to Use Pragmatism Pragmatically? Suggestions for the Twenty-First Century

    ERIC Educational Resources Information Center

    Biesta, Gert J. J.

    2009-01-01

    The purpose of this paper is to indicate how one should understand John Dewey's attention to and appreciation for the methods and views of modern science. Against the idea that Dewey is a believer in the methods and views of modern science--which would make his philosophy into a form of positivism or scientism--the author argues that Dewey's…

  19. Analysis of context dependence in social interaction networks of a massively multiplayer online role-playing game.

    PubMed

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in designing more realistic future virtual arenas as well as in providing novel insights to the science of collective human behavior.
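
    As an illustration of the kind of per-context comparison such a study performs, the sketch below builds two toy interaction graphs and compares simple network-science metrics with networkx. The edge lists are made up for illustration; they are not the game's actual logs.

    ```python
    import networkx as nx

    # Two hypothetical interaction contexts, e.g. trade vs. combat edges.
    trade = nx.Graph([(1, 2), (2, 3), (3, 1), (3, 4)])
    combat = nx.Graph([(1, 4), (4, 5), (5, 6)])

    # Context-dependent structure shows up even in simple summary statistics.
    for name, g in [("trade", trade), ("combat", combat)]:
        print(name,
              "clustering:", round(nx.average_clustering(g), 3),
              "density:", round(nx.density(g), 3))
    ```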

  20. Big Biomedical data as the key resource for discovery science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  1. Big biomedical data as the key resource for discovery science

    PubMed Central

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-01-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s. PMID:26198305

  2. Learning about light and optics in on-line general education classes using at-home experimentation

    NASA Astrophysics Data System (ADS)

    Millspaw, Jacob; Wang, Gang; Masters, Mark F.

    2014-07-01

    College students are facing a constantly evolving educational system. Some still see mostly traditional face-to-face lecture classes, whereas others may never set foot on campus thanks to distance-learning programs. In between, they may enroll in a mix of face-to-face courses, two-way broadcast interactive courses, streaming lecture courses, hybrid face-to-face/on-line courses and the ominous MOOC! A large number of these non-traditional courses are general education courses and play an important role in developing non-science majors' understanding of science in general, and of physics in particular. We have been keeping pace with these modern modes of instruction by offering several on-line courses such as Physics for Computer Graphics and Animation and Light and Color. These courses cover basic concepts in light, color and optics.

  3. CSP: A Multifaceted Hybrid Architecture for Space Computing

    NASA Technical Reports Server (NTRS)

    Rudolph, Dylan; Wilson, Christopher; Stewart, Jacob; Gauvin, Patrick; George, Alan; Lam, Herman; Crum, Gary Alex; Wirthlin, Mike; Wilson, Alex; Stoddard, Aaron

    2014-01-01

    Research on the CHREC Space Processor (CSP) takes a multifaceted hybrid approach to embedded space computing. Working closely with the NASA Goddard SpaceCube team, researchers at the National Science Foundation (NSF) Center for High-Performance Reconfigurable Computing (CHREC) at the University of Florida and Brigham Young University are developing hybrid space computers that feature an innovative combination of three technologies: commercial-off-the-shelf (COTS) devices, radiation-hardened (RadHard) devices, and fault-tolerant computing. Modern COTS processors provide the utmost in performance and energy-efficiency but are susceptible to ionizing radiation in space, whereas RadHard processors are virtually immune to this radiation but are more expensive, larger, less energy-efficient, and generations behind in speed and functionality. By featuring COTS devices to perform the critical data processing, supported by simpler RadHard devices that monitor and manage the COTS devices, and augmented with novel uses of fault-tolerant hardware, software, information, and networking within and between COTS devices, the resulting system can maximize performance and reliability while minimizing energy consumption and cost. NASA Goddard has adopted the CSP concept and technology with plans underway to feature flight-ready CSP boards on two upcoming space missions.
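
    A minimal sketch of the supervision pattern described, in which a simple RadHard-side monitor watches heartbeats from a COTS node and power-cycles it after an upset, is shown below. All names, timings and the simulated hang are hypothetical illustrations, not the actual CSP flight software.

    ```python
    import time

    HEARTBEAT_TIMEOUT_S = 5.0   # hypothetical limit before the monitor intervenes

    class CotsNode:
        """Stand-in for the COTS compute node; in flight this is real hardware."""
        def __init__(self):
            self.last_heartbeat = time.monotonic()

        def heartbeat_age(self):
            return time.monotonic() - self.last_heartbeat

        def power_cycle(self):
            print("monitor: COTS node unresponsive (possible upset) -- power cycling")
            self.last_heartbeat = time.monotonic()

    node = CotsNode()
    node.last_heartbeat -= 10.0      # simulate a hang caused by a radiation upset
    for _ in range(3):               # simplified RadHard-side supervision loop
        if node.heartbeat_age() > HEARTBEAT_TIMEOUT_S:
            node.power_cycle()
        time.sleep(1.0)
    ```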

  4. Methodological approach to crime scene investigation: the dangers of technology

    NASA Astrophysics Data System (ADS)

    Barnett, Peter D.

    1997-02-01

    The visitor to any modern forensic science laboratory is confronted with equipment and processes that did not exist even 10 years ago: thermocyclers that allow genetic typing of nanogram amounts of DNA isolated from a few spermatozoa; scanning electron microscopes that can nearly automatically detect submicrometer-sized particles of molten lead, barium and antimony produced by the discharge of a firearm and deposited on the hands of the shooter; and computers that can compare an image of a latent fingerprint with millions of fingerprints stored in computer memory. Analysis of populations of physical evidence has permitted statistically minded forensic scientists to use Bayesian inference to draw conclusions based on a priori assumptions that are often poorly understood, irrelevant, or misleading. National commissions that are studying quality control in DNA analysis propose that people with barely relevant graduate degrees and little forensic science experience be placed in charge of forensic DNA laboratories. It is undeniable that high-tech methods have reversed some miscarriages of justice by establishing the innocence of a number of people who were imprisoned for years for crimes that they did not commit. However, this paper deals with the dangers of technology in criminal investigations.
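
    The warning about poorly understood priors can be made concrete with Bayes' rule in odds form: the same forensic likelihood ratio yields radically different posteriors depending on the assumed prior odds. A toy sketch, with purely illustrative numbers:

    ```python
    # Posterior odds = likelihood ratio x prior odds (Bayes' rule in odds form).
    likelihood_ratio = 10_000  # e.g. a match "10,000 times more likely if guilty"

    for prior_odds in (1.0, 1e-6):  # a 50:50 prior vs. a one-in-a-million prior
        posterior_odds = likelihood_ratio * prior_odds
        prob = posterior_odds / (1 + posterior_odds)
        print(f"prior odds {prior_odds:g} -> posterior probability {prob:.4f}")
    ```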

  5. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  6. Scientific Tourism Centres in Armenia

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Farmanyan, S. V.; Mikayelyan, G. A.; Mikayelyan, A. A.

    2016-12-01

    Armenia is rich in scientific sites, among which archaeological sites of scientific interest, modern scientific institutions and science-related museums can be mentioned. Examples of archaeological sites are ancient observatories and petroglyphs of astronomical nature, as well as intangible heritage such as Armenian calendars. Modern institutions with tools or laboratories that can be presented to tourists are considered scientific tourism sites. Science-related museums include the Museum of Science and Technology, the Space Museum, the Geological Museum and others. Despite the fact that scientific tourism is a new field, it has great prospects, and Armenia has great potential in this area. It is very important to introduce Armenia from this angle, including scientific archaeological sites as well as modern institutions and museums. This article presents the major scientific tourism centres of Armenia.

  7. The origin of scientific neurology and its consequences for modern and future neuroscience.

    PubMed

    Steinberg, David A

    2014-01-01

    John Hughlings Jackson (1835-1911) created a science of brain function that, in scope and profundity, is among the great scientific discoveries of the 19th century. It is interesting that the magnitude of his achievement is not completely recognized even among his ardent admirers. Although thousands of practitioners around the world use the clinical applications of his science every day, the principles from which bedside neurology is derived have broader consequences--for modern and future science--that remain unrecognized and unexploited. This paper summarizes the scientific formalism that created modern neurology, demonstrates how its direct implications affect a current area of neuroscientific research, and indicates how Hughlings Jackson's ideas form a path toward a novel solution to an important open problem of the brain and mind.

  8. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  9. Applying "intelligent" materials for materials education: The Labless Lab{trademark}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrade, J.D.; Scheer, R.

    1994-12-31

    A very large number of science and engineering courses taught in colleges and universities today do not involve laboratories. Although good instructors incorporate class demonstrations, hands-on homework, and various teaching aids, including computer simulations, the fact is that students in such courses often accept key concepts and experimental results without discovering them for themselves. The only partial solution to this problem has been the increasing use of class demonstrations and computer simulations. The authors feel strongly that many complex concepts can be observed and assimilated through experimentation with properly designed materials. They propose the development of materials and specimens designed specifically for educational purposes. Intelligent and communicative materials are ideal for this purpose. Specimens which respond in an observable fashion to new environments and situations provided by the student/experimenter provide a far more effective materials science and engineering experience than readouts and data generated by complex and expensive machines, particularly in an introductory course. Modern materials can be designed to literally communicate with the observer. The authors embarked on a project to develop a series of Labless Labs{trademark} utilizing various degrees and levels of intelligence in materials. It is expected that such Labless Labs{trademark} will complement textbooks and computer simulations and provide hands-on reality for students in courses and other learning situations where access to a laboratory is non-existent or limited.

  10. BioVeL: a virtual laboratory for data analysis and modelling in biodiversity science and ecology.

    PubMed

    Hardisty, Alex R; Bacall, Finn; Beard, Niall; Balcázar-Vargas, Maria-Paula; Balech, Bachir; Barcza, Zoltán; Bourlat, Sarah J; De Giovanni, Renato; de Jong, Yde; De Leo, Francesca; Dobor, Laura; Donvito, Giacinto; Fellows, Donal; Guerra, Antonio Fernandez; Ferreira, Nuno; Fetyukova, Yuliya; Fosso, Bruno; Giddy, Jonathan; Goble, Carole; Güntsch, Anton; Haines, Robert; Ernst, Vera Hernández; Hettling, Hannes; Hidy, Dóra; Horváth, Ferenc; Ittzés, Dóra; Ittzés, Péter; Jones, Andrew; Kottmann, Renzo; Kulawik, Robert; Leidenberger, Sonja; Lyytikäinen-Saarenmaa, Päivi; Mathew, Cherian; Morrison, Norman; Nenadic, Aleksandra; de la Hidalga, Abraham Nieva; Obst, Matthias; Oostermeijer, Gerard; Paymal, Elisabeth; Pesole, Graziano; Pinto, Salvatore; Poigné, Axel; Fernandez, Francisco Quevedo; Santamaria, Monica; Saarenmaa, Hannu; Sipos, Gergely; Sylla, Karl-Heinz; Tähtinen, Marko; Vicario, Saverio; Vos, Rutger Aldo; Williams, Alan R; Yilmaz, Pelin

    2016-10-20

    Making forecasts about biodiversity and giving support to policy relies increasingly on large collections of data held electronically, and on substantial computational capability and capacity to analyse, model, simulate and predict using such data. However, the physically distributed nature of data resources and of expertise in advanced analytical tools creates many challenges for the modern scientist. Across the wider biological sciences, presenting such capabilities on the Internet (as "Web services") and using scientific workflow systems to compose them for particular tasks is a practical way to carry out robust "in silico" science. However, use of this approach in biodiversity science and ecology has thus far been quite limited. BioVeL is a virtual laboratory for data analysis and modelling in biodiversity science and ecology, freely accessible via the Internet. BioVeL includes functions for accessing and analysing data through curated Web services; for performing complex in silico analysis through exposure of R programs, workflows, and batch processing functions; for on-line collaboration through sharing of workflows and workflow runs; for experiment documentation through reproducibility and repeatability; and for computational support via seamless connections to supporting computing infrastructures. We developed and improved more than 60 Web services with significant potential in many different kinds of data analysis and modelling tasks. We composed reusable workflows using these Web services, also incorporating R programs. Deploying these tools into an easy-to-use and accessible 'virtual laboratory', free via the Internet, we applied the workflows in several diverse case studies. We opened the virtual laboratory for public use and through a programme of external engagement we actively encouraged scientists and third party application and tool developers to try out the services and contribute to the activity. Our work shows we can deliver an operational, scalable and flexible Internet-based virtual laboratory to meet new demands for data processing and analysis in biodiversity science and ecology. In particular, we have successfully integrated existing and popular tools and practices from different scientific disciplines to be used in biodiversity and ecological research.
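
    In the spirit of the curated Web services described, the sketch below posts a request to a service endpoint. The URL and payload are placeholders, since BioVeL's actual services are typically composed into workflows rather than invoked directly like this.

    ```python
    import requests

    # Hypothetical endpoint standing in for one of the curated Web services;
    # the payload fields are likewise illustrative, not a documented API.
    SERVICE_URL = "https://example.org/biovel/niche-model"  # placeholder URL

    payload = {"species": "Puma concolor", "layers": ["bio1", "bio12"]}
    resp = requests.post(SERVICE_URL, json=payload, timeout=60)
    resp.raise_for_status()
    print(resp.json())  # e.g. a job id or model summary, depending on the service
    ```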

  11. Applications of Mapping and Tomographic Techniques in Gem Sciences

    NASA Astrophysics Data System (ADS)

    Shen, A. H.

    2014-12-01

    Gem sciences are the scientific study of gemstones - their genesis, provenance, synthesis, enhancement, treatment and identification. As high-quality forms of specific minerals, gemstones exhibit unusual physical properties that are usually unseen in their ordinary counterparts. Most gemstones are colored by trace elements incorporated into the crystal lattice during various growth stages, forming coloration zones of various scales. Studying the spectral and chemical contrast across color zones helps elucidate the origins of colors. In modern gemological laboratories this is done with UV-visible microspectrometers and LA-ICPMS. In the case of diamonds, colored zones arise from various structural defects incorporated in different growth zones and are studied with FTIR spectrometers equipped with IR microscopes and with laser photoluminescence spectrometers. Advances in modern synthetic techniques such as chemical vapor deposition (CVD) have created identification problems. Some exploratory experiments in carbon isotope mapping were done on diamonds using SIMS. The most important issue with pearls is to determine whether a particular pearl is cultured or natural; the price difference can be enormous. The classical way of making this identification is by x-ray radiography, which clearly shows the bead and the nacre. Modern cultured-pearl techniques have eliminated the need for an artificial bead, using a small piece of tissue instead. Nowadays, computed x-ray tomography (CT) scanning devices are used to produce clear images of the interior of a pearl. In the Chinese jade market, filling fissures with epoxy and/or wax is very common. We are currently exploring Magnetic Resonance Imaging (MRI) to map the distribution of artificial resin within polycrystalline aggregates.

  12. Beyond postcolonialism ... and postpositivism: circulation and the global history of science.

    PubMed

    Raj, Kapil

    2013-06-01

    This essay traces the parallel, but unrelated, evolution of two sets of reactions to traditional idealist history of science in a world-historical context. While the scholars who fostered the postcolonial approach, in dealing with modern science in the non-West, espoused an idealist vision, they nevertheless stressed its political and ideological underpinnings and engaged with the question of its putative Western roots. The postidealist history of science developed its own vision with respect to the question of the global spread of modern science, paying little heed to postcolonial debates. It then proposes a historiographical approach developed in large part by historians of South Asian politics, economics, and science that, without compromising the preoccupations of each of the two groups, could help construct a mutually comprehensible and connected framework for the understanding of the global workings of the sciences.

  13. [Regulatory science: modern trends in science and education for pharmaceutical products].

    PubMed

    Beregovykh, V V; Piatigorskaia, N V; Aladysheva, Zh I

    2012-01-01

    This article reviews modern trends in the development of new instruments, standards and approaches to assessing drug safety, efficacy and quality in the USA and EU, which can be referred to by the single term "regulatory science"--a new concept for the Russian Federation. New education programs (curricula) developed by US and EU universities within the last 3 years are reviewed. These programs were designed to build a workforce capable of applying a scientific approach to drug regulation. The principal mechanisms used by the Food and Drug Administration to finance research in regulatory science are analyzed. No such science or related research exists in the Russian Federation, despite the high demand, nor does a system of higher education and life-long learning for regulatory affairs (or compliance) specialists.

  14. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computers and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing. Students then learn the principles and technology of spectroscopy in order to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulations and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of Matlab to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted teaching.
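
    A typical exercise in such a course is computing and plotting the amplitude spectrum of a sampled signal. The course itself uses MATLAB; an equivalent outline in Python with NumPy and Matplotlib is sketched below, with arbitrary signal parameters.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Synthesize a toy "spectrum": two sinusoidal lines plus noise.
    fs = 1000.0                       # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
    signal += 0.2 * np.random.randn(t.size)

    # Discrete Fourier transform and one-sided amplitude spectrum.
    spectrum = 2 * np.abs(np.fft.rfft(signal)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    plt.plot(freqs, spectrum)
    plt.xlabel("Frequency (Hz)")
    plt.ylabel("Amplitude")
    plt.show()
    ```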

  15. Rapid protein alignment in the cloud: HAMOND combines fast DIAMOND alignments with Hadoop parallelism.

    PubMed

    Yu, Jia; Blom, Jochen; Sczyrba, Alexander; Goesmann, Alexander

    2017-09-10

    The introduction of next-generation sequencing has caused a steady increase in the amount of data that has to be processed in modern life science. Sequence alignment plays a key role in the analysis of sequencing data, e.g. within whole-genome sequencing or metagenome projects. BLAST is a commonly used alignment tool that was the standard approach for more than two decades, but in recent years faster alternatives have been proposed, including RapSearch, GHOSTX, and DIAMOND. Here we introduce HAMOND, an application that uses Apache Hadoop to parallelize DIAMOND computation in order to scale out the calculation of alignments. HAMOND is fault tolerant and scalable by utilizing large cloud computing infrastructures like Amazon Web Services. HAMOND has been tested in comparative genomics analyses and showed promising results both in efficiency and accuracy. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
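
    The scatter/gather idea behind HAMOND can be caricatured on a single machine: split the queries into shards, align each shard with DIAMOND, and merge the outputs. The sketch below uses multiprocessing instead of Hadoop; the DIAMOND flags follow its standard blastp usage but should be verified against a local installation, and the file names are placeholders.

    ```python
    import subprocess
    from multiprocessing import Pool

    SHARDS = ["queries_00.faa", "queries_01.faa"]  # pre-split FASTA files

    def align(shard):
        """Align one query shard against a prebuilt DIAMOND database."""
        out = shard.replace(".faa", ".tsv")
        subprocess.run(["diamond", "blastp", "-q", shard,
                        "-d", "reference", "-o", out], check=True)
        return out

    if __name__ == "__main__":
        with Pool(len(SHARDS)) as pool:          # scatter: one worker per shard
            parts = pool.map(align, SHARDS)
        with open("matches.tsv", "w") as merged:  # gather: concatenate results
            for part in parts:
                merged.write(open(part).read())
    ```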

  16. Network neuroscience

    PubMed Central

    Bassett, Danielle S; Sporns, Olaf

    2017-01-01

    Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844

  17. The markup is the model: reasoning about systems biology models in the Semantic Web era.

    PubMed

    Kell, Douglas B; Mendes, Pedro

    2008-06-07

    Metabolic control analysis, co-invented by Reinhart Heinrich, is a formalism for the analysis of biochemical networks, and is a highly important intellectual forerunner of modern systems biology. Exchanging ideas and exchanging models are part of the international activities of science and scientists, and the Systems Biology Markup Language (SBML) allows one to perform the latter with great facility. Encoding such models in SBML allows their distributed analysis using loosely coupled workflows, and with the advent of the Internet the various software modules that one might use to analyze biochemical models can reside on entirely different computers and even on different continents. Optimization is at the core of many scientific and biotechnological activities, and Reinhart made many major contributions in this area, stimulating our own activities in the use of the methods of evolutionary computing for optimization.
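
    Reading such a model programmatically is straightforward with the python-libsbml bindings; a minimal sketch follows (the file name is a placeholder, not a model from the paper).

    ```python
    import libsbml  # provided by the python-libsbml package

    # Read an SBML file and enumerate its species and reactions --
    # the "markup is the model" in machine-readable form.
    doc = libsbml.readSBML("model.xml")
    if doc.getNumErrors() > 0:
        doc.printErrors()
    model = doc.getModel()
    print("species:", [model.getSpecies(i).getId()
                       for i in range(model.getNumSpecies())])
    print("reactions:", [model.getReaction(i).getId()
                         for i in range(model.getNumReactions())])
    ```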

  18. Nonlinear Aerodynamics and the Design of Wing Tips

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan

    1991-01-01

    The analysis and design of wing tips for fixed-wing and rotary-wing aircraft remains part art, part science. Although the design of airfoil sections and basic planform geometry is well developed, the tip regions require more detailed consideration. This is important because of the strong impact of wing-tip flow on wing drag; although the tip region constitutes a small portion of the wing, its effect on the drag can be significant. The induced drag of a wing is, for a given lift and speed, inversely proportional to the square of the wing span. Concepts are proposed as a means of reducing this induced drag. Modern computational methods provide a tool for studying these issues in greater detail. The purpose of the current research program is to improve the understanding of the fundamental issues involved in the design of wing tips and to develop the range of computational and experimental tools needed for further study of these ideas.
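
    The inverse-square dependence on span quoted above is the classical lifting-line result. In standard notation, with L the lift, q = ½ρV² the dynamic pressure, b the span, and e the span efficiency factor (e = 1 for an elliptic lift distribution):

    ```latex
    D_i \;=\; \frac{L^2}{q\,\pi e\,b^2}
    \qquad\Longrightarrow\qquad
    D_i \;\propto\; \frac{1}{b^2}\ \ \text{at fixed } L,\ V .
    ```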

  19. Modern Functions of a Textbook on Social Sciences and Humanities as an Informational Management Tool of University Education

    ERIC Educational Resources Information Center

    Nikonova, Elina I.; Sharonov, Ivan A.; Sorokoumova, Svetlana N.; Suvorova, Olga V.; Sorokoumova, Elena A.

    2016-01-01

    The relevance of the study is conditioned by the changes in the content of socio-humanitarian education, aimed at the acquisition of knowledge, the development of tolerance, civic and moral education. The purpose of the paper is to identify the modern functions of a textbook on social sciences and humanities as an informational management tool of…

  20. Measurement of Bitumen Viscosity in a Room-Temperature Drop Experiment: Student Education, Public Outreach and Modern Science in One

    ERIC Educational Resources Information Center

    Widdicombe, A. T.; Ravindrarajah, P.; Sapelkin, A.; Phillips, A. E.; Dunstan, D.; Dove, M. T.; Brazhkin, V. V.; Trachenko, K.

    2014-01-01

    The slow flow of a viscous liquid is a thought-provoking experiment that challenges students, academics and the public to think about some fundamental questions in modern science. In the Queensland demonstration--the world's longest-running experiment, which has earned the Ig Nobel prize--one drop of pitch takes about ten years to fall, leading to…

  1. Modern network science of neurological disorders.

    PubMed

    Stam, Cornelis J

    2014-10-01

    Modern network science has revealed fundamental aspects of normal brain-network organization, such as small-world and scale-free patterns, hierarchical modularity, hubs and rich clubs. The next challenge is to use this knowledge to gain a better understanding of brain disease. Recent developments in the application of network science to conditions such as Alzheimer's disease, multiple sclerosis, traumatic brain injury and epilepsy have challenged the classical concept of neurological disorders being either 'local' or 'global', and have pointed to the overload and failure of hubs as a possible final common pathway in neurological disorders.

  2. The Navajo Learning Network and the NASA Life Sciences/AFOSR Infrastructure Development Project

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The NSF-funded Navajo Learning Network project, with help from NASA Life Sciences and AFOSR, enabled Dine College to take a giant leap forward technologically - in a way that could never have been possible had these projects been managed separately. The combination of these and other efforts created a network of over 500 computers located at ten sites across the Navajo reservation. Additionally, the college was able to install a modern telephone system which shares network data, and to purchase a new higher-education management system. The NASA Life Sciences funds further allowed the college library system to go online and become available to the entire campus community. NSF, NASA and AFOSR are committed to improving minority access to higher-education opportunities and promoting faculty development and undergraduate research through infrastructure support and development. This project has begun to address critical inequalities in access to science, mathematics, engineering and technology for Navajo students and educators. As a result, Navajo K-12 education has been bolstered, and Dine College will therefore better prepare students to transfer successfully to four-year institutions. Due to the integration of the NSF and NASA/AFOSR components of the project, a unified project report is appropriate.

  3. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  4. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  5. Research integrity and rights of indigenous peoples: appropriating Foucault's critique of knowledge/power.

    PubMed

    Swazo, Norman K

    2005-09-01

    In this paper I appropriate the philosophical critique of Michel Foucault as it applies to the engagement of Western science and indigenous peoples in the context of biomedical research. The science of population genetics, specifically as pursued in the Human Genome Diversity Project, is the obvious example to illustrate (a) the contraposition of modern science and 'indigenous science', (b) the tendency to depreciate and marginalize indigenous knowledge systems, and (c) the subsumption of indigenous moral preferences in the juridical armature of international human rights law. I suggest that international bioethicists may learn from Foucault's critique, specifically of the need for vigilance about the knowledge/power relation expressed by the contraposition of modern science and 'indigeneity'.

  6. Towards a Science of Science Teaching

    ERIC Educational Resources Information Center

    Yates, Carolyn

    2009-01-01

    This article is a contribution to the search for evidence-based models of learning to improve science education. The author believes that modern teachers should look to the sciences of cognitive psychology and neuroscience to build a science of science teaching. Understanding the relationships between learning and the brain's structure and…

  7. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  8. Inhibition of the Thyroid Hormone Pathway in Xenopus laevis by 2-mercaptobenzothiazole

    EPA Science Inventory

    Modernizing the battery of EDSP Tier I tests is not only desirable but also inevitable. Advances in science over the past decade have enabled such modernization to occur. The studies presented here establish the groundwork for the Agency to move toward modernization of the AMA,...

  9. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is critically important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computation is relevant. As is well known, an arbitrary quantum computation in the circuit model can be composed of only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum systems lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel while, on the other hand, quantum entanglement leads to a problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual information graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used to improve algorithms in quantum information science.
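
    The core kernel of such a simulator is applying a single-qubit gate to a 2^n-dimensional state vector, a contraction that is naturally parallel across amplitudes. A minimal NumPy sketch of that kernel (an illustration, not the authors' actual code):

    ```python
    import numpy as np

    def apply_single_qubit_gate(state, gate, target, n_qubits):
        """Contract a 2x2 gate with the target qubit's axis of the state tensor."""
        psi = state.reshape([2] * n_qubits)
        psi = np.tensordot(gate, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)
        return psi.reshape(-1)

    n = 3
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                    # |000>
    hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = apply_single_qubit_gate(state, hadamard, 0, n)
    print(np.round(state, 3))                         # (|000> + |100>)/sqrt(2)
    ```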

  10. Romanticism and Romantic Science: Their Contribution to Science Education

    ERIC Educational Resources Information Center

    Hadzigeorgiou, Yannis; Schulz, Roland

    2014-01-01

    The unique contributions of romanticism and romantic science have been generally ignored or undervalued in history and philosophy of science studies and science education. Although more recent research in history of science has come to delineate the value of both topics for the development of modern science, their merit for the educational field…

  11. The Implications for Science Education of Heidegger's Philosophy of Science

    ERIC Educational Resources Information Center

    Shaw, Robert

    2013-01-01

    Science teaching always engages a philosophy of science. This article introduces a modern philosophy of science and indicates its implications for science education. The hermeneutic philosophy of science is the tradition of Kant, Heidegger, and Heelan. Essential to this tradition are two concepts of truth, truth as correspondence and truth as…

  12. High Resolution Nature Runs and the Big Data Challenge

    NASA Technical Reports Server (NTRS)

    Webster, W. Phillip; Duffy, Daniel Q.

    2015-01-01

    NASA's Global Modeling and Assimilation Office at Goddard Space Flight Center is undertaking a series of very computationally intensive nature runs and a downscaled reanalysis. The nature runs use GEOS-5 as an Atmospheric General Circulation Model (AGCM), while the reanalysis uses GEOS-5 in data assimilation mode. This paper will present computational challenges from three runs, two of which are AGCM runs and one of which is a downscaled reanalysis using the full DAS. The nature runs are being completed at two surface grid resolutions, 7 and 3 kilometers, both with 72 vertical levels. The 7 km run spanned two years (2005-2006) and produced 4 PB of data, while the 3 km run will span one year and generate 4 PB of data. The downscaled reanalysis (MERRA-II, Modern-Era Retrospective analysis for Research and Applications) will cover 15 years and generate 1 PB of data. In our efforts to address the big-data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS), a specialization of the concept of business process-as-a-service that is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. In this presentation, we will describe two projects that demonstrate this shift. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS. MERRA/AS enables MapReduce analytics over the MERRA reanalysis data collection by bringing together high-performance computing, scalable data management, and a domain-specific climate data services API. NASA's High-Performance Science Cloud (HPSC) is an example of the type of compute-storage fabric required to support CAaaS. The HPSC comprises a high-speed InfiniBand network, high-performance file systems and object storage, and virtual system environments tailored to data-intensive science applications. These technologies provide a new tier in the data and analytic services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. In our experience, CAaaS lowers the barriers and risk to organizational change, fosters innovation and experimentation, and provides the agility required to meet our customers' increasing and changing needs.
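
    The MapReduce pattern behind such analytics can be shown in miniature: map a partial statistic over data shards, then reduce the partials into a global result. A toy sketch with made-up values; the real service runs over distributed reanalysis files, not in-memory lists.

    ```python
    from functools import reduce

    # Hypothetical per-shard temperature samples (Kelvin).
    shards = [[287.1, 288.4, 286.9], [289.0, 290.2], [285.5, 286.0, 287.3]]

    mapped = [(sum(s), len(s)) for s in shards]                  # map: (sum, count)
    total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)
    print(f"global mean temperature: {total / count:.2f} K")     # reduce: combine
    ```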

  13. Non-Euclidean Space, Movement and Astronomy in Modern Art: Alexander Calder's Mobiles and Ben Nicholson's Reliefs

    NASA Astrophysics Data System (ADS)

    Malloy, Vanja

    2013-09-01

    John Keats once wrote that 'there is no such thing as time and space'; rather, he believed that time and space are mental constructs, subject to a variety of forms and as diverse as the human mind. From the 1920s through the 1930s, modern physics in many ways supported this idea through the popular philosophical writings on the theory of general relativity by scientists such as Arthur Eddington and Albert Einstein. These new concepts of modern physics fundamentally changed our understanding of time and space and had substantial philosophical implications, which were absorbed by modern artists, resulting in the 1936 Dimensionist Manifesto. Seeking to internalize the developments of modern science within modern art, this manifesto was widely endorsed by the most prominent figures of the avant-garde, such as Marcel Duchamp, Jean Arp, Naum Gabo, Joan Miró, László Moholy-Nagy, Wassily Kandinsky and Alexander Calder. Of particular interest to this manifesto was the new concept of the fourth dimension, which in many ways revolutionized the arts. Importantly, its interpretation varied widely in the artistic community, ranging from a purely physical four-dimensional space, to a kinetic concept in which space and time are linked, to a metaphysical interest in a space that exists beyond the material realm. The impact of modern science and astronomy on avant-garde art is currently a burgeoning area of research, with considerable implications for our rethinking of important artistic figures of this era. Through a case study of Alexander Calder's mobiles and Ben Nicholson's reliefs, this paper explores how these artworks were informed by an interest in modern science.

  14. Writings on Physics and Philosophy

    NASA Astrophysics Data System (ADS)

    Pauli, Wolfgang Enz, Charles P.; Meyenn, Karl V.

    Like Bohr, Einstein and Heisenberg, Wolfgang Pauli was not only a Nobel laureate and one of the creators of modern physics, but also an eminent philosopher of modern science. This is the first book in English to include all his famous articles on physics and epistemology. They were actually translated during Pauli's lifetime by R. Schlapp and are now edited and annotated by Pauli's former assistant Ch. Enz. Pauli writes about the philosophical significance of complementarity, about space, time and causality, symmetry and the exclusion principle, but also about the role of the unconscious in modern science. His famous article on Kepler is included, as well as many historical essays on Bohr, Ehrenfest, and Einstein and on the influence of the unconscious on scientific theories. The book addresses not only physicists, philosophers and historians of science, but also the general public.

  15. A Not-So-Gentle Refutation of the Defence of Homeopathy.

    PubMed

    Zawiła-Niedźwiecki, Jakub; Olender, Jacek

    2016-03-01

    In a recent paper, Levy, Gadd, Kerridge, and Komesaroff attempt to defend the ethicality of homeopathy by attacking the utilitarian ethical framework as a basis for medical ethics and by introducing a distinction between evidence-based medicine and modern science. This paper demonstrates that their argumentation is not only insufficient to achieve that goal but also incorrect. Utilitarianism is not required to show that homeopathic practice is unethical; indeed, any normative basis of medical ethics will make it unethical, as a defence of homeopathic practice requires the rejection of modern natural sciences, which are an integral part of medical ethics systems. This paper also points out that evidence-based medicine lies at the very core of modern science. Particular arguments made by Levy et al. within the principlist medical ethics normative system are also shown to be wrong.

  16. "The Name of the Rose": A Path to Discuss the Birth of Modern Science

    ERIC Educational Resources Information Center

    Guerra, Andreia; Braga, Marco

    2014-01-01

    Various science education researchers believe that science tuition should include some discussion about how science has developed over time. Therefore, deliberations about the nature of science should be integrated in the science curriculum. Many researchers argue that teaching the history of science is a good way to place the nature of science in…

  17. Crop improvement using life cycle datasets acquired under field conditions.

    PubMed

    Mochida, Keiichi; Saisho, Daisuke; Hirayama, Takashi

    2015-01-01

    Crops are exposed to various environmental stresses in the field throughout their life cycle. Modern plant science has provided remarkable insights into the molecular networks of plant stress responses under laboratory conditions, but the responses of different crops to environmental stresses in the field remain to be elucidated. Recent advances in omics analytical techniques and information technology have enabled us to integrate data from a spectrum of physiological metrics of field crops. The interdisciplinary efforts of plant science and data science enable us to explore factors that affect crop productivity and to identify stress tolerance-related genes and alleles. Here, we describe recent advances in technologies that are key components of data-driven crop design, such as population genomics, chronological omics analyses, and computer-aided molecular network prediction. Integration of the outcomes from these technologies will accelerate our understanding of crop phenology under practical field conditions and identify key characteristics that represent crop stress status. These elements would help us to genetically engineer "designed crops" that prevent yield shortfalls caused by environmental fluctuations under future climate change.

  18. Magic Universe - The Oxford Guide to Modern Science

    NASA Astrophysics Data System (ADS)

    Calder, Nigel

    2003-11-01

    As a prolific author, BBC commentator, and magazine editor, Nigel Calder has spent a lifetime spotting and explaining the big discoveries in all branches of science. In Magic Universe , he draws on his vast experience to offer readers a lively, far-reaching look at modern science in all its glory, shedding light on the latest ideas in physics, biology, chemistry, medicine, astronomy, and many other fields. What is truly magical about Magic Universe is Calder's incredible breadth. Migrating birds, light sensors in the human eye, black holes, antimatter, buckyballs and nanotubes--with exhilarating sweep, Calder can range from the strings of a piano to the superstrings of modern physics, from Pythagoras's theory of musical pitch to the most recent ideas about atoms and gravity and a ten-dimensional universe--all in one essay. The great virtue of this wide-ranging style--besides its liveliness and versatility--is that it allows Calder to illuminate how the modern sciences intermingle and cross-fertilize one another. Indeed, whether discussing astronauts or handedness or dinosaurs, Calder manages to tease out hidden connections between disparate fields of study. What is most wondrous about the "magic universe" is that one can begin with stellar dust and finish with life itself. Drawing on interviews with more than 200 researchers, from graduate students to Nobel prize-winners, Magic Universe takes us on a high-spirited tour through the halls of science, one that will enthrall everyone interested in science, whether a young researcher in a high-tech lab or an amateur buff sitting in the comfort of an armchair.

  19. Science communication as political communication

    PubMed Central

    Scheufele, Dietram A.

    2014-01-01

    Scientific debates in modern societies often blur the lines between the science that is being debated and the political, moral, and legal implications that come with its societal applications. This manuscript traces the origins of this phenomenon to professional norms within the scientific discipline and to the nature and complexities of modern science and offers an expanded model of science communication that takes into account the political contexts in which science communication takes place. In a second step, it explores what we know from empirical work in political communication, public opinion research, and communication research about the dynamics that determine how issues are debated and attitudes are formed in political environments. Finally, it discusses how and why it will be increasingly important for science communicators to draw from these different literatures to ensure that the voice of the scientific community is heard in the broader societal debates surrounding science. PMID:25225389

  20. Science communication as political communication.

    PubMed

    Scheufele, Dietram A

    2014-09-16

    Scientific debates in modern societies often blur the lines between the science that is being debated and the political, moral, and legal implications that come with its societal applications. This manuscript traces the origins of this phenomenon to professional norms within the scientific discipline and to the nature and complexities of modern science and offers an expanded model of science communication that takes into account the political contexts in which science communication takes place. In a second step, it explores what we know from empirical work in political communication, public opinion research, and communication research about the dynamics that determine how issues are debated and attitudes are formed in political environments. Finally, it discusses how and why it will be increasingly important for science communicators to draw from these different literatures to ensure that the voice of the scientific community is heard in the broader societal debates surrounding science.

  1. Western teachers of science or teachers of Western science: On the influence of Western modern science in a post-colonial context

    NASA Astrophysics Data System (ADS)

    Burke, Lydia E. Carol-Ann

    An expanding body of research explores the social, political, cultural and personal challenges presented by the Western emphasis of curricula around the world. The aim of my study is to advance this field of inquiry by gaining insight into perceptions of Western modern science presented by students, teachers and administrators in a given Caribbean setting. Through this study I asked how my research participants described the nature of scientific knowledge, how they related scientific knowledge to other culturally-valued knowledges and the meanings they attached to the geographic origins of science teachers. Situating this work firmly within the practice of Foucauldian critical discourse analysis, I have utilised a conceptual framework defined by the power/knowledge and complicity/resistance themes of post-colonial theory to support my interpretation of participant commentary in an overall quest that is concerned about the ways in which Western modern science might be exerting a colonising influence. Fourteen students, nine teachers (both expatriate and local) and three administrators participated in the study. I combined a semi-structured question and answer interview format with a card sort activity. I used a procedure based on my own adaptation of Stephenson's Q methodology, where the respondents placed 24 statements hierarchically along a continuum of increasing strength of agreement, presenting their rationalisations, personal stories and illustrations as they sorted. I used an inverse factor analysis, in combination with the interview transcripts, to assist me in the identification of three discourse positions described by my research participants: The truth value of scientific knowledge, The pragmatic use of science to promote progress, and The priority of cultural preservation. The interview transcripts were also analysed for emergent themes, providing an additional layer of data interpretation. The research findings raise concerns regarding the hegemonic potency of certain scientific assumptions and assertions of participants, leading me to emphasise the importance of developing teachers' knowledge of the historical, philosophical and social background of Western modern science as well as focusing on developing the conceptual and intellectual engagement of students with Western modern science without demanding the kind of belief commitment that would insist that students replace alternative modes of meaning making.

  2. Does Geophysics Need "A new kind of Science"?

    NASA Astrophysics Data System (ADS)

    Turcotte, D. L.; Rundle, J. B.

    2002-12-01

    Stephen Wolfram's book "A New Kind of Science" has received a great deal of attention in the last six months, both positive and negative. The theme of the book is that "cellular automata", which arise from spatial and temporal coarse-graining of equations of motion, provide the foundations for a new nonlinear science of "complexity". The old science is the science of partial differential equations. Some of the major contributions of this old science have been in geophysics, e.g. gravity, magnetics, seismic waves, and heat flow. The basis of the new science is the use of massive computing and numerical simulations. The new science is motivated by the observations that many physical systems display a vast multiplicity of space and time scales, and have hidden dynamics that in many cases are impossible to observe directly. An example would be molecular dynamics: statistical physics derives continuum equations from the discrete interactions between atoms and molecules; in the modern world, the continuum equations are then discretized using finite differences, finite elements, etc. in order to obtain numerical solutions. Examples of widely used cellular automata models include diffusion-limited aggregation and site percolation, as well as the class of models said to exhibit self-organized criticality: the sand-pile model, the slider-block model, and the forest-fire model. Applications of these models include drainage networks, seismicity, distributions of minerals, and the evolution of landforms and coastlines. Simple iterative models, e.g. the logistic map, generate deterministic chaos.
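
    The closing reference to the logistic map is easy to make concrete. The sketch below is an illustration added here (not material from the book or the record): it iterates the logistic map x_{n+1} = r*x_n*(1 - x_n) at r = 4 and shows two nearby orbits separating rapidly, the signature of deterministic chaos; the initial conditions are arbitrary.

        # Minimal illustration of deterministic chaos via the logistic map,
        # x_{n+1} = r * x_n * (1 - x_n).  At r = 4.0 nearby orbits diverge.

        def logistic_orbit(x0, r=4.0, steps=50):
            """Return the orbit of the logistic map starting from x0."""
            orbit = [x0]
            for _ in range(steps):
                x0 = r * x0 * (1.0 - x0)
                orbit.append(x0)
            return orbit

        a = logistic_orbit(0.4000)
        b = logistic_orbit(0.4001)   # tiny perturbation of the initial state
        for n in (0, 10, 20, 30):
            print(n, abs(a[n] - b[n]))   # separation grows by orders of magnitude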

  3. The Data Science Landscape

    NASA Astrophysics Data System (ADS)

    Mentzel, C.

    2017-12-01

    Modern scientific data continue to increase in volume, variety, and velocity, and though the hype of big data has subsided, its usefulness for scientific discovery has only just begun. Harnessing these data for new insights, more efficient decision making, and other mission-critical uses requires a combination of skills and expertise, often labeled data science. Data science can be thought of as a combination of statistics, computation and the domain to which the data relate, and so is a true interdisciplinary pursuit. Though it has reaped large benefits in companies able to afford the high cost of the severely limited talent pool, it suffers from a lack of support in mission-driven organizations. Belonging purely to no single historical field, data science has proven difficult to house in traditional university academic departments and other research organizations. The landscape of data science efforts, from academia, industry and government, can be characterized as nascent, enthusiastic, uneven, and highly competitive. Part of the challenge in documenting these trends is the lack of agreement about what data science is, and who is a data scientist. Defining these terms too closely and too early runs the risk of cutting off a tremendous amount of productive creativity, but waiting too long leaves many people without a sustainable career, and many organizations without the necessary skills to gain value from their data. This talk will explore the landscape of data science efforts in the US, including how organizations are building and sustaining data science teams.

  4. Modern astronomical knowledge as component of general education for sustainable development

    NASA Astrophysics Data System (ADS)

    Nurgaliev, I.

    It is shown that 1) astronomical knowledge was a foundation for emerging modern physics and the mathematically based natural sciences, and 2) the mathematical basis of the natural sciences serves as an orientation for progress toward the true objectives of the social sciences. The latest example in this chain of influences is the discovery of the fundamental demographic equation, dN/dt = aN^2 - bN, which is full of astronomical analogy [9]. The modern age places new imperatives on education. Reckless exploitation of natural resources will cause irreversible exhaustion of the agro- and bio-potential of the planet within the lifetime of a few generations. The adequate response to this challenge lies in modern technologies and in educating responsible (socially oriented) professionals. That is why it is increasingly important to teach modern technologies while providing students with an understanding of the global long-term consequences of human industrial activity. The course "Theoretical Foundations of Modern Technologies" at the Moscow State Agricultural University (Timiryazev Academy), taught by the author, is discussed. The new experimental project "Space Technologies, Ecology and Safe Energetics in School of the Future", currently being implemented in Moscow secondary schools by the author and colleagues, is presented as a project of a new age. The new cosmological models in the frame of the Newtonian and general relativistic treatments developed by the author are considered in this report as an example of immediate implementation of new astro-knowledge in the education of modern agrarian students. The centrifugal forces acting between particles rotating randomly around each other are shown to be able to reverse gravitational collapse.
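
    The inline formula above is compressed; read as a rate equation it is presumably dN/dt = aN^2 - bN, which has an unstable equilibrium at N = b/a separating decay from accelerating (hyperbolic) growth. The sketch below integrates that assumed reading by forward Euler; the parameter values are arbitrary illustrations, not values from the report.

        # Hedged reading of the record's compressed formula: dN/dt = a*N^2 - b*N.
        # Parameters and initial values are illustrative only.

        def integrate(n0, a=1e-3, b=0.05, dt=0.1, t_max=200.0):
            """Forward-Euler integration of dN/dt = a*N^2 - b*N."""
            n, t = n0, 0.0
            while t < t_max and n < 1e6:
                n += dt * (a * n * n - b * n)
                t += dt
            return t, n

        print(integrate(40.0))   # below the unstable point b/a = 50: decays
        print(integrate(60.0))   # above it: runaway, finite-time blow-up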

  5. PREFACE: 9th World Congress on Computational Mechanics and 4th Asian Pacific Congress on Computational Mechanics

    NASA Astrophysics Data System (ADS)

    Khalili, N.; Valliappan, S.; Li, Q.; Russell, A.

    2010-07-01

    The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science are undertaken throughout the world. Today, theories of mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of the IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986, followed by Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006 and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was held in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007. The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with free access online in IOP Conference Series: Materials Science and Engineering. The editors acknowledge the help of the paper reviewers in maintaining a high standard of assessment and the co-operation of the authors in complying with the requirements of the editors and the reviewers. We would also like to take this opportunity to thank the members of the Local Organising Committee and the International Scientific Committee for helping make WCCM/APCOM 2010 a successful event. We also thank The University of New South Wales, The University of Newcastle, the Centre for Infrastructure Engineering and Safety (CIES), IACM, APACM and AACM for their financial support, along with the United States Association for Computational Mechanics for the Travel Awards made available. N. Khalili, S. Valliappan, Q. Li, A. Russell 19 July 2010 Sydney, Australia

  6. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science through Cloud-Enabled Climate Analytics-as-a-Service

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.

    2013-12-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
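
    As a schematic of the MapReduce pattern mentioned above (this is not the MERRA/AS API; the record format is hypothetical, with MERRA-style variable names used only as labels), the sketch below computes a per-variable mean with explicit map and reduce phases.

        # Schematic MapReduce: map emits (key, value) pairs per record,
        # reduce aggregates per key.  Record format here is hypothetical.
        from collections import defaultdict

        records = [("T2M", 287.1), ("T2M", 286.4), ("PRECTOT", 2.9e-5)]

        def map_phase(recs):
            for var, value in recs:
                yield var, (value, 1)            # partial sum and count

        def reduce_phase(pairs):
            acc = defaultdict(lambda: [0.0, 0])
            for var, (s, c) in pairs:
                acc[var][0] += s
                acc[var][1] += c
            return {var: s / c for var, (s, c) in acc.items()}

        print(reduce_phase(map_phase(records)))  # per-variable means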

  7. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.

    2014-01-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.

  8. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to handle the entire process effectively and efficiently. This would help maximize the time dedicated to answering research questions, and minimize the time and expense spent on data processing, quality control and station management. Cross-sharing stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and a web service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final, fully corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated, time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations can form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences & Agricultural Academy of Sciences) and by smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe.

  9. Space-Time, Relativity, and Cosmology

    NASA Astrophysics Data System (ADS)

    Wudka, Jose

    2006-07-01

    Space-Time, Relativity and Cosmology provides a historical introduction to modern relativistic cosmology and traces its historical roots and evolution from antiquity to Einstein. The topics are presented in a non-mathematical manner, with the emphasis on the ideas that underlie each theory rather than their detailed quantitative consequences. A significant part of the book focuses on the Special and General theories of relativity. The tests and experimental evidence supporting the theories are explained together with their predictions and their confirmation. Other topics include a discussion of modern relativistic cosmology, the consequences of Hubble's observations leading to the Big Bang hypothesis, and an overview of the most exciting research topics in relativistic cosmology. This textbook is intended for introductory undergraduate courses on the foundations of modern physics. It is also accessible to advanced high school students, as well as non-science majors who are concerned with science issues. The book uses a historical perspective to describe the evolution of modern ideas about space and time; its main arguments are presented in a completely non-mathematical way; and it is ideal for physics undergraduates, high school students, non-science majors and general readers.

  10. ODI - Portal, Pipeline, and Archive (ODI-PPA): a web-based astronomical compute archive, visualization, and analysis service

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin

    2014-07-01

    The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers with a modern web interface acting as a single point of access to their data, along with rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets, and to enhance WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive with built-in frameworks including: (1) Collections, which allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or to execute data processing tasks; (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5-capable web browser, with overlaid standard-catalog and Source Extractor-generated source markers; and (3) a Workflow framework, which enables rapid integration of data processing pipelines on an associated compute cluster and lets users request such pipelines to be executed on their data via custom user interfaces. ODI-PPA is made up of several lightweight services connected by a message bus; the web portal is built using the Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, with backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.

  11. Science without laws.

    PubMed

    Schweber, Silvan S

    2009-01-01

    During the 1970s, something deeply consequential happened in the cultural, economic, and social relationships between science and technology. Paul Forman has proposed that the abrupt reversal of the culturally ascribed primacy in the science-technology relationship circa 1980 be taken as a demarcation of postmodernity from modernity. Modernity's most basic cultural presuppositions (the superiority of theory to practice, the elevation of the public over the private and that of the disinterested over the interested, and the belief that the means sanctify the ends) were ascribed to science. In postmodernity, science is subsumed under technology, and the status of technology relative to science reflects our pragmatic-utilitarian subordination of means to ends. These cultural changes have resonated with deep epistemological and ontological changes within the sciences themselves, and all these have manifested themselves in universities becoming entrepreneurial, and the consequences thereof. Science Without Laws insightfully illustrates some of the changes within the life and human sciences by analyzing the role played by model systems and case studies.

  12. The Nature of Science and the Role of Knowledge and Belief

    NASA Astrophysics Data System (ADS)

    Cobern, William W.

    In everyday language we tend to think of knowledge as reasoned belief that a proposition is true, and the natural sciences provide the archetypal example of what it means to know. Religious and ideological propositions are the typical examples of believed propositions. Moreover, the radical empiricist worldview so often associated with modern science has eroded society's meaningful sense of life. Western history, however, shows that knowledge and belief have not always been constructed separately. In addition, modern developments in the philosophy and history of science have seriously undermined the radical empiricist's excessive confidence in scientific methods. Acknowledging in the science classroom the parallel structure of knowledge and belief, and recognizing that science requires a presuppositional foundation that is itself not empirically verifiable, would re-introduce a valuable discussion on the meaning of science and its impact on life. Science would be less likely to be taught as a 'rhetoric of conclusions'. The discussion would also help students to gain a firmer integration of science with other important knowledge and beliefs that they hold.

  13. Young Adults’ Belief in Genetic Determinism, and Knowledge and Attitudes towards Modern Genetics and Genomics: The PUGGS Questionnaire

    PubMed Central

    Carver, Rebecca Bruu; Castéra, Jérémy; Gericke, Niklas; Evangelista, Neima Alice Menezes

    2017-01-01

    In this paper we present the development and validation of a comprehensive questionnaire to assess college students' knowledge about modern genetics and genomics, their belief in genetic determinism, and their attitudes towards applications of modern genetics and genomic-based technologies. Written in everyday language with minimal jargon, the Public Understanding and Attitudes towards Genetics and Genomics (PUGGS) questionnaire is intended for use in research on science education and public understanding of science, as a means to investigate relationships between knowledge, determinism and attitudes about modern genetics, which are to date little understood. We developed a set of core ideas and initial items from reviewing the scientific literature on genetics and previous studies on public and student knowledge and attitudes about genetics. Seventeen international experts from different fields (e.g., genetics, education, philosophy of science) reviewed the initial items and their feedback was used to revise the questionnaire. We validated the questionnaire in two pilot tests with samples of university freshmen students. The final questionnaire contains 45 items, including both multiple choice and Likert scale response formats. Cronbach's alpha showed good reliability for each section of the questionnaire. In conclusion, the PUGGS questionnaire is a reliable tool for investigating public understanding and attitudes towards modern genetics and genomic-based technologies. PMID:28114357

  14. Young Adults' Belief in Genetic Determinism, and Knowledge and Attitudes towards Modern Genetics and Genomics: The PUGGS Questionnaire.

    PubMed

    Carver, Rebecca Bruu; Castéra, Jérémy; Gericke, Niklas; Evangelista, Neima Alice Menezes; El-Hani, Charbel N

    2017-01-01

    In this paper we present the development and validation of a comprehensive questionnaire to assess college students' knowledge about modern genetics and genomics, their belief in genetic determinism, and their attitudes towards applications of modern genetics and genomic-based technologies. Written in everyday language with minimal jargon, the Public Understanding and Attitudes towards Genetics and Genomics (PUGGS) questionnaire is intended for use in research on science education and public understanding of science, as a means to investigate relationships between knowledge, determinism and attitudes about modern genetics, which are to date little understood. We developed a set of core ideas and initial items from reviewing the scientific literature on genetics and previous studies on public and student knowledge and attitudes about genetics. Seventeen international experts from different fields (e.g., genetics, education, philosophy of science) reviewed the initial items and their feedback was used to revise the questionnaire. We validated the questionnaire in two pilot tests with samples of university freshmen students. The final questionnaire contains 45 items, including both multiple choice and Likert scale response formats. Cronbach's alpha showed good reliability for each section of the questionnaire. In conclusion, the PUGGS questionnaire is a reliable tool for investigating public understanding and attitudes towards modern genetics and genomic-based technologies.

  15. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures and on software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMMs). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
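
    For readers unfamiliar with the workload being accelerated, the sketch below shows the kind of dynamic-programming kernel at the heart of HMM sequence analysis: a toy two-state Viterbi decoder. HMMER's profile HMMs are far richer (match/insert/delete states, log-odds scoring), so this is only a minimal illustration; every probability here is invented.

        # Toy Viterbi decoding for a two-state HMM over DNA.  The nested
        # max-plus recurrence is the compute-intensive kernel that the
        # surveyed software and hardware accelerations target.
        import math

        states = ("M", "B")                      # hypothetical: motif vs background
        start  = {"M": 0.5, "B": 0.5}
        trans  = {"M": {"M": 0.8, "B": 0.2}, "B": {"M": 0.2, "B": 0.8}}
        emit   = {"M": {"A": 0.4, "C": 0.4, "G": 0.1, "T": 0.1},
                  "B": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

        def viterbi(seq):
            v = [{s: math.log(start[s] * emit[s][seq[0]]) for s in states}]
            back = []
            for ch in seq[1:]:
                col, ptr = {}, {}
                for s in states:
                    prev = max(states, key=lambda p: v[-1][p] + math.log(trans[p][s]))
                    col[s] = v[-1][prev] + math.log(trans[prev][s] * emit[s][ch])
                    ptr[s] = prev
                v.append(col)
                back.append(ptr)
            last = max(states, key=lambda s: v[-1][s])
            path = [last]
            for ptr in reversed(back):
                path.append(ptr[path[-1]])
            return "".join(reversed(path))

        print(viterbi("ACACGTGT"))               # most probable state path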

  16. Where Tradition and "Modern" Knowledge Meet: Exploring Two Islamic Schools in Singapore and Britain

    ERIC Educational Resources Information Center

    Tan, Charlene

    2011-01-01

    Muslims live in a "modern" world where subjects such as the English language, mathematics, sciences, and information and communication technology (ICT) are highly valued and enthusiastically transmitted in schools. How some Islamic schools attempt to equip their students with "modern knowledge" while remaining faithful to their…

  17. Attitudes of Trainers and Medical Students towards Using Modern Practices

    ERIC Educational Resources Information Center

    Hadzhiiliev, Vassil Stefanov; Dobreva, Zhaneta Stoykova

    2011-01-01

    The development of universities as independent scientific centers determines their mission to incorporate the most modern achievements of science into the students' practical training. This research on the attitudes of the participants in this process towards the use of modern practices encompasses both trainers and students, and it consists of…

  18. The "Next Generation Science Standards" and the Earth and Space Sciences

    ERIC Educational Resources Information Center

    Wysession, Michael E.

    2013-01-01

    The "Next Generation Science Standards" ("NGSS"), due to be released this spring, represents a revolutionary step toward establishing modern, national K-12 science education standards. Based on the recommendations of the National Research Council's "A Framework for K-12 Science Education: Practices, Crosscutting…

  19. Gender Equity in Science Education

    ERIC Educational Resources Information Center

    Hall, Johanna R.

    2011-01-01

    The dearth of females in high-level science courses and professions is a well-documented phenomenon in modern society. Inequality in science instruction is a crucial component of the underrepresentation of females in science. This paper provides a review of current literature published concerning gender inequality in K-12 science instruction.…

  20. Soils regulate and mitigate climate change

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: The interaction of soil science and ecology can be traced back to the origins of soil science as an independent discipline within the natural sciences. Vasili Dokuchaev, the founder of modern soil science, identified five soil forming factors: parent material, climate, o...

  1. Gendered Obstacles Faced by Historical Women in Physics and Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Kristen M.

    2007-12-01

    A gender gap still exists in modern science; this is especially evident in the fields of physics and astronomy. The cause of such a gap is the center of debate. Is this discrepancy the result of inherent ability or socialization? Most studies have focused on modern issues and how women are socialized today. The role of historical gender perspectives and social opinions in creating the field of modern science and any discrepancies within it has not yet been explored in depth. This project investigates the obstacles faced by historical women in physics and astronomy that stem from the officialized gender biases that accompanied the establishment of modern science. Such obstacles are both formal and informal. Four women were chosen to span the three hundred year period between the standardization of the field and the modern day: Laura Bassi, Mary Somerville, Lise Meitner, and Jocelyn Bell Burnell. The investigation reveals that formal obstacles significantly decreased over the time period, while informal obstacles eroded more gradually. Obstacles also reflected historical events such as the World Wars and the Enlightenment. Trends in obstacles faced by four prominent women physicists indicate that education, finances, support networks, and social opinion played a large role in determining success in the field. The applicability to modern day physics issues and the gender gap is discussed. Many thanks to the Pathways Scholars Program and the Ronald E. McNair Post-Baccalaureate Achievement Program for funding for this project.

  2. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    PubMed

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.

  3. Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis

    PubMed Central

    Doblack, Benjamin N.; Allis, Tim; Dávila, Lilian P.

    2014-01-01

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced. PMID:25549300

  4. [Hans Gross and the beginning of criminology on a scientific basis].

    PubMed

    Bachhiesl, Christian

    2007-01-01

    Modern criminology--if one wants to consider it a separate scientific discipline at all--is usually perceived as being mainly influenced by the methods of the natural sciences, supplemented by components from the field of psychology, which, at least in some of its conceptions, tends to define itself as a natural science too. If we take a look at the history of science, we will see that the development of criminology in this direction was not necessarily inevitable. The scientific work of the Austrian Hans Gross (1847-1915), one of the founding fathers of scientific criminology, serves as an example of how the natural sciences and their exact methods became established in the methodological apparatus of modern criminology, although in practice his claim for the application of exact methods was all too often replaced by irrational and intuitive ways of working. Still, Hans Gross' fundamental decision in favour of the exact methods derived from the natural sciences is an important step towards a criminology that can be understood as a part of the natural sciences, largely superseding the methods of the cultural sciences and anthropological philosophy. This approach made the (criminal) human being an object of measurement and can result in a concept of man as a mere phenomenon of quantity. This is, on the one hand, ethically questionable; on the other hand, it made modern criminology more efficient and successful.

  5. Self-reference and predictive, normative and prescriptive approaches in applications of systems thinking in social sciences—(Survey)

    NASA Astrophysics Data System (ADS)

    Mesjasz, Czesław

    2000-05-01

    Cybernetics, systems thinking and systems theory have been viewed as instruments for enhancing the predictive, normative and prescriptive capabilities of the social sciences, beginning with microscale management and ending with various references to the global system. Descriptions, explanations and predictions achieved thanks to various systems ideas were also viewed as supportive of the potential governance of social phenomena. The main aim of the paper is to examine the possible applications of modern systems thinking in predictive, normative and prescriptive approaches in the modern social sciences, from management theory to global studies. Attention is paid not only to "classical" mathematical systems models but also to the role of predictive, normative and prescriptive interpretations of analogies and metaphors associated with the application of classical ("first-order cybernetics") and modern ("second-order cybernetics", "complexity theory") systems thinking in the social sciences.

  6. Voluntarist theology and early-modern science: The matter of the divine power, absolute and ordained.

    PubMed

    Oakley, Francis

    2018-03-01

    This paper is an intervention in the debate inaugurated by Peter Harrison in 2002 when he called into question the validity of what has come to be called 'the voluntarism and early-modern science thesis'. Though it subsequently drew support from such historians of science as J. E. McGuire, Margaret Osler, and Betty Jo Teeter Dobbs, the origins of the thesis are usually traced back to articles published in 1934 and 1961, respectively, by the philosopher Michael Foster and the historian of ideas Francis Oakley. Central to Harrison's critique of the thesis are claims he made about the meaning of the scholastic distinction between the potentia dei absoluta et ordinata and the role it played in the thinking of early-modern theologians and natural philosophers. This paper calls directly into question the accuracy of Harrison's claims on that very matter.

  7. Data Science Priorities for a University Hospital-Based Institute of Infectious Diseases: A Viewpoint.

    PubMed

    Valleron, Alain-Jacques

    2017-08-15

    Automation of laboratory tests, bioinformatic analysis of biological sequences, and professional data management are used routinely in a modern university hospital-based infectious diseases institute. This dates back to at least the 1980s. However, the scientific methods of the 21st century are changing with the increased power and speed of computers, with the "big data" revolution having already happened in genomics and environmental science and now arriving in medical informatics. Research will be increasingly "data driven," and the powerful machine learning methods whose efficiency is demonstrated in daily life will also revolutionize medical research. A university-based institute of infectious diseases must therefore not only gather excellent computer scientists and statisticians (as in the past, and as in any medical discipline), but also fully integrate the biologists and clinicians with these computer scientists, statisticians, and mathematical modelers having a broad culture in machine learning, knowledge representation, and knowledge discovery.

  8. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to the important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers and best practice for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify the characteristics of data-intensive applications and to deduce the needed runtimes and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache Big Data Stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL (Scalable Parallel Interoperable Data Analytics Library), built on the system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas, including polar science.
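
    As an example of the kind of analytics kernel such a library might provide (a judgment call: the abstract does not enumerate SPIDAL's contents, but k-means is a standard kernel in dwarf-style benchmark suites), the serial NumPy sketch below shows the computational core that an HPC-ABDS runtime would distribute and parallelize.

        # Serial k-means sketch; illustrative of a SPIDAL-style kernel,
        # not code from the SPIDAL project.
        import numpy as np

        def kmeans(points, k, iters=20, seed=0):
            rng = np.random.default_rng(seed)
            centers = points[rng.choice(len(points), size=k, replace=False)]
            for _ in range(iters):
                # assign each point to its nearest center
                d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
                labels = d.argmin(axis=1)
                # recompute centers, keeping the old one if a cluster empties
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = points[labels == j].mean(axis=0)
            return centers, labels

        rng = np.random.default_rng(1)
        pts = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
        print(kmeans(pts, k=2)[0])               # two well-separated centers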

  9. Science on Wheels

    ERIC Educational Resources Information Center

    Savitz, Maxine L.

    1973-01-01

    A science program was developed which is based on a mobile laboratory containing scientific experiments in biology, chemistry, physics, applied science, and mathematics. Discussion and experiments differ from the normal classroom setting as they utilize small groups and center around the relationship of modern science and technology of the urban…

  10. Securing Secrets and Managing Trust in Modern Computing Applications

    ERIC Educational Resources Information Center

    Sayler, Andy

    2016-01-01

    The amount of digital data generated and stored by users increases every day. In order to protect this data, modern computing systems employ numerous cryptographic and access control solutions. Almost all of such solutions, however, require the keeping of certain secrets as the basis of their security models. How best to securely store and control…

  11. Implementation and Evaluation of Flipped Classroom as IoT Element into Learning Process of Computer Network Education

    ERIC Educational Resources Information Center

    Zhamanov, Azamat; Yoo, Seong-Moo; Sakhiyeva, Zhulduz; Zhaparov, Meirambek

    2018-01-01

    Students nowadays are hard to motivate to study with traditional teaching methods. Computers, smartphones, tablets and other smart devices distract students' attention. Nevertheless, those smart devices can be used as auxiliary tools for modern teaching methods. In this article, the authors review two popular modern teaching methods:…

  12. Science Teacher Identity and Eco-Transformation of Science Education: Comparing Western Modernism with Confucianism and Reflexive "Bildung"

    ERIC Educational Resources Information Center

    Sjöström, Jesper

    2018-01-01

    This forum article contributes to the understanding of how science teachers' identity is related to their worldviews, cultural values and educational philosophies, and to eco-transformation of science education. Special focus is put on "reform-minded" science teachers. The starting point is the paper "Science education reform in…

  13. From Nutty Professor to Buddy Love--Personality types in modern science.

    PubMed

    Charlton, Bruce G

    2007-01-01

    People often suggest that scientists should have a specific personality type, usually conscientious and self-critical. But this is a mistake. Science as a social system needs to be conscientious and self-critical, but scientists as people do not necessarily have to conform to that stereotype. Since science works by a process of selection, it makes sense to have a wide range of personalities in science. It takes all types. However, the selection pressures within science have changed over recent decades. In the past, a successful scientist often resembled the white-coated, bespectacled and introverted Nutty Professor in Jerry Lewis's movie of that name. But the modern science superstar is more like the Nutty Professor's alter ego, nightclub singer 'Buddy Love': a sharp-suited, good-looking and charismatic charmer. While Nutty was dull but impartial, Buddy is compelling but self-seeking. Our attitude towards public scientific pronouncements should be adjusted accordingly.

  14. Oscillatory threshold logic.

    PubMed

    Borresen, Jon; Lynch, Stephen

    2012-01-01

    In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit, and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory.
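
    A minimal sketch of the idea, under stated assumptions: the unit below is a generic leaky threshold element driven by pulsing inputs (a schematic stand-in, not the authors' oscillator model), wired so that its output pulses only when both input oscillators are active, realizing AND.

        # Schematic threshold logic with pulsing inputs: the output unit
        # integrates synchronized input pulses with a leak and fires only
        # when the summed drive crosses threshold.  Weights are chosen so
        # a single active input never reaches threshold (AND behaviour).

        def gate(in_a, in_b, weight=0.6, threshold=1.0, leak=0.2, steps=20):
            v, fired = 0.0, 0
            for _ in range(steps):
                v = leak * v                     # leaky integration
                if in_a:
                    v += weight                  # pulse from input oscillator A
                if in_b:
                    v += weight                  # pulse from input oscillator B
                if v >= threshold:
                    fired += 1                   # output emits a pulse
                    v = 0.0                      # reset after firing
            return fired > 0

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, gate(a, b))          # truth table of AND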

  15. Philosophical Study on Two Contemporary Iranian Muslim Intellectual Responses to Modern Science and Technology

    ERIC Educational Resources Information Center

    Shamsaei, Maryam; Shah, Mohd Hazim

    2017-01-01

    Iranian modern thinkers fall into either of two categories: Western-minded and religious. The most prominent aspect of Western-minded thinkers is their emphasis on the separation of tradition and modernity. On the other hand, religious thinkers look forward to combining the two. The Western-minded thinkers believe that the most important burden on…

  16. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    NASA Astrophysics Data System (ADS)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR), from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and ΔT. Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd, and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter, and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.
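
    A minimal sketch of the described pipeline, with the assumptions stated: the clutter metric below follows the common block-variance (Schmieder-Weathersby) form, and the logistic link from SCR to Pd is a placeholder standing in for the thesis's fuzzy, neural-fuzzy and wavelet-based calculations.

        # Clutter -> SCR -> Pd pipeline sketch (functional forms assumed).
        import numpy as np

        def clutter_rms(background, block=8):
            """Root-mean-square of local block variances over the scene."""
            h, w = background.shape
            variances = [background[i:i+block, j:j+block].var()
                         for i in range(0, h - block + 1, block)
                         for j in range(0, w - block + 1, block)]
            return float(np.sqrt(np.mean(variances)))

        def probability_of_detection(delta_t, background, k=1.0):
            scr = delta_t / clutter_rms(background)        # signal-to-clutter ratio
            return 1.0 / (1.0 + np.exp(-k * (scr - 1.0)))  # assumed logistic link

        rng = np.random.default_rng(1)
        scene = rng.normal(0.0, 2.0, size=(64, 64))        # synthetic background
        print(probability_of_detection(delta_t=4.0, background=scene))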

  17. Syllabus Computer in Astronomy

    NASA Astrophysics Data System (ADS)

    Hojaev, Alisher S.

    2015-08-01

    One of the most important and topical subjects and training courses in the curricula for undergraduate students at the National University of Uzbekistan is 'Computer Methods in Astronomy'. It covers two semesters and includes both lecture and practice classes. Based on long-term experience, we prepared a tutorial for students which contains a description of modern computer applications in astronomy. The main directions of computer application in the field of astronomy are, briefly, as follows: 1) automating the process of observation, data acquisition and processing; 2) creating and storing databases (the results of observations, experiments and theoretical calculations), including their generalization, classification and cataloging, and working with large databases; 3) solving theoretical problems (physical modeling, mathematical modeling of astronomical objects and phenomena, derivation of model parameters to obtain a solution of the corresponding equations, numerical simulations) and creating the appropriate software; 4) use in the educational process (e-textbooks, presentations, virtual labs, remote education, testing), in amateur astronomy and in the popularization of the science; 5) use as a means of communication and data transfer, of presenting and disseminating research results (web journals), and of creating virtual information systems (local and global computer networks). During the classes, special attention is paid to practical training and to students' individual and independent work.

  18. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, the factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during instruction, and their students' use of computer applications/tools in or for their science class. After a review of the relevant literature, certain variables were selected for analysis: personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study comprises middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between 1997 and 2003, from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in the science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability to use computers. The teachers' use of computer-related applications/tools during class, together with their personal self-efficacy, age, and gender, was highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  19. Science-Technology-Society or Technology-Society-Science? Insights from an Ancient Technology

    ERIC Educational Resources Information Center

    Lee, Yeung Chung

    2010-01-01

    Current approaches to science-technology-society (STS) education focus primarily on the controversial socio-scientific issues that arise from the application of science in modern technology. This paper argues for an interdisciplinary approach to STS education that embraces science, technology, history, and social and cultural studies. By employing…

  20. Teachers, Research, and Reform: Improving Teaching and Learning in High School Science Courses.

    ERIC Educational Resources Information Center

    Kaiser, Bonnie

    One of the challenges issued by the National Science Education Standards is for students to learn the content and process of modern scientific inquiry by engaging in research and entering science competitions. The Rockefeller University Precollege Science Education Outreach Programs (Science Outreach) provide access for about 70 students from…

  1. Simon van der Meer (1925-2011):. A Modest Genius of Accelerator Science

    NASA Astrophysics Data System (ADS)

    Chohan, Vinod C.

    2011-02-01

    Simon van der Meer was a brilliant scientist and a true giant of accelerator science. His seminal contributions to accelerator science have been essential to this day in our quest to satisfy the demands of modern particle physics. Whether we talk of long-baseline neutrino physics, antiproton-proton physics at Fermilab or proton-proton physics at the LHC, his techniques and inventions have been a vital part of the modern-day successes. Simon van der Meer and Carlo Rubbia were the first CERN scientists to become Nobel laureates in Physics, in 1984. Van der Meer's lesser-known contributions spanned a whole range of subjects in accelerator science, from magnet design to power supply design, beam measurements, slow beam extraction, and sophisticated programs and controls.

  2. Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; Bell, J; Estep, D

    2008-02-15

    Over the past half-century, the Applied Mathematics program in the U.S. Department of Energy's Office of Advanced Scientific Computing Research has made significant, enduring advances in applied mathematics that have been essential enablers of modern computational science. Motivated by the scientific needs of the Department of Energy and its predecessors, advances have been made in mathematical modeling, numerical analysis of differential equations, optimization theory, mesh generation for complex geometries, adaptive algorithms and other important mathematical areas. High-performance mathematical software libraries developed through this program have contributed as much or more to the performance of modern scientific computer codes as the high-performance computers on which these codes run. The combination of these mathematical advances and the resulting software has enabled high-performance computers to be used for scientific discovery in ways that could only be imagined at the program's inception. Our nation, and indeed our world, face great challenges that must be addressed in coming years, and many of these will be addressed through the development of scientific understanding and engineering advances yet to be discovered. The U.S. Department of Energy (DOE) will play an essential role in providing science-based solutions to many of these problems, particularly those that involve the energy, environmental and national security needs of the country. As the capability of high-performance computers continues to increase, the types of questions that can be answered by applying this huge computational power become more varied and more complex. It will be essential that we find new ways to develop and apply the mathematics necessary to enable the new scientific and engineering discoveries that are needed. In August 2007, a panel of experts in applied, computational and statistical mathematics met for a day and a half in Berkeley, California, to understand the mathematical developments required to meet the future science and engineering needs of the DOE. It is important to emphasize that the panelists were not asked to speculate only on advances that might be made in their own research specialties. Instead, the guidance this panel was given was to consider the broad science and engineering challenges that the DOE faces and identify the corresponding advances that must occur across the field of mathematics for these challenges to be successfully addressed. As preparation for the meeting, each panelist was asked to review strategic planning and other informational documents available for one or more of the DOE Program Offices, including the Offices of Science, Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency & Renewable Energy, Electricity Delivery & Energy Reliability and Civilian Radioactive Waste Management, as well as the National Nuclear Security Administration. The panelists reported on the science and engineering needs of each of these offices, and then discussed and identified the mathematical advances that will be required if these challenges are to be met. A review of DOE challenges in energy, the environment and national security brings to light a broad and varied array of questions that the DOE must answer in the coming years. A representative subset of such questions includes: (1) Can we predict the operating characteristics of a clean coal power plant? (2) How stable is the plasma containment in a tokamak? (3) How quickly is climate change occurring, and what are the uncertainties in the predicted time scales? (4) How quickly could an introduced bio-weapon contaminate the agricultural environment in the US? (5) How do we modify models of the atmosphere and clouds to incorporate newly collected data, possibly of new types? (6) How quickly could the United States recover if part of the power grid became inoperable? (7) What are the optimal locations and communication protocols for sensing devices in a remote-sensing network? (8) How can new materials be designed with a specified, desirable set of properties? In comparing and contrasting these and other questions of importance to the DOE, the panel found that while the scientific breadth of the requirements is enormous, a central theme emerges: scientists are being asked to identify or provide technology, or to give expert analysis to inform policy-makers, that requires the scientific understanding of increasingly complex physical and engineered systems. In addition, as the complexity of the systems of interest increases, neither experimental observation nor mathematical and computational modeling alone can access all components of the system over the entire range of scales or conditions needed to provide the required scientific understanding.

  3. Special data base of Informational - Computational System 'INM RAS - Black Sea' for solving inverse and data assimilation problems

    NASA Astrophysics Data System (ADS)

    Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly

    2014-05-01

    Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, the theory of numerical methods, numerical algebra and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS), where ICS for personal computers are developed. In this work the results of the Special data base development for the ICS "INM RAS - Black Sea" are presented: the input information for the ICS is discussed, and some special data processing procedures are described. Results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No. 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model"). References: 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of the Black Sea and the Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 95-111. 4. V.I. Agoshkov, M.B. Assovsky, S.V. Giniatulin, N.B. Zakharova, G.V. Kuimov, E.I. Parmuzin, V.V. Fomin, Informational Computational system of variational assimilation of observation data "INM RAS - Black Sea". Ecological safety of coastal and shelf zones and complex use of shelf resources: Collection of scientific works, Issue 26, Volume 2. National Academy of Sciences of Ukraine, Marine Hydrophysical Institute, Sebastopol, 2012, pp. 352-360. (In Russian)
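
    For orientation, the variational assimilation problems referred to above minimize a cost functional of the following general form; this is a standard 3D-Var textbook sketch, not the specific functional of the "INM RAS - Black Sea" system:

        J(x) = \frac{1}{2}\,(x - x_b)^{T} B^{-1} (x - x_b)
             + \frac{1}{2}\,(H(x) - y)^{T} R^{-1} (H(x) - y)

    Here x is the model state, x_b the background state, y the observations (e.g., sea surface temperature), H the observation operator, and B and R the background and observation error covariance matrices. The gradient \nabla J = B^{-1}(x - x_b) + H^{T} R^{-1} (H(x) - y) is evaluated via adjoint operators, which is where the theory of adjoint equations and optimal control enters the design of such systems.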

  4. Beck, Asia and second modernity.

    PubMed

    Calhoun, Craig

    2010-09-01

    The work of Ulrich Beck has been important in bringing sociological attention to the ways issues of risk are embedded in contemporary globalization, in developing a theory of 'reflexive modernization', and in calling for social science to transcend 'methodological nationalism'. In recent studies, he and his colleagues help to correct for the Western bias of many accounts of cosmopolitanism and reflexive modernization, and seek to distinguish normative goals from empirical analysis. In this paper I argue that further clarification of this latter distinction is needed but hard to reach within a framework that still embeds the normative account in the idea that empirical change has a clear direction. Similar issues beset the presentation of diverse patterns in recent history as all variants of 'second modernity'. Lastly, I note that ironically, given the declared 'methodological cosmopolitanism' of the authors, the empirical studies here all focus on national cases. © London School of Economics and Political Science 2010.

  5. Improving robustness and computational efficiency using modern C++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paterno, M.; Kowalkowski, J.; Green, C.

    2014-01-01

    For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.
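
    The paper's own examples are not reproduced in this record, but the kind of comparison it describes can be sketched: a currently-typical loop written with explicit indices and copies, next to a modern (C++11 and later) version using range-for, auto, reserve, and move semantics. This is an illustrative sketch under those assumptions, not the authors' code:

        #include <algorithm>
        #include <cctype>
        #include <iostream>
        #include <string>
        #include <utility>
        #include <vector>

        // Currently-typical C++: explicit indexing and repeated copies.
        std::vector<std::string> upperAllOld(const std::vector<std::string>& in) {
            std::vector<std::string> out;
            for (std::size_t i = 0; i < in.size(); ++i) {
                std::string s = in[i];  // copies the element
                std::transform(s.begin(), s.end(), s.begin(),
                               [](unsigned char c) { return static_cast<char>(std::toupper(c)); });
                out.push_back(s);       // copies again under pre-C++11 semantics
            }
            return out;
        }

        // Modern C++: range-for, auto, one up-front allocation, and a move.
        std::vector<std::string> upperAllNew(const std::vector<std::string>& in) {
            std::vector<std::string> out;
            out.reserve(in.size());     // avoids repeated reallocation and copying
            for (const auto& item : in) {
                auto s = item;          // the one deliberate copy
                std::transform(s.begin(), s.end(), s.begin(),
                               [](unsigned char c) { return static_cast<char>(std::toupper(c)); });
                out.push_back(std::move(s));  // transfers the buffer, no copy
            }
            return out;
        }

        int main() {
            const std::vector<std::string> words{"robust", "expressive", "efficient"};
            for (const auto& w : upperAllNew(words)) std::cout << w << '\n';
        }

    The std::move in the second version transfers the string's heap buffer instead of duplicating it, the kind of cheap-by-construction efficiency gain that modern C++ makes routine without sacrificing robustness.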

  6. Fort Benton Science Curriculum Outline.

    ERIC Educational Resources Information Center

    Fort Benton Public Schools, MT.

    The science curriculum for the Fort Benton school system was developed with funds under Title III of the Elementary and Secondary Education Act to give students the background of a modern and forward-looking program in science taught in an imaginative, investigative, and inquiry-oriented fashion. The science curriculum guide outlines a planned…

  7. The "Next Generation Science Standards" and the Earth and Space Sciences

    ERIC Educational Resources Information Center

    Wysession, Michael E.

    2013-01-01

    In this article, Michael E. Wysession comments on the "Next Generation Science Standards" (NGSS), which are based on the recommendations of the National Research Council and represent a revolutionary step toward establishing modern, national K-12 science education standards. The NGSS involves significant changes from traditional…

  8. Approaches and Strategies in Next Generation Science Learning

    ERIC Educational Resources Information Center

    Khine, Myint Swe, Ed.; Saleh, Issa M., Ed.

    2013-01-01

    "Approaches and Strategies in Next Generation Science Learning" examines the challenges involved in the development of modern curriculum models, teaching strategies, and assessments in science education in order to prepare future students in the 21st century economies. This comprehensive collection of research brings together science educators,…

  9. Intriguing Freshmen with Materials Science.

    ERIC Educational Resources Information Center

    Pond, Robert B., Sr.

    Described is a course designed for engineering science and natural science freshmen and open to upperclass nonscience majors entitled "Science of Modern Materials" and which has been successfully presented for several years. This paper presents the philosophy behind the course, the teaching methods employed, and the content of the course. The…

  10. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Even though seismic reflection profiling has a broad application range, spanning from shallow targets at a few meters depth to targets at depths of several kilometers, it is used primarily by the hydrocarbon industry and hardly ever for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands great experience from the user, and the in-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for this kind of study. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. The EIAGRID portal provides an innovative solution to this problem, combining state-of-the-art data processing methods and modern remote grid computing technology. In-field processing equipment is replaced by remote access to high-performance grid computing facilities. The latter can be controlled ubiquitously by a user-friendly web-browser interface accessed from the field by any mobile computer using wireless data transmission technology such as UMTS (Universal Mobile Telecommunications System) or HSUPA/HSDPA (High-Speed Uplink/Downlink Packet Access). The complexity of data manipulation and processing, and thus also the time-demanding user interaction, is minimized by a data-driven and highly automated velocity analysis and imaging approach based on the Common-Reflection-Surface (CRS) stack. Furthermore, the huge computing power provided by the grid deployment allows parallel testing of alternative processing sequences and parameter settings, a feature which considerably reduces turn-around times. A shared data storage using georeferencing tools and data grid technology is under current development. It will allow users to publish already accomplished projects, making results, processing workflows and parameter settings available in a transparent and reproducible way. Creating a unified database shared by all users will facilitate complex studies and enable the use of data-crossing techniques to incorporate results of other environmental applications hosted on the GRIDA3 portal.
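
    For reference, the data-driven velocity analysis mentioned above fits, around each zero-offset sample, a stacking surface of the hyperbolic form commonly given in the 2D CRS literature; the formula below is that standard background form, not one quoted from the abstract itself:

        t^2(x_m, h) = \left[ t_0 + \frac{2\sin\alpha}{v_0}\,(x_m - x_0) \right]^2
                    + \frac{2\,t_0 \cos^2\alpha}{v_0}
                      \left[ \frac{(x_m - x_0)^2}{R_N} + \frac{h^2}{R_{NIP}} \right]

    Here t_0 is the zero-offset traveltime at the central midpoint x_0, x_m and h are midpoint and half-offset coordinates, v_0 is the near-surface velocity, \alpha is the emergence angle of the zero-offset ray, and R_N and R_{NIP} are the radii of curvature of the normal and normal-incidence-point wavefronts. Because the three parameters (\alpha, R_N, R_{NIP}) are estimated automatically by coherence analysis of the data, the workflow needs little manual velocity picking, which is what makes on-the-fly processing in the field practical.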

  11. On the Emergence of Modern Humans

    ERIC Educational Resources Information Center

    Amati, Daniele; Shallice, Tim

    2007-01-01

    The emergence of modern humans with their extraordinary cognitive capacities is ascribed to a novel type of cognitive computational process (sustained non-routine multi-level operations) required for abstract projectuality, held to be the common denominator of the cognitive capacities specific to modern humans. A brain operation (latching) that…

  12. The concepts of science in Japanese and Western education

    NASA Astrophysics Data System (ADS)

    Kawasaki, Ken

    1996-01-01

    Using structural linguistics, the present article offers an impartial frame of reference to analyze science education in the non-Western world. In Japan, science education has been free from epistemological reflection because Japan regards science only as effective technology for modernization. By not taking account of the world-view aspect of science, Japan can treat science as not self-referential. Issues of science education are then rather simple; they are only concerned with the question of ‘how to’, and answers to this question are judged according to the efficiency achieved for modernization. Science, however, is a way of seeing ‘nature’. This word is generally translated into Japanese as ‘shizen’ which has a totally different connotation and therefore does not lead to an understanding of the Western scientific spirit. Saussure's approach to language is used to expose the consequences of the misinterpretations that spring from this situation. In order to minimize or prevent these misinterpretations, it is emphasized that science education should be identified with foreign language education in the non-Western world.

  13. The revolution in risk assessment and disease detection made possible with non-invasive imaging: implications for population science.

    PubMed

    Carr, J Jeffrey

    2012-01-01

    The ability to quantify subclinical disease in order to assess cardiovascular disease risk is greatly enhanced by modern medical imaging techniques that incorporate concepts from biomedical engineering and computer science. These techniques' numerical results, known as quantitative phenotypes, can be used to help us better understand both health and disease states. In this report, we describe our efforts in using the latest imaging technologies to assess cardiovascular disease risk by quantifying subclinical disease in participants of the Jackson Heart Study (JHS). The CT and MRI exams of the Jackson Heart Study have collected detailed information from approximately 3,000 participants. Analyses of the images from these exams provide information on several measures, including the amount of plaque in the coronary arteries and the ability of the heart to pump blood. These measures can then be added to the wealth of information on JHS participants to understand how these conditions, as well as clinical events such as heart attacks and heart failure, occur in African Americans.

  14. Complexity in Nature and Society: Complexity Management in the Age of Globalization

    NASA Astrophysics Data System (ADS)

    Mainzer, Klaus

    The theory of nonlinear complex systems has become a proven problem-solving approach in the natural sciences, from cosmic and quantum systems to cellular organisms and the brain. Even in modern engineering science, self-organizing systems are developed to manage complex networks and processes. It is now recognized that many of our ecological, social, economic, and political problems are also of a global, complex, and nonlinear nature. What are the laws of sociodynamics? Is there a socio-engineering of nonlinear problem solving? What can we learn from nonlinear dynamics for complexity management in social, economic, financial and political systems? Is self-organization an acceptable strategy to handle the challenges of complexity in firms, institutions and other organizations? It is a main thesis of the talk that nature and society are basically governed by nonlinear and complex information dynamics. How computational is sociodynamics? What can we hope for in social, economic and political problem solving in the age of globalization?

  15. Big biomedical data as the key resource for discovery science.

    PubMed

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Analysis of Context Dependence in Social Interaction Networks of a Massively Multiplayer Online Role-Playing Game

    PubMed Central

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in the design of more realistic future virtual arenas as well as provide novel insights to the science of collective human behavior. PMID:22496771

  17. Hierarchy, determinism, and specificity in theories of development and evolution.

    PubMed

    Deichmann, Ute

    2017-10-16

    The concepts of hierarchical organization, genetic determinism and biological specificity (for example of species, biologically relevant macromolecules, or genes) have played a crucial role in biology as a modern experimental science since its beginnings in the nineteenth century. The idea of genetic information (specificity) and genetic determination was at the basis of molecular biology that developed in the 1940s with macromolecules, viruses and prokaryotes as major objects of research often labelled "reductionist". However, the concepts have been marginalized or rejected in some of the research that in the late 1960s began to focus additionally on the molecularization of complex biological structures and functions using systems approaches. This paper challenges the view that 'molecular reductionism' has been successfully replaced by holism and a focus on the collective behaviour of cellular entities. It argues instead that there are more fertile replacements for molecular 'reductionism', in which genomics, embryology, biochemistry, and computer science intertwine and result in research that is as exact and causally predictive as earlier molecular biology.

  18. The discovery of circulation and the origin of modern medicine during the italian renaissance.

    PubMed

    Thiene, G

    1997-03-01

    This historical article discusses the dawn of anatomy during the Italian Renaissance, the role of the University of Padua in the origin of modern medicine, milestones in the development of modern medicine, the discovery of circulation, Padua leadership and Galileo's persecution for his scientific theories. Copyright © 1997 Elsevier Science Inc. All rights reserved.

  19. Through Kazan ASPERA to Modern Projects

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha

    The European Union is now forming the Sixth Framework Programme. One of the objectives of the EU Programme is the opening of national research and training programmes. Russian PhD students and young astronomers face administrative and financial difficulties in accessing modern databases and astronomical projects, and so they have not been included in the European overview of priorities. Modern requirements for the organization of observational projects on powerful telescopes assume painstaking scientific computer preparation of the application. Rigid competition for observation time requires preliminary computer modeling of the target object for the application to succeed. Kazan AstroGeoPhysics Partnership

  20. The science of consciousness - Basics, models, and visions.

    PubMed

    Hinterberger, Thilo

    2015-12-01

    This article presents a few models and aspects of the phenomenon of consciousness that are emerging from modern neuroscience and might serve as a basis for scientific discourse in the field of Applied Consciousness Sciences. A first model describes the dynamics of information processing in the brain. The evoked electric brain potentials represent a hierarchical sequence of functions playing an important role in conscious perception. These range from primary processing, attention, pattern recognition, categorization, and associations to judgments and complex thoughts. Most functions seem to be implemented in the brain's neural network operating as a neurobiological computer. Another model treats conscious perception as a process of internalisation leading to the "self" as conscious observer. As a consequence, every conscious perception can be seen as a reduced and already interpreted observation of an inner representation of an outer or imagined "world." Subjective experience thus offers properties which can only be experienced from the inside and cannot be made objective. Basic values of humanity such as responsibility, love, compassion, freedom, and dignity can be derived from these subjective qualities. Therefore, in contrast to the Natural Sciences, the Science of Consciousness is additionally challenged to deal with those subjective qualities, emphasizing the resulting influence on health, social interactions, and the whole society. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Oscillatory Threshold Logic

    PubMed Central

    Borresen, Jon; Lynch, Stephen

    2012-01-01

    In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034
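
    The abstraction underlying such gates is classical threshold logic: an output fires when a weighted sum of binary inputs reaches a threshold. The paper realizes this with coupled oscillators; the minimal sketch below shows only the static threshold-logic abstraction, with hypothetical weights and thresholds chosen for illustration:

        #include <iostream>
        #include <numeric>
        #include <vector>

        // A threshold gate fires (outputs 1) when the weighted sum of its
        // binary inputs reaches the threshold; AND and OR differ only in
        // the threshold chosen.
        bool thresholdGate(const std::vector<int>& inputs,
                           const std::vector<int>& weights, int threshold) {
            int sum = std::inner_product(inputs.begin(), inputs.end(),
                                         weights.begin(), 0);
            return sum >= threshold;
        }

        int main() {
            for (int a = 0; a <= 1; ++a) {
                for (int b = 0; b <= 1; ++b) {
                    bool andOut = thresholdGate({a, b}, {1, 1}, 2); // both inputs required
                    bool orOut  = thresholdGate({a, b}, {1, 1}, 1); // either input suffices
                    std::cout << a << " " << b
                              << "  AND=" << andOut
                              << "  OR="  << orOut << '\n';
                }
            }
        }

    In the oscillator realization the paper describes, coupling between oscillators plays the role of the weighted sum and sustained oscillation plays the role of the firing output; the static version here is meant only to make the logic-level behavior concrete.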

  2. (Post) Modern Science (Education): Propositions and Alternative Paths. Counterpoints: Studies in the Postmodern Theory of Education.

    ERIC Educational Resources Information Center

    Weaver, John A., Ed.; Morris, Marla, Ed.; Appelbaum, Peter, Ed.

    This collection of essays offers new perspectives for science educators, curriculum theorists, and cultural critics on science education, French post-structural thought, and the science debates. This book contains chapters on the work of Bruno Latour, Michael Serres, and Jean Baudrillard plus chapters on postmodern approaches to science education…

  3. Science Teachers' Misconceptions in Science and Engineering Distinctions: Reflections on Modern Research Examples

    ERIC Educational Resources Information Center

    Antink-Meyer, Allison; Meyer, Daniel Z.

    2016-01-01

    The aim of this exploratory study was to learn about the misconceptions that may arise for elementary and high school science teachers in their reflections on science and engineering practice. Using readings and videos of real science and engineering work, teachers' reflections were used to uncover the underpinnings of their understandings. This…

  4. The Mona Lisa of modern science.

    PubMed

    Kemp, Martin

    2003-01-23

    No molecule in the history of science has reached the iconic status of the double helix of DNA. Its image has been imprinted on all aspects of society, including science, art, music, cinema, architecture and advertising. This review of the Mona Lisa of science examines the evolution of its form at the hands of both science and art.

  5. Universal Cosmic Absolute and Modern Science

    NASA Astrophysics Data System (ADS)

    Kostro, Ludwik

    The official sciences, especially all natural sciences, respect in their research the principle of methodic naturalism, i.e., they consider all phenomena as entirely natural and therefore never adduce or cite supernatural entities and forces in their scientific explanations. The purpose of this paper is to show that Modern Science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e., the entire Nature as a Whole, that justifies the scientific methodic naturalism. Since this Natural All-in Being is one and only, It should be considered the own scientifically justified Natural Absolute of Science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e., most fundamental, stratum trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, as well as all other sentient and conscious individuals. At the turn of the 20th century, science began to look for a theory of everything, a final theory, a master theory. In my opinion, the natural Universal Cosmic Absolute will constitute in such a theory the radical all-penetrating Ultimate Basic Reality and will substitute, step by step, for the traditional supernatural personal Absolute.

  6. The Human Face as a Dynamic Tool for Social Communication.

    PubMed

    Jack, Rachael E; Schyns, Philippe G

    2015-07-20

    As a highly social species, humans frequently exchange social information to support almost all facets of life. One of the richest and most powerful tools in social communication is the face, from which observers can quickly and easily make a number of inferences - about identity, gender, sex, age, race, ethnicity, sexual orientation, physical health, attractiveness, emotional state, personality traits, pain or physical pleasure, deception, and even social status. With the advent of the digital economy, increasing globalization and cultural integration, understanding precisely which face information supports social communication and which produces misunderstanding is central to the evolving needs of modern society (for example, in the design of socially interactive digital avatars and companion robots). Doing so is challenging, however, because the face can be thought of as comprising a high-dimensional, dynamic information space, and this impacts cognitive science and neuroimaging, and their broader applications in the digital economy. New opportunities to address this challenge are arising from the development of new methods and technologies, coupled with the emergence of a modern scientific culture that embraces cross-disciplinary approaches. Here, we briefly review one such approach that combines state-of-the-art computer graphics, psychophysics and vision science, cultural psychology and social cognition, and highlight the main knowledge advances it has generated. In the light of current developments, we provide a vision of the future directions in the field of human facial communication within and across cultures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Large Eddy Simulation of Engineering Flows: A Bill Reynolds Legacy.

    NASA Astrophysics Data System (ADS)

    Moin, Parviz

    2004-11-01

    The term 'large eddy simulation' (LES) was coined by Bill Reynolds thirty years ago, when he and his colleagues pioneered the introduction of LES in the engineering community. Bill's legacy in LES features his insistence on having a proper mathematical definition of the large-scale field independent of the numerical method used, and his vision for using numerical simulation output as data for research in turbulence physics and modeling, just as one would think of using experimental data. However, as an engineer, Bill was predominantly interested in the predictive capability of computational fluid dynamics and in particular LES. In this talk I will present the state of the art in large eddy simulation of complex engineering flows. Most of this technology has been developed in the Department of Energy's ASCI Program at Stanford, which was led by Bill in the last years of his distinguished career. At the core of this technology is a fully implicit non-dissipative LES code which uses unstructured grids with arbitrary elements. A hybrid Eulerian/Lagrangian approach is used for multi-phase flows, and chemical reactions are introduced through dynamic equations for mixture fraction and reaction progress variable in conjunction with flamelet tables. The predictive capability of LES is demonstrated in several validation studies in flows with complex physics and complex geometry, including flow in the combustor of a modern aircraft engine. LES in such a complex application is only possible through efficient utilization of modern parallel supercomputers, which was recognized and emphasized by Bill from the beginning. The presentation will include a brief mention of computer science efforts for efficient implementation of LES.

  8. Automated information system for analysis and prediction of production situations in blast furnace plant

    NASA Astrophysics Data System (ADS)

    Lavrov, V. V.; Spirin, N. A.

    2016-09-01

    Advances in modern science and technology are inherently connected with the development, implementation, and widespread use of computer systems based on mathematical modeling. Algorithms and computer systems at the MES level (Manufacturing Execution Systems - systems controlling industrial processes) of modern automated information systems are gaining practical significance in solving a range of process tasks in metallurgy at the largest iron and steel enterprises in Russia. This fact determines the necessity to develop information-modeling systems based on mathematical models that take into account the physics of the process, the basics of heat and mass exchange, the laws of energy conservation, and also the peculiarities of the impact of technological and standard characteristics of raw materials on the manufacturing process data. Special attention in this set of operations for metallurgical production is devoted to blast-furnace production, as it consumes the greatest amount of energy, up to 50% of the fuel used in ferrous metallurgy. The paper deals with the requirements, structure and architecture of the BF Process Engineer's Automated Workstation (AWS), a computer decision support system of MES level implemented in the ICS of the Blast Furnace Plant at Magnitogorsk Iron and Steel Works. It presents a brief description of the main model subsystems as well as the assumptions made in the process of mathematical modelling. Application of the developed system allows the engineering and process staff to analyze production situations in the blast furnace plant online, to solve a number of process tasks related to control of heat, gas dynamics and slag conditions of blast-furnace smelting, and to calculate the optimal composition of blast-furnace slag, which eventually results in increased technical and economic performance of blast-furnace production.

  9. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not very involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  10. Application of web-GIS approach for climate change study

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Bogomolov, Vasily; Martynova, Yuliya; Shulgina, Tamara

    2013-04-01

    Georeferenced datasets are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which might constitute up to tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. It is based on OGC standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. The main advantage of the system lies in the possibility to perform mathematical and statistical data analysis and graphical visualization of results with GIS functionality, and to prepare binary output files, with only a modern graphical web browser installed on a common desktop computer connected to the Internet. Several geophysical datasets, represented by two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre data, the GMAO Modern Era-Retrospective Analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others, are available for processing by the system, and this list is extending. Functionality to run the WRF and "Planet Simulator" models was also implemented in the system. Due to many preset parameters and the limited time and spatial ranges set in the system, these models have low computational power requirements and can be used in educational workflows for better understanding of basic climatological and meteorological processes. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.

  11. A History of Soil Science Education in the United States

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.

    2017-04-01

    The formal study of soil science is a fairly recent undertaking in academics. Fields like biology, chemistry, and physics date back hundreds of years, but the scientific study of soils only dates to the late 1800s. Academic programs to train students in soil science are even more recent, with the first such programs only developing in the USA in the early 1900s. Some of the first schools to offer soil science training at the university level included the University of North Carolina (UNC), Earlham College (EC), and Cornell University. The first modern soil science textbook published in the United States was "Soils, Their Properties and Management" by Littleton Lyon, Elmer Fippin and Harry Buckman in 1909. This has evolved over time into the popular modern textbook "The Nature and Properties of Soils", most recently authored by Raymond Weil and Nyle Brady. Over time soil science education moved away from liberal arts schools such as UNC and EC and became associated primarily with land grant universities in their colleges of agriculture. There are currently about 71 colleges and universities in the USA that offer bachelor's-level soil science degree programs, with 54 of these (76%) being land grant schools. From the 1990s through the early 2000s, enrollment in USA soil science programs was on the decline, even as overall enrollment at USA colleges and universities increased. This caused considerable concern in the soil science community. More recently there is evidence that soil science student numbers may be increasing, although additional information on this potential trend is desirable. One challenge soil science faces in the modern USA is finding an academic home, as soils are taught by a wide range of fields and soils classes are taken by students in many fields of study, including soil science, a range of agricultural programs, environmental science, environmental health, engineering, geology, geography, and others.

  12. Resiliency in Future Cyber Combat

    DTIC Science & Technology

    2016-04-04

    including the Internet, telecommunications networks, computer systems, and embedded processors and controllers.” One important point emerging from the ... definition is that while the Internet is part of cyberspace, it is not all of cyberspace. Any computer processor capable of communicating with a ... central processor on a modern car are all part of cyberspace, although only some of them are routinely connected to the Internet. Most modern

  13. Western Australian school students' understanding of biotechnology

    NASA Astrophysics Data System (ADS)

    Dawson, Vaille; Schibeci, Renato

    2003-01-01

    Are science educators providing secondary school students with the background to understand the science behind recent controversies such as the recently introduced compulsory labelling of genetically modified foods? Research from the UK suggests that many secondary school students do not understand the processes or implications of modern biotechnology. The situation in Australia is unclear. In this study, 1116 15-year-old students from eleven Western Australian schools were surveyed to determine their understanding of, and attitude towards, recent advances in modern biotechnology. The results indicate that approximately one third of students have little or no understanding of biotechnology. Many students over-estimate the use of biotechnology in our society by confusing current uses with possible future applications. The results provide a rationale for the inclusion of biotechnology, a cutting-edge science, in the school science curriculum.

  14. Application of diet-derived taste active components for clinical nutrition: perspectives from ancient Ayurvedic medical science, space medicine, and modern clinical nutrition.

    PubMed

    Kulkarni, Anil D; Sundaresan, Alamelu; Rashid, Muhammad J; Yamamoto, Shigeru; Karkow, Francisco

    2014-01-01

    The principal objective of this paper is to demonstrate the role of taste and flavor in health, from the ancient science of Ayurveda to modern medicine; specifically, their mechanisms and roles in space medicine and their clinical relevance in modern health care. It also briefly describes the history of the use of monosodium glutamate and flavor enhancers ("umami substances") that improve the quality of food intake by stimulating chemosensory perception. In addition, dietary nucleotides are known to be components of the "umami substance," and the benefit of their use has been proposed for various types of patients, including those with cancer and those undergoing radiation therapy or organ transplantation, and for application in space medicine.

  15. Feminization and marginalization? Women Ayurvedic doctors and modernizing health care in Nepal.

    PubMed

    Cameron, Mary

    2010-03-01

    The important diversity of indigenous medical systems around the world suggests that gender issues, well understood for Western science, may differ in significant ways for non-Western science practices and are an important component in understanding how social dimensions of women's health care are being transformed by global biomedicine. Based on ethnographic research conducted with formally trained women Ayurvedic doctors in Nepal, I identify important features of medical knowledge and practice beneficial to women patients, and I discuss these features as potentially transformed by modernizing health care development. The article explores the indirect link between Ayurveda's feminization and its marginalization, in relation to modern biomedicine, which may evolve to become more direct and consequential for women's health in the country.

  16. Theme: The Role of Science in the Agricultural Education Curriculum.

    ERIC Educational Resources Information Center

    Agricultural Education Magazine, 2002

    2002-01-01

    Thirteen theme articles discuss integration of science and agriculture, the role of science in agricultural education, biotechnology, agriscience in Tennessee and West Virginia, agriscience and program survival, modernization of agricultural education curriculum, agriscience and service learning, and biotechnology websites. (SK)

  17. Many Experts, Many Audiences: Public Engagement with Science and Informal Science Education. A CAISE Inquiry Group Report. Executive Summary

    ERIC Educational Resources Information Center

    McCallie, Ellen; Bell, Larry; Lohwater, Tiffany; Falk, John H.; Lehr, Jane L.; Lewenstein, Bruce V.; Needham, Cynthia; Wiehe, Ben

    2009-01-01

    Science and technology are embedded in every aspect of modern life. This executive summary describes how Public Engagement with Science (PES), in the context of informal science education (ISE), can provide opportunities for public awareness of and participation in science and technology. PES is an approach that has developed in the last 10 years…

  18. ESIF 2016: Modernizing Our Grid and Energy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Becelaere, Kimberly

    This 2016 annual report highlights work conducted at the Energy Systems Integration Facility (ESIF) in FY 2016, including grid modernization, high-performance computing and visualization, and INTEGRATE projects.

  19. Computer Technology: State of the Art.

    ERIC Educational Resources Information Center

    Withington, Frederic G.

    1981-01-01

    Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input/output technology, and system control programs. Seven suggested readings are cited. (FM)

  20. Georges Lemaître: Science and Religion

    NASA Astrophysics Data System (ADS)

    Coyne, George V.

    In order to appreciate the contribution which Georges Lemaître made to the relationship between religion and science, it is necessary to understand how the Catholic Church, of which he was a priest, passed in the course of three centuries from a position of conflict with the sciences to one of compatible openness and dialogue. In doing this I hope to show that the natural sciences have played a significant role in helping to establish the kind of dialogue that is absolutely necessary for the enrichment of the multifaceted aspects of human culture. I will speak of the following four periods of history: (1) the rise of modern atheism in the seventeenth and eighteenth centuries; (2) anticlericalism in Europe in the nineteenth century; (3) the awakening within the Catholic Church to modern science in the first six decades of the twentieth century; (4) the Church's view today.

  1. History of Science and History of Philologies.

    PubMed

    Daston, Lorraine; Most, Glenn W

    2015-06-01

    While both the sciences and the humanities, as currently defined, may be too heterogeneous to be encompassed within a unified historical framework, there is good reason to believe that the history of science and the history of philologies both have much to gain by joining forces. This collaboration has already yielded striking results in the case of the history of science and humanist learning in early modern Europe. This essay argues that first, philology and at least some of the sciences (e.g., astronomy) remained intertwined in consequential ways well into the modern period in Western cultures; and second, widening the scope of inquiry to include other philological traditions in non-Western cultures offers rich possibilities for a comparative history of learned practices. The focus on practices is key; by shifting the emphasis from what is studied to how it is studied, deep commonalities emerge among disciplines--and intellectual traditions--now classified as disparate.

  2. An Assessment of a Science Discipline Archive Against ISO 16363

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2016-12-01

    The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the PDS has the stated mission to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.

  3. NSF in a Changing World: The National Science Foundation's Strategic Plan.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    The National Science Foundation's (NSF) role as a leader and steward of the Nation's science and engineering enterprise faces new tests--promoting new approaches to research, education, and workforce training that reach all Americans; responding to the increased importance of science and engineering in many aspects of daily life; modernizing the…

  4. Techniques for Promoting Intellectual Self Confidence among At-Risk Science Students in Rural and Small Schools.

    ERIC Educational Resources Information Center

    Prather, J. Preston

    Most students enter their first formal science courses with intelligently conceived and sophisticated concepts of science. Some of these may be compatible with the principles of modern science, but others may be incorrect, inadequate, outdated, or otherwise unacceptable. Conceptual frameworks based on intuitive misperceptions, naive inferences,…

  5. Designing Intelligent Knowledge: Epistemological Faith and the Democratization of Science

    ERIC Educational Resources Information Center

    Pierce, Clayton

    2007-01-01

    In this essay, Clayton Pierce examines the epistemological standpoints of Intelligent Design (ID) and evolutionary science education, focusing specifically on the pedagogical question of how ID and modern science-based education fail to promote democratic relations in how students learn, think, and associate with science and technology in society.…

  6. Shaking the Tree, Making a Rhizome: Towards a Nomadic Geophilosophy of Science Education

    ERIC Educational Resources Information Center

    Gough, Noel

    2006-01-01

    This essay enacts a philosophy of science education inspired by Gilles Deleuze and Felix Guattari's figurations of rhizomatic and nomadic thought. It imagines rhizomes shaking the tree of modern Western science and science education by destabilising arborescent conceptions of knowledge as hierarchically articulated branches of a central stem or…

  7. Consciousness and Science: A Non-Dual Perspective on the Theology-Science Dialogue

    ERIC Educational Resources Information Center

    Sriraman, Bharath; Benesch, Walter

    2013-01-01

    In modern science, the synthesis of "nature/mind" in observation, experiment, and explanation, especially in physics and biology, increasingly reveals a non-linear totality in which subject, object, and situation have become inseparable. This raises the interesting ontological question of the true nature of reality. Western science as seen in its…

  8. Advanced interdisciplinary undergraduate program: light engineering

    NASA Astrophysics Data System (ADS)

    Bakholdin, Alexey; Bougrov, Vladislav; Voznesenskaya, Anna; Ezhova, Kseniia

    2016-09-01

    The undergraduate educational program "Light Engineering" of an advanced level of studies is focused on the development of scientific learning outcomes and the training of professionals whose activities are in the interdisciplinary fields of optical engineering and technical physics. The program gives practical experience in the transmission, reception, storage, processing and display of information using opto-electronic devices, automation of optical systems design, computer image modeling, and automated quality control and characterization of optical devices. The program is implemented in accordance with the Educational Standards of ITMO University. The specific features of the program are practice- and problem-based learning, implemented by engaging students in research and projects, and internships at enterprises and at leading Russian and international research and educational centers. The modular structure of the program and a significant proportion of elective disciplines support the concept of individual learning for each student. Learning outcomes of the program's graduates include theoretical knowledge and skills in natural science and core professional disciplines, deep knowledge of modern computer technologies, research expertise, design skills, and expertise in optical and optoelectronic systems and devices.

  9. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.

  10. The future of computing--new architectures and new technologies.

    PubMed

    Warren, P

    2004-02-01

    All modern computers are designed using the 'von Neumann' architecture and built using silicon transistor technology. Both architecture and technology have been remarkably successful. Yet there are a range of problems for which this conventional architecture is not particularly well adapted, and new architectures are being proposed to solve these problems, in particular based on insight from nature. Transistor technology has enjoyed 50 years of continuing progress. However, the laws of physics dictate that within a relatively short time period this progress will come to an end. New technologies, based on molecular and biological sciences as well as quantum physics, are vying to replace silicon, or at least coexist with it and extend its capability. The paper describes these novel architectures and technologies, places them in the context of the kinds of problems they might help to solve, and predicts their possible manner and time of adoption. Finally it describes some key questions and research problems associated with their use.

  11. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  12. Well-ordered science and Indian epistemic cultures: toward a polycentered history of science.

    PubMed

    Ganeri, Jonardon

    2013-06-01

    This essay defends the view that "modern science," as with modernity in general, is a polycentered phenomenon, something that appears in different forms at different times and places. It begins with two ideas about the nature of rational scientific inquiry: Karin Knorr Cetina's idea of "epistemic cultures," and Philip Kitcher's idea of science as "a system of public knowledge," such knowledge as would be deemed worthwhile by an ideal conversation among the whole public under conditions of mutual engagement. This account of the nature of scientific practice provides us with a new perspective from which to understand key elements in the philosophical project of Jaina logicians in the seventh, eighth, and ninth centuries C.E. Jaina theory seems exceptionally well targeted onto two of the key constituents in the ideal conversation--the classification of all human points of view and the representation of end states of the deliberative process. The Buddhist theory of the Kathāvatthu contributes to Indian epistemic culture in a different way: by supplying a detailed theory of how human dialogical standpoints can be revised in the ideal conversation, an account of the phenomenon Kitcher labels "tutoring." Thus science in India has its own history, one that should be studied in comparison and contrast with the history of science in Europe. In answer to Joseph Needham, it was not 'modern science' which failed to develop in India or China but rather non-well-ordered science, science as unconstrained by social value and democratic consent. What I argue is that this is not a deficit in the civilisational histories of these countries, but a virtue.

  13. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  14. Physical Attraction: The Mysteries of Magnetism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stohr, Joachim

    2004-12-14

    Most people have intuitive associations with the word 'magnetism' based on everyday life: refrigerator magnets, the compass, north and south poles, or someone's 'magnetic personality'. Few people, however, realize how complicated the phenomenon really is, how much research still deals with the topic today, and how much it penetrates our modern industrialized world - from electricity and wireless communication at the speed of light to magnetic sensors in cars and data storage in computers. Stohr's lecture will provide a glimpse at the magic and science behind magnetism: its long history, scientific breakthroughs in its understanding, and its use in our modern society. In the process Stohr will show how research at SSRL/SLAC is addressing some of the forefront issues in magnetism research and technology today.

  15. Three-dimensional printing physiology laboratory technology.

    PubMed

    Sulkin, Matthew S; Widder, Emily; Shao, Connie; Holzem, Katherine M; Gloschat, Christopher; Gutbrod, Sarah R; Efimov, Igor R

    2013-12-01

    Since its inception in 19th-century Germany, the physiology laboratory has been a complex and expensive research enterprise involving experts in various fields of science and engineering. Physiology research has been critically dependent on cutting-edge technological support from mechanical, electrical, optical, and, more recently, computer engineers. The evolution of modern experimental equipment is constrained by a lack of direct communication between the physiological community and the industry producing this equipment. Fortunately, recent advances in open source technologies, including three-dimensional printing and open source hardware and software, present an exciting opportunity to bring the design and development of research instrumentation to the end user, i.e., life scientists. Here we provide an overview of how to develop customized, cost-effective experimental equipment for physiology laboratories.

  16. Citizen Science in the Age of Surveys

    NASA Astrophysics Data System (ADS)

    Henden, Arne A.

    2014-06-01

    Paid professional astronomers are a new phenomenon - most of astronomical history has been written by amateurs. Modern technology has again leveled the playing field, with quality equipment, computers, software and the Internet giving amateurs the ability to match or exceed the data quality and quantity achievable by professionals. The Internet in particular has come into play, with crowd-sourcing through projects like Zooniverse, worldwide installation of private robotic observatories, and rapid dissemination of information leading the way. The future promises only more of these collaborative activities, as all proposed surveys will require significant input from citizen scientists in order to achieve their goals. How the public is currently helping professional astronomers, how researchers can get involved, and some of the future opportunities will be presented.

  17. Applying the Coupled-Cluster Ansatz to Solids and Surfaces in the Thermodynamic Limit

    NASA Astrophysics Data System (ADS)

    Gruber, Thomas; Liao, Ke; Tsatsoulis, Theodoros; Hummel, Felix; Grüneis, Andreas

    2018-04-01

    Modern electronic structure theories can predict and simulate a wealth of phenomena in surface science and solid-state physics. In order to allow for a direct comparison with experiment, such ab initio predictions have to be made in the thermodynamic limit, substantially increasing the computational cost of many-electron wave-function theories. Here, we present a method that achieves thermodynamic limit results for solids and surfaces using the "gold standard" coupled cluster ansatz of quantum chemistry with unprecedented efficiency. We study the energy difference between carbon diamond and graphite crystals, adsorption energies of water on h-BN, as well as the cohesive energy of the Ne solid, demonstrating the increased efficiency and accuracy of coupled cluster theory for solids and surfaces.
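
    For reference, the coupled-cluster ansatz named in the abstract has the standard exponential form shown below (textbook material, not specific to this paper); truncating the cluster operator at singles and doubles gives CCSD, the basis of the CCSD(T) "gold standard".

        % Standard exponential coupled-cluster ansatz (textbook form):
        \begin{align}
          |\Psi_{\mathrm{CC}}\rangle &= e^{\hat{T}}\,|\Phi_0\rangle,
          \qquad \hat{T} = \hat{T}_1 + \hat{T}_2 + \cdots, \\
          \hat{T}_1 &= \sum_{i,a} t_i^a\, \hat{a}_a^{\dagger}\hat{a}_i,
          \qquad
          \hat{T}_2 = \tfrac{1}{4}\sum_{i,j,a,b} t_{ij}^{ab}\,
                      \hat{a}_a^{\dagger}\hat{a}_b^{\dagger}\hat{a}_j\hat{a}_i,
        \end{align}
        % where |\Phi_0> is the reference determinant, i, j label occupied
        % orbitals and a, b label virtual orbitals.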

  18. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high or low frequency model, and displaying the results.
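
    The abstract describes a two-regime architecture: a plate model solved by the method of moments (MoM) at low frequency, and an ellipsoid-plus-plates model solved by the geometrical theory of diffraction (GTD) at high frequency. The sketch below illustrates such a dispatch by electrical size; the function name and the crossover threshold are hypothetical, not taken from the HARP code itself.

        # Hypothetical sketch of a two-regime solver dispatch like the one
        # the abstract describes. The 10-wavelength electrical-size
        # threshold is an illustrative assumption, not a HARP parameter.
        C = 3.0e8  # speed of light, m/s

        def choose_solver(freq_hz, body_size_m, threshold=10.0):
            """Pick the regime by the body's size in wavelengths."""
            wavelengths = body_size_m * freq_hz / C
            if wavelengths > threshold:
                return "high frequency: GTD on composite ellipsoid + plates"
            return "low frequency: MoM on polygonal plates"

        for f in (30e6, 300e6, 3e9):  # example frequencies, 15 m airframe
            print(f"{f / 1e6:7.0f} MHz -> {choose_solver(f, body_size_m=15.0)}")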

  19. The national science agenda as a ritual of modern nation-statehood: The consequences of national "Science for National Development" projects

    NASA Astrophysics Data System (ADS)

    Drori, Gili S.

    This study is a comparative investigation of the ways by which the globalization of modern science affects the characteristics of different nation-states. Whereas much research and policy discussion focuses on science as an instrumental, or technical, system with immediate consequences for national conditions, such as economic development, science should also be regarded as a general cultural framework, which is highly institutionalized at the global level. As such, the institutionalization of science at both the global and national levels affects a wide variety of national properties. Following this line of reasoning, this dissertation study employs cross-national and longitudinal data and multiple-indicator methods to show national-level consequences of scientific expansion on the processes of rationalization and modernization of social and political life. It appears that the cross-national expansion of science practice results in, or is associated with, a variety of measures of (a) the standardization of civil and governmental procedures and (b) the expansion of political rights and political engagement. I conclude from these empirical findings that scientization encourages (a) greater general societal rationalization and (b) expanded notions of social actorhood and agency. This evidence demonstrates how the globalization of science alters local conditions, both civil and political, by supporting the institutionalization of bureaucratic practices and participatory politics. Thus, the expansion of science--clearly affected by global processes--carries a general secularized faith in a rationalized world and in human agency. In this sense, the practice of science is a national ritual, whose social role is that of a legitimacy-providing institution rather than a technically functional one. On a broader level, the study emphasizes the relations between globalization processes and the sovereignty of the nation-state. I conclude that science carries modernist and global notions of rational governance, identity politics, self-determination, and democratization. Science globalization processes, therefore, encourage the worldwide institutionalization of the liberal mode of governmentality.

  20. The new classic data acquisition system for NPOI

    NASA Astrophysics Data System (ADS)

    Sun, B.; Jorgensen, A. M.; Landavazo, M.; Hutter, D. J.; van Belle, G. T.; Mozurkewich, David; Armstrong, J. T.; Schmitt, H. R.; Baines, E. K.; Restaino, S. R.

    2014-07-01

    The New Classic data acquisition system is an important part of a new project of stellar surface imaging with the NPOI, funded by the National Science Foundation, and enables the data acquisition necessary for the project. The NPOI can simultaneously deliver beams from 6 telescopes to the beam combining facility, and in the Classic beam combiner these are combined 4 at a time on 3 separate spectrographs with all 15 possible baselines observed. The Classic data acquisition system is limited to 16 of 32 wavelength channels on two spectrographs and to 30 s integrations followed by a pause to flush data. Classic also has some limitations in its fringe-tracking capability. These factors, and the fact that Classic incorporates 1990s technology which cannot easily be replaced, are the motivation for upgrading the data acquisition system. The New Classic data acquisition system is based around modern electronics, including a high-end Stratix FPGA, a 200 MB/s Direct Memory Access card, and a fast modern Linux computer. These allow for continuous recording of all 96 channels across three spectrographs, increasing the total amount of data recorded by an estimated order of magnitude. The additional computing power in the data acquisition system also allows for the implementation of the more sophisticated fringe-tracking algorithms which are needed for the Stellar Surface Imaging project. In this paper we describe the New Classic system design and implementation, describe the background and motivation for the system, and show some initial results from using it.
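
    To make the claimed order-of-magnitude increase concrete, the following back-of-the-envelope comparison uses the channel counts from the abstract (16 versus 96) together with a hypothetical frame rate, sample width, and duty cycle; those three numbers are illustrative assumptions only, not NPOI specifications.

        # Back-of-the-envelope data-rate comparison, Classic vs. New
        # Classic. Channel counts (16 vs. 96) come from the abstract;
        # frame rate, sample width, and duty cycles are assumptions.
        FRAME_HZ = 500          # assumed fringe-frame rate (hypothetical)
        BYTES_PER_SAMPLE = 2    # assumed 16-bit samples (hypothetical)

        def rate_mb_s(channels, duty_cycle):
            """Sustained rate in MB/s: channels x frames/s x bytes/sample,
            scaled by the fraction of time actually spent recording."""
            return channels * FRAME_HZ * BYTES_PER_SAMPLE * duty_cycle / 1e6

        classic = rate_mb_s(channels=16, duty_cycle=0.5)      # bursts + flush pauses
        new_classic = rate_mb_s(channels=96, duty_cycle=1.0)  # continuous recording
        print(f"Classic:     {classic:.3f} MB/s")
        print(f"New Classic: {new_classic:.3f} MB/s (~{new_classic / classic:.0f}x more)")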

  1. Psycho-informatics: Big Data shaping modern psychometrics.

    PubMed

    Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E

    2014-04-01

    For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision comprising (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects logging and analyzing smartphone usage. One study attempts thereby to quantify the severity of major depression dynamically; the other investigates (mobile) Internet addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.
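
    The three-component vision in the abstract, (i) data sources, (ii) a central data store, and (iii) an analytical platform, can be pictured with a minimal sketch like the one below; every name, field, and the single behavioral feature computed are hypothetical illustrations, not the authors' implementation.

        # Minimal sketch of the abstract's three components: (i) events
        # from data sources, (ii) a central store (here just in memory),
        # (iii) a toy "analytical platform" computing one behavioral
        # feature. All names and fields are hypothetical illustrations.
        from dataclasses import dataclass
        from statistics import mean

        @dataclass
        class UsageEvent:          # (i) an event emitted by a mobile device
            user: str
            app: str
            seconds: float

        class DataStore:           # (ii) the central data store
            def __init__(self):
                self.events = []

            def ingest(self, event):
                self.events.append(event)

        def mean_session_length(store, user):
            """(iii) one toy behavioral feature over the stored events."""
            durations = [e.seconds for e in store.events if e.user == user]
            return mean(durations) if durations else 0.0

        store = DataStore()
        for s in (120, 340, 95):
            store.ingest(UsageEvent(user="p01", app="messenger", seconds=s))
        print(f"mean session length for p01: {mean_session_length(store, 'p01'):.0f} s")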

  2. Fuzzy Logic in Legal Education

    ERIC Educational Resources Information Center

    Balkir, Z. Gonul; Alniacik, Umit; Apaydin, Eylem

    2011-01-01

    The necessity of examining every case within its own peculiar conditions in the social sciences requires different approaches complying with the spirit and nature of the social sciences. Multiple realities require different and various perceptual interpretations. In the modern world and social sciences, interpretation of perception of valued and multi-valued…

  3. How to Reconcile the Multiculturalist and Universalist Approaches to Science Education

    ERIC Educational Resources Information Center

    Hansson, Sven Ove

    2018-01-01

    The "multiculturalist" and "universalist" approaches to science education both fail to recognize the strong continuities between modern science and its forerunners in traditional societies. Various fact-finding practices in indigenous cultures exhibit the hallmarks of scientific investigations, such as collectively achieved…

  4. The Symbiotic Relationship between Liberal Studies and Science

    ERIC Educational Resources Information Center

    Unah, Jim I.

    2008-01-01

    The Artistic and Humanistic studies (liberal studies) and the science and technology disciplines (science) constitute the two dominant cultures in a modern university. Subsumed in these cultures are the professional disciplines of law, architecture, engineering, medicine, accounting, administration and a few others. Essentially, the university…

  5. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result is a pronounced technological gender disparity in both academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and to the perception that computer science has a culture which is hostile to females. Two unanticipated themes related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects their ability to succeed in CS-I.

  6. Prediction: The Modern-Day Sport-Science and Sports-Medicine "Quest for the Holy Grail".

    PubMed

    McCall, Alan; Fanchini, Maurizio; Coutts, Aaron J

    2017-05-01

    In high-performance sport, science and medicine practitioners employ a variety of physical and psychological tests, training and match monitoring, and injury-screening tools for a variety of reasons, mainly to predict performance, identify talented individuals, and flag when an injury will occur. The ability to "predict" outcomes such as performance, talent, or injury is arguably sport science and medicine's modern-day equivalent of the "Quest for the Holy Grail." The purpose of this invited commentary is to highlight how studies that merely investigate association are commonly misinterpreted as analyzing prediction, and to provide practitioners with simple recommendations for quickly distinguishing between methods pertaining to association and those of prediction.

  7. A Webcast of Bird Nesting as a State-of-the-Art Citizen Science.

    PubMed

    Zárybnická, Markéta; Sklenicka, Petr; Tryjanowski, Piotr

    2017-01-01

    The quality of people's knowledge of nature has always had a significant influence on their approach to wildlife and nature conservation. However, direct interactions of people with nature are greatly limited nowadays, especially because of urbanization and modern lifestyles. As a result, our isolation from the natural world has been growing. Here, we present an example of a state-of-the-art Citizen Science project with its educational, scientific, and popularizing benefits. We conclude that modern media and new forms of education offer an effective opportunity for inspiring children and others to have fun learning to act like scientists. This approach provides broad opportunities for developing the hitherto neglected educational potential of Citizen Science.

  8. [Cardiology was born with modern medical science].

    PubMed

    de Micheli, Alfredo

    2015-01-01

    Modern medical science was born in the post-Renaissance age and began to consolidate towards the middle of the XVII century thanks to physicists, physiologists and biologists, most of whom were direct or indirect pupils of Galileo. The discovery of blood circulation by Harvey is now considered the only advance in physiology at the beginning of the XVII century comparable to the contemporary advances in the physical sciences. The history of this exploit could be written from the viewpoint of the progressive advance in knowledge. In his work, Harvey referred to authentic, not imaginary, experiments and put forward irrefutable quantitative arguments. We can therefore claim that his discovery of blood circulation was the first proper explanation of an organic process and the starting point leading to experimental physiology. So it seems justified to assert that modern medical science did not arise all at once, but was gradually structured starting from the middle of the XVII century, following the path traced by William Harvey in light of Galileo's thought. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  9. Jorge Luis Borges and the New Physics: the Literature of Modern Science and the Science of Modern Literature

    NASA Astrophysics Data System (ADS)

    Mosher, Mark Robert

    1992-01-01

    By examining the works of the Argentine writer Jorge Luis Borges and the parallels they have with modern physics, literature and science converge in their quest for truth regarding the structure and meaning of the universe. The classical perception of physics as a "hard" science--that of quantitative, rational thought which was established during the Newtonian era--has been replaced by the "new physics," which integrates the so-called "soft" elements into its paradigm. It presents us with a universe based not exclusively on a series of particle-like interactions, or a "billiard-ball" hypothesis where discrete objects have a measurable position and velocity in absolute space and time, but rather on a combination of these mechanistic properties and those that make up the non-physical side of nature, such as intuition, consciousness, and emotion. According to physicists like James Jeans, science has been "humanized" to the extent that the universe as a "great machine" has been converted into a "great thought." In nearly all his collections of essays and short stories, Borges complements the new physics by producing a literature that can be described as "scientized." The abstract, metaphysical implications and concerns of the new world-view, such as space, time, language, consciousness, free will, determinism, etc., appear repeatedly throughout Borges' texts, and are treated in terms that are remarkably similar to those expressed in the scientific texts whose authors include Albert Einstein, Niels Bohr, Werner Heisenberg, and Erwin Schrödinger. As a final comparison, Borges and post-modern physicists address the question of the individual's ability ever to comprehend the universe. They share an attitude of incredulity toward all models and theories of reality simply because they are based on partial information, and are therefore seen only as conjectures.

  10. The busy shall inherit the earth: the evolution from 'hard work' to 'busyness' in modern science and society.

    PubMed

    Charlton, Bruce G

    2006-01-01

    Although 'hard work' and 'busyness' are somewhat similar terms, there seem to be significant differences in the way that they are used. While hard work has always been a feature of complex societies, modern society can be seen as evolving toward being dominated by jobs characterized by busyness. Busyness refers to multi-tasking - having many sequential jobs to perform and switching frequently between them on an externally-imposed schedule. Traditionally, the individual gifts of a successful scientist were mainly in terms of knowledge, theoretical or technical aptitude. But nowadays the successful scientist is often one who has been promoted from hard-work to busyness: an expert in synthesizing a sufficient degree of scientific ability with a broad range of managerial and political skills. It is psychologically tough to be busy, because busyness is a consequence of human beings being constrained by the functioning of abstract social systems. In a complex modern organization, individual psychology is subordinated to inflexible programs of being in specific places at specific times doing specific things - this is both tricky to do well and demanding to do at all. Since people are paid (mainly) to do difficult but necessary things they would prefer not to do, busyness has become a major reason why people are paid a premium salary. In the long-term, many straightforward jobs will be analyzed and routinized out of existence, with the narrowly-skilled worker being replaced by teams, machines or computers. But busy jobs are hard to eliminate because they are those in which it is optimal for a variety of disparate and unpredictable tasks to be done by a single person. Consequently, those individuals who can cope with, even thrive-upon, busyness are becoming indispensable. In future 'the busy shall inherit the earth' (or, at least, the most powerful and highest paid jobs), not just in science but in all major social domains.

  11. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  12. Root-cause estimation of ultrasonic scattering signatures within a complex textured titanium

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.; Na, Jeong K.; Freed, Shaun

    2016-02-01

    The nondestructive evaluation of polycrystalline materials has been an active area of research for many decades and continues to be an area of growth in recent years. Titanium alloys in particular have become a critical material system used in modern turbine engine applications, where an evaluation of the local microstructure properties of engine disk/blade components is desired for performance and remaining-life assessments. Current NDE methods are often limited to estimating ensemble material properties or detecting localized voids, inclusions, or damage features within a material. Recent advances in computational NDE and material science characterization methods are providing new and unprecedented access to heterogeneous material properties, which permits microstructure-sensing interactions to be studied in detail. In the present research, Integrated Computational Materials Engineering (ICME) methods and tools are being leveraged to gain a comprehensive understanding of the root-cause ultrasonic scattering processes occurring within a textured titanium aerospace material. Destructive, nondestructive, and computational methods are combined within the ICME framework to collect, holistically integrate, and study complex ultrasound scattering using realistic 2-dimensional representations of the microstructure properties. Progress towards validating the computational sensing methods is discussed, along with insight into the key scattering processes occurring within the bulk microstructure and how they manifest in pulse-echo immersion ultrasound measurements.

  13. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  14. Globalization of Science Education: Comment and a Commentary

    ERIC Educational Resources Information Center

    Fensham, Peter J.

    2011-01-01

    The globalized nature of modern society has generated a number of pressures that impact internationally on countries' policies and practices of science education. Among these pressures are key issues of health and environment confronting global science, global economic control through multi-national capitalism, comparative and competitive…

  15. Promising Teacher Practices: Students' Views about Their Science Learning

    ERIC Educational Resources Information Center

    Moeed, Azra; Easterbrook, Matthew

    2016-01-01

    Internationally, conceptual and procedural understanding, understanding the Nature of Science, and scientific literacy are considered worthy goals of school science education in modern times. The empirical study presented here reports on promising teacher practices that in the students' views afford learning opportunities and support their science…

  16. Contributions of Basic Sciences to Science of Education. Studies in Educational Administration.

    ERIC Educational Resources Information Center

    Lall, Bernard M.

    The science of education has been influenced by the basic sciences to the extent that educational research now has been able to modernize its approach by accepting and using the basic scientific methodology and experimental techniques. Using primarily the same steps of scientific investigations, education today holds a place of much greater esteem…

  17. Ein Klassiker der Padagogik in Evolutionarer Perspektive: Eduard Sprangers "Lebensformen" im Lichte der Modernen Biologie (A Classic of Pedagogics from an Evolutionary Perspective: Edward Spranger's "Forms of Life" in the Light of Modern Biology).

    ERIC Educational Resources Information Center

    Neumann, Dieter

    2002-01-01

    Interprets Edward Spranger's "Forms of Life" against the background of the findings of modern biology. Shows how far Spranger's diagnosis of different human types, which are not affected by external influences on characteristics, conform with research hypotheses of modern biological sciences. (CAJ)

  18. Facilities available for biomedical science research in the public universities in Lagos, Nigeria.

    PubMed

    John, T A

    2010-03-01

    Across the world, basic medical scientists and physician scientists work on common platforms in state-of-the-art laboratories, doing translational research that occasionally results in bedside application. Biotechnology industries capitalise on useful findings for colossal profit.1 In Nigeria and the rest of Africa, biomedical science has not thrived, and the contribution of publications to global high-impact journals is low.2 This work investigated the facilities available for modern biomedical research in Lagos public universities in order to identify the factors responsible. The two public universities in Lagos, Nigeria were investigated by a cross-sectional questionnaire survey of the technical staff manning biomedical science departments. They were asked about the availability of 47 modern biomedical science research laboratory components, such as cold rooms and microscopes, and six research administration components, such as a director of research and grants administration. For convenient basic laboratory components such as autoclaves and balances, 50% of responses indicated "well maintained and always functional", whereas for less convenient, complex, high-maintenance, state-of-the-art equipment only 19% of responses indicated "well maintained and always functional". Respondents indicated that components of modern biomedical science research administration were at 44% of expectation. The survey reveals a deficit in state-of-the-art research equipment, and in particular in high-maintenance, expensive equipment, indicating that biomedical science in the investigated environment lacks the momentum of global trends and also lacks buoyant funding. In addition, the administration supporting biomedical science is below expectation, which may also account for the low contribution of research articles to global high-impact journals.

  19. The wage of fame: how non-epistemic motives have enabled the phenomenal success of modern science.

    PubMed

    Franck, Georg

    2015-01-01

    This paper ventures an economic view of modern science. It points out how science works as a closed economy of attention, where researchers invest their own attention in order to get the attention of fellow researchers. Attention thus enters the economy in two capacities: (1) as a scarce resource energising scientific production and (2) as a means of gratification rewarding the effort of the working scientist. Economising on attention as a scarce resource is another expression of thought economy. The income of expert attention is what gives rise to reputation, renown, prominence and eventually fame. By being conceived as a closed economy of attention, science is shown to be capable of self-organising a tendency towards overall efficiency and thus towards collective rationality. © 2014 S. Karger AG, Basel.

  20. EDITORIAL: Computational materials science Computational materials science

    NASA Astrophysics Data System (ADS)

    Kahl, Gerhard; Kresse, Georg

    2011-10-01

    Special issue in honour of Jürgen Hafner
    On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members within the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in condensed matter theory—co-workers, friends and students—accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served in 2000-2003 and 2003-2006 as a member of the Advisory Editorial Board and member of the Executive Board, respectively. In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics.
    Computational materials science contents:
    • Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH), by D Tunega, H Pašalić, M H Gerzabek and H Lischka
    • Ethylene epoxidation catalyzed by chlorine-promoted silver oxide, by M O Ozbek, I Onal and R A Van Santen
    • First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications, by A Nagoya, R Asahi and G Kresse
    • Renormalization group study of random quantum magnets, by István A Kovács and Ferenc Iglói
    • Ordering effects in disordered systems: the Au-Si system, by N Jakse, T L T Nguyen and A Pasturel
    • On the stability of Archimedean tilings formed by patchy particles, by Moritz Antlanger, Günther Doppelbauer and Gerhard Kahl

  1. The future(s) of open science.

    PubMed

    Mirowski, Philip

    2018-04-01

    Almost everyone is enthusiastic that 'open science' is the wave of the future. Yet when one looks seriously at the flaws in modern science that the movement proposes to remedy, the prospects for improvement in at least four areas are unimpressive. This suggests that the agenda is effectively to re-engineer science along the lines of platform capitalism, under the misleading banner of opening up science to the masses.

  2. (Re)considering Foucault for science education research: considerations of truth, power and governance

    NASA Astrophysics Data System (ADS)

    Bazzul, Jesse; Carter, Lyn

    2017-06-01

    This article is a response to Anna Danielsonn, Maria Berge, and Malena Lidar's paper, "Knowledge and power in the technology classroom: a framework for studying teachers and students in action", and an appeal to science educators of all epistemological orientations to (re)consider the work of Michel Foucault for research in science education. Although this essay does not come close to outlining the importance of Foucault's work for science education, it does present a lesser-known side of Foucault as an anti-polemical, realist, modern philosopher interested in the way objective knowledge is entangled with governance in modernity. This latter point is important for science educators, as it is the intersection of objective knowledge and institutional imperatives that characterizes the field(s) of science education. Considering the lack of engagement with philosophy and social theory in science education, this paper offers one of many possible readings of Foucault (we as authors have also published different readings of Foucault) in order to engage crucial questions related to truth, power, governance, discourse, ethics and education.

  3. Econophysics and evolutionary economics (Scientific session of the Physical Sciences Division of the Russian Academy of Sciences, 2 November 2010)

    NASA Astrophysics Data System (ADS)

    2011-07-01

    The scientific session "Econophysics and evolutionary economics" of the Division of Physical Sciences of the Russian Academy of Sciences (RAS) took place on 2 November 2010 in the conference hall of the Lebedev Physical Institute, Russian Academy of Sciences. The session agenda announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Maevsky V I (Institute of Economics, RAS, Moscow) "The transition from simple reproduction to economic growth"; (2) Yudanov A Yu (Financial University of the Government of the Russian Federation, Moscow) "Experimental data on the development of fast-growing innovative companies in Russia"; (3) Pospelov I G (Dorodnitsyn Computation Center, RAS, Moscow) "Why is it sometimes possible to successfully model an economy?"; (4) Chernyavskii D S (Lebedev Physical Institute, RAS, Moscow) "Theoretical economics"; (5) Romanovskii M Yu (Prokhorov Institute of General Physics, RAS, Moscow) "Nonclassical random walks and the phenomenology of fluctuations of the yield of securities in the securities market"; (6) Dubovikov M M, Starchenko N V (INTRAST Management Company, Moscow Engineering Physics Institute, Moscow) "Fractal analysis of financial time series and the prediction problem". Papers written on the basis of these reports are published below:
    • The transition from simple reproduction to economic growth, V I Maevsky, S Yu Malkov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 729-733
    • High-growth firms in Russia: experimental data and prospects for the econophysical simulation of economic modernization, A Yu Yudanov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 733-737
    • Equilibrium models of economics in the period of a global financial crisis, I G Pospelov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 738-742
    • On econophysics and its place in modern theoretical economics, D S Chernavskii, N I Starkov, S Yu Malkov, Yu V Kosse, A V Shcherbakov, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 742-749
    • Nonclassical random walks and the phenomenology of fluctuations of securities returns in the stock market, P V Vidov, M Yu Romanovsky, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 749-753
    • Econophysics and the fractal analysis of financial time series, M M Dubovikov, N V Starchenko, Physics-Uspekhi, 2011, Volume 54, Number 7, Pages 754-761

  4. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  5. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.
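
    As a minimal illustration of one spatial principle the two records above invoke, the sketch below assigns cells of a 2-D grid to workers in contiguous square tiles, so that neighboring cells (which interact most strongly) usually share a worker and cross-worker communication is reduced. The grid size, tile size, and worker mapping are illustrative assumptions only, not the authors' method.

        # Minimal sketch: exploit spatial locality by partitioning a 2-D
        # grid into contiguous square tiles, one worker per tile. The grid
        # and tile sizes are illustrative assumptions.
        def tile_partition(nx, ny, tile):
            """Map each grid cell (i, j) to a worker id via square tiles."""
            tiles_per_row = (nx + tile - 1) // tile
            return {(i, j): (j // tile) * tiles_per_row + (i // tile)
                    for i in range(nx) for j in range(ny)}

        workers = tile_partition(nx=8, ny=8, tile=4)  # 4 workers, 8x8 grid

        # Count 4-neighbor communication edges that cross a worker boundary;
        # with spatial tiling, most neighbor pairs stay on one worker.
        cross = sum(workers[(i, j)] != workers[(i + 1, j)]
                    for i in range(7) for j in range(8))
        cross += sum(workers[(i, j)] != workers[(i, j + 1)]
                     for i in range(8) for j in range(7))
        print(f"cross-worker edges: {cross} of 112")  # 16 of 112 here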

  6. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we reproduce the medicine in a laboratory using early modern pharmaceutical methods and analyse it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  7. [Trueness of modern natural science (1): the scientific revolution and the problem of philosophy].

    PubMed

    Maeda, Y

    2001-12-01

    How can one characterize modern Europe? This problem is essentially related to the meaning of modern natural science, which was developed during the scientific revolution. How, then, did viewpoints change during this revolution? The answer to this question also determined the basic character of modern philosophy. Through the examination of Aristotle's geocentric theory and kinematics, I have come to believe that the defect of Aristotle's view was that he concluded that a visible sense image is an actual reflection of reality as it is. From this point of view, the traditional theory of truth called "correspondence theory" is found to be insufficient. Therefore, in this paper I will show that the methodological and philosophical question "How do we see reality among phenomena?" is a very important one. This question is the one Plato struggled with, and also the one which guided Kant. It may be said that it can serve as a ground for a new metaphysics as a basic theory of reality.

  8. [Archaic stereotypes and modern approaches to understanding ageing].

    PubMed

    Grigoryeva, I A; Kelasev, V N

    2017-01-01

    In this article we discuss how the social sciences are still working through the process of becoming aware of the place of elderly people in modern society and of elaborating an adequate attitude toward global ageing and toward elderly people themselves. These processes are expressed in a clash between archaic stereotypes and the new approaches that the changed social and age structure requires. Archaic stereotypes are carried not only by elderly people but by scientific institutions and practices as well. A reorientation of science, the media and social policy towards the study and realization of the opportunities of «postponed ageing» is needed.

  9. A Financial Technology Entrepreneurship Program for Computer Science Students

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  10. Preface: SciDAC 2006

    NASA Astrophysics Data System (ADS)

    Tang, William M., Dr.

    2006-01-01

    The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.

  11. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    ERIC Educational Resources Information Center

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  12. Mantle convection on modern supercomputers

    NASA Astrophysics Data System (ADS)

    Weismüller, Jens; Gmeiner, Björn; Mohr, Marcus; Waluga, Christian; Wohlmuth, Barbara; Rüde, Ulrich; Bunge, Hans-Peter

    2015-04-01

    Mantle convection is the cause of plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic of mantle flow is the vast disparity of length scales, from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, the design and implementation of simulation models for next-generation large-scale architectures demand an interdisciplinary co-design. Here we report on recent advances of the TERRA-NEO project, which is part of the high-visibility SPPEXA program, and a joint effort of four research groups in computer science, mathematics and geophysical application under the leadership of FAU Erlangen. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next-generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection, assessing the impact of small-scale processes on global mantle flow.
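
    For reference, the conservation laws the abstract mentions are commonly written in the Boussinesq approximation as below; this is a standard textbook form, not necessarily the exact constitutive model used by TERRA-NEO.

        % Mass, momentum and energy conservation for mantle convection in
        % the standard Boussinesq approximation (textbook form; the
        % TERRA-NEO model may differ in its constitutive details):
        \begin{align}
          \nabla \cdot \mathbf{u} &= 0, \\
          -\nabla p + \nabla \cdot \left[ \eta \left( \nabla \mathbf{u}
              + \nabla \mathbf{u}^{\mathsf{T}} \right) \right]
            &= \rho_0 \, \alpha \left( T - T_0 \right) \mathbf{g}, \\
          \frac{\partial T}{\partial t} + \mathbf{u} \cdot \nabla T
            &= \kappa \, \nabla^2 T + H,
        \end{align}
        % with velocity u, pressure p, viscosity eta, temperature T,
        % reference density rho_0, thermal expansivity alpha, diffusivity
        % kappa, gravitational acceleration g, and internal heating H.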

  13. The triumph of politics over wilderness science

    Treesearch

    Craig W. Allin

    2000-01-01

    The National Wilderness Preservation System reflects the triumph of politics over science. The history of wilderness allocation has reflected political rather than scientific sensibilities. The preeminence of politics over science extends to wilderness management as well and is illustrated here by representative examples from the modern history of Yellowstone National...

  14. Science: Servant or Master?

    ERIC Educational Resources Information Center

    Morgenthau, Hans J.

    This tenth book in the series "Perspectives in Humanism" includes analyses of the meaning of science for modern man and of its effects on contemporary politics. Natural, social, and humanistic sciences are discussed in connection with religion, philosophy, and politics to indicate the importance of the scholar who fulfills the…

  15. Agriculture and Biology Teaching. Science and Technology Education Document Series 11.

    ERIC Educational Resources Information Center

    Rao, A. N.; Pritchard, Alan J.

    The six-chapter document is part of Unesco's Science and Technology Education Programme to encourage an international exchange of ideas and information on science and technology education. Chapters discuss: (1) development of agriculture (beginning and modern); (2) agroecosystems (land utilization, soils, food production, irrigation, and…

  16. Igniting the Sparkle: An Indigenous Science Education Model.

    ERIC Educational Resources Information Center

    Cajete, Gregory A.

    This book describes a culturally responsive science curriculum that the author has been teaching for 25 years. The curriculum integrates Native American traditional values, teaching principles, and concepts of nature with those of modern Western science. Every Indigenous culture has an orientation to learning that is metaphorically represented in…

  17. CONASTA Brings Teachers a Kaleidoscope of Science

    ERIC Educational Resources Information Center

    Teaching Science, 2015

    2015-01-01

    From star systems to social systems, CONASTA 64 connects teachers to researchers and scientists working on the cutting edge of modern science. We asked two CONASTA 64 keynote speakers, Steven Tingay and Ian Walker, to share their passion for their work and their dedication to giving back to the science community.

  18. Feyerabend on Science and Education

    ERIC Educational Resources Information Center

    Kidd, Ian James

    2013-01-01

    This article offers a sympathetic interpretation of Paul Feyerabend's remarks on science and education. I present a formative episode in the development of his educational ideas--the "Berkeley experience"--and describe how it affected his views on the place of science within modern education. It emerges that Feyerabend arrived at a…

  19. Authorized Course of Instruction for the Quinmester Program. Science: Cell Biology, Introduction to Life Science.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    This instructional package contains two biological units developed for the Dade County Florida Quinmester Program. "Introduction to Life Sciences" develops student understandings of cell structure and function, and compares different levels of cellular organization. "Cell Biology" investigates the origin of modern cellular…

  20. Future Trends in the Kinesiology Sciences

    ERIC Educational Resources Information Center

    Knudson, Duane

    2016-01-01

    Kinesiology emerged from its preventative medicine and education roots to establish itself as a recognized field of inquiry with numerous sub-disciplines. This article presents four trends in modern science that will likely influence the future of the kinesiology sciences. Will recent increases in scientific specialization be overcome by the…
